
Training Personalized, User-Based Machine Learning Models

I work on a PHP project alongside Python, which uses Flask as an API to predict whether a user will like a post based on their previous engagement with other posts; it is purely user-based. My r

Solution 1:

I would suggest something like this:

  • Create the user models as an array (or data frame) of models
  • Save this array as a .pkl file
  • When the app starts (not on each API call), load the array of models into memory
  • When the API is called, the model is already in memory - use it to predict the result

Something like this (not tested - just a notion):

# for saving the models
import pandas as pd
import jsonpickle
from sklearn.ensemble import RandomForestClassifier

model_data = pd.DataFrame(columns=['model'])                  # one row per user, indexed by user id
temp_model = RandomForestClassifier().fit(X, y)               # X, y: this user's engagement data
new = pd.DataFrame({'model': [temp_model]}, index=[user_id])
model_data = pd.concat([model_data, new])
packed_model = jsonpickle.encode(model_data)                  # serialize the whole table of models

# for loading the models
unpacked_model = jsonpickle.decode(packed_model)     # this should be at the beginning of your Flask file - loaded into memory
user_model = unpacked_model.at[user_id, 'model']     # this should be inside every API call
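To connect this to the Flask side, here is a minimal sketch of the wiring, assuming the per-user models have already been trained and saved with the standard pickle module to a file named user_models.pkl (a hypothetical name) as a dict mapping user_id to a fitted classifier; the /predict route and the JSON fields user_id and features are also assumptions, not part of the original question:

import pickle

from flask import Flask, jsonify, request

app = Flask(__name__)

# Load the per-user models once, when the app starts - not on every API call.
with open("user_models.pkl", "rb") as f:
    user_models = pickle.load(f)   # assumed to be {user_id: fitted classifier}

@app.route("/predict", methods=["POST"])
def predict():
    payload = request.get_json()
    user_id = payload["user_id"]
    features = [payload["features"]]          # one sample, wrapped as a 2-D list
    model = user_models.get(user_id)
    if model is None:
        return jsonify({"error": "no model trained for this user"}), 404
    prediction = model.predict(features)[0]
    return jsonify({"user_id": user_id, "like": int(prediction)})

The PHP project would then call this endpoint over HTTP with the user id and the post's feature vector, and the lookup per request is just a dict access, so no model is loaded or deserialized inside the API call.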
