Currently, all processing of user data happens while handling the endpoint request. This is bad from a scaling standpoint. Ideally, only the bare minimum should happen during the request itself (such as a simple DB insert).
The rest should be off-loaded to processors, ideally running in a separate process. Jobs will be distributed over AMQP and handled there. This will likely require an additional field per model to track whether the record has been processed and can be displayed to other users.
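As a rough illustration of the request-side half, here is a minimal sketch in Python using pika. The queue name, payload shape, and `processed` field are assumptions, not part of the existing code; the real stack may differ.

```python
import json
import pika

# Hypothetical queue name and payload shape; adjust to the real models.
QUEUE = "user_content.process"

def publish_processing_job(model_type: str, model_id: str) -> None:
    """Enqueue a processing job after the minimal DB insert has committed."""
    connection = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
    channel = connection.channel()
    channel.queue_declare(queue=QUEUE, durable=True)
    channel.basic_publish(
        exchange="",
        routing_key=QUEUE,
        body=json.dumps({"type": model_type, "id": model_id}),
        properties=pika.BasicProperties(delivery_mode=2),  # persist the message
    )
    connection.close()

# In the endpoint handler: insert the row with processed=False, call
# publish_processing_job("level", level.id), and return immediately.
```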
Some of the places I can see this being implemented are (a worker-side sketch follows the list):

- Levels:
  - Meili indexing
  - Level content analysis
- Users:
  - Meili indexing
  - Stats anticheat
  - Stats analysis
- Messages:
  - Spam filter
- User comments:
  - Spam filter
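On the worker side, a single consumer could dispatch each job to the relevant processors above and then flip the model's `processed` flag. This is a sketch under the same assumptions as before (pika, hypothetical queue name and dispatch table; the DB update is omitted), not the actual implementation.

```python
import json
import pika

QUEUE = "user_content.process"

# Hypothetical dispatch table: model type -> processors to run
# (Meili indexing, content analysis, spam filtering, stats, etc.).
PROCESSORS = {
    "level": [lambda job: None],    # e.g. index_in_meili, analyse_content
    "message": [lambda job: None],  # e.g. spam_filter
}

def handle(ch, method, properties, body):
    job = json.loads(body)
    for processor in PROCESSORS.get(job["type"], []):
        processor(job)
    # Assumed model field: set `processed` to True here so the record
    # becomes visible to other users (DB update omitted in this sketch).
    ch.basic_ack(delivery_tag=method.delivery_tag)

def main():
    connection = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
    channel = connection.channel()
    channel.queue_declare(queue=QUEUE, durable=True)
    channel.basic_qos(prefetch_count=1)  # one unacked job per worker at a time
    channel.basic_consume(queue=QUEUE, on_message_callback=handle)
    channel.start_consuming()

if __name__ == "__main__":
    main()
```

Running several of these workers in separate processes would let the processing scale independently of the API, which is the point of moving it off the request path.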