I have a webapp that runs in a Docker container; I develop it on my Ubuntu laptop. The app is written in PHP, and one part of it (generating reports) is very CPU intensive: a report takes about 11 seconds to run. To reduce this, I split the work into two phases, a pre-digestion phase and a report-generation phase, which has helped slightly.
The system keeps most of its data in MySQL on AWS RDS. It allows users to upload images (contracts, patents, etc.) for use in reports, which I convert to thumbnails. Because of two libraries I use, one for CRUD and one for report generation, I must keep a copy of the images on the local hard disk of each server. Given the CPU-intensive nature of running a report, my plan is to put multiple servers behind a Lightsail load balancer. Each server must hold the same portfolio of images, because the load balancer decides which server a returning user lands on and I never know which one that will be. So I am looking to sync the image directories on each server every time a user logs in. An S3 bucket already contains all of the images: each time a user uploads an image, I push it along with its thumbnail to S3. Bottom line: the S3 bucket holds a copy of every image in the system, but each server must also keep a local copy of all of them.
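For context on what I mean by the on-login sync, here is a rough sketch of the step I had in mind, assuming the AWS CLI is installed on each server. The bucket name and local path are placeholders, not my real ones:

```shell
#!/bin/sh
# Hypothetical names -- substitute the real bucket and web root.
BUCKET="s3://example-image-bucket/images"
LOCAL_DIR="/var/www/app/images"

# `aws s3 sync` copies only objects that are new or changed, so after the
# first full copy, repeated logins would transfer very little.
SYNC_CMD="aws s3 sync $BUCKET $LOCAL_DIR --only-show-errors"
echo "$SYNC_CMD"
# In the login handler I would run $SYNC_CMD (e.g. via PHP's exec()).
```

This is only to illustrate the current plan, in case one of your solutions replaces this step entirely (e.g. a shared file system instead of per-server copies).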
So AWS support recommended your product as a possible solution. I see you offer a number of different solutions. Which (if any) would work best for me? Thx, Rich, from Old New Castle, Delaware, USA.