Help with the cloud architecture


I need some feedback on my current design.

Problem: I have 5 Python scripts that need to be executed based on user selection. I also have a frontend where the user can see the list of scripts and choose which one to run. The user provides some input (strings), which is passed to the script; the script then starts, runs for about 3 minutes, and sends the response back to the user. Some of the scripts may require input multiple times: the user at the frontend is notified, enters the input (strings), and the script continues.

Solution 1: I am using Nomad with a client that has 8 GB RAM. I have built a Docker image for each of the five scripts. I am also using Redis to pass input from the backend to the container. Every time a user opens the browser tab with the script selection, I ask Nomad to run the job, and the job waits for input; user input is taken by the backend and placed in Redis, from which the script fetches it (possibly multiple times). The job remains active until the user closes the tab for that script; as soon as the tab is closed, the Nomad job is killed.
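For reference, the script-side input loop in Solution 1 might look roughly like this. This is a minimal sketch, not the actual implementation: the per-session key naming (`script_input:<session_id>`) and the `SESSION_ID` / `REDIS_HOST` environment variables are assumptions.

```python
import os
import redis

# Connect to the Redis instance shared with the backend
# (host/port are assumptions; adjust to your deployment).
r = redis.Redis(host=os.environ.get("REDIS_HOST", "localhost"),
                port=6379, decode_responses=True)

# The backend would push user input onto a per-session list;
# this key naming scheme is hypothetical.
session_id = os.environ["SESSION_ID"]
input_key = f"script_input:{session_id}"

def wait_for_input(timeout=300):
    """Block until the backend pushes the next user input, or time out."""
    item = r.blpop(input_key, timeout=timeout)
    if item is None:
        raise TimeoutError("No user input received before timeout")
    _, value = item
    return value

# Example flow: the script blocks on Redis whenever it needs input.
first_value = wait_for_input()
# ... ~3 minutes of processing ...
second_value = wait_for_input()  # some scripts need input multiple times
```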

Solution 2: Combine all the scripts into a single Docker image (which script to run for which user command is handled internally) and keep two instances of that job running via Nomad. Whenever a user requests script execution, the backend takes the input from the user and puts it in Redis, where it is picked up by one of the two running jobs. Once a job starts processing, a new instance is started (I am not sure how to achieve this; I am thinking of making the Nomad job a batch job so it stops after executing, and every time a script starts execution a new job is launched that waits on Redis while the other job finishes and stops). In effect, for N active users there are N+2 instances of the job, which reduces waiting time, and we do not need to wait for the user's browser tab to close to kill a job: as soon as its work is done it is killed, and the two idle job instances are maintained. See the dispatch sketch below.
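If you go the batch-job route in Solution 2, Nomad's parameterized batch jobs together with the job dispatch HTTP API are the usual mechanism for "start a new instance per request". A rough sketch of how the backend could dispatch a fresh instance, assuming a parameterized batch job already registered under the hypothetical name `runner` and a Nomad agent on the default address:

```python
import requests

NOMAD_ADDR = "http://localhost:4646"  # assumption: local Nomad agent

def dispatch_runner(session_id: str, script_name: str) -> str:
    """Dispatch one instance of a parameterized batch job ('runner' is a
    hypothetical job name) and return the dispatched job's ID."""
    resp = requests.post(
        f"{NOMAD_ADDR}/v1/job/runner/dispatch",
        json={"Meta": {"session_id": session_id, "script": script_name}},
    )
    resp.raise_for_status()
    return resp.json()["DispatchedJobID"]
```

Each dispatched instance would then wait on Redis (as in Solution 1) and exit when its script finishes, which keeps the "kill as soon as work is done" property without tracking browser tabs.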

Let me know which of these is the better design, or whether there is a better way altogether.

I am expecting ten users per day.

1 Answer

My impression is that by Nomad you mean this: https://www.hashicorp.com/products/nomad

I am not sure if you are specifically tied to Nomad / Docker / Redis. If you are, I can't offer much insight there, since I have not used Nomad; the key question will be which AWS services it can call / interact with.

If you are not tied to your existing stack, there are a couple of simple and proven AWS / Cloud patterns that would potentially work well for you since you are running Python scripts.

  1. Use API Gateway to host endpoints that are called when the user selects a script from your list. API Gateway proxies to a Lambda (in Python) that executes your script logic and returns a result to the client side (https://docs.aws.amazon.com/solutions/latest/constructs/aws-apigateway-lambda.html). The Lambda can run for up to 15 minutes if needed. Call API Gateway from Nomad (if possible), or create a simple web page to offer the options, collect the input data, and call API Gateway. Host the web page in S3/CloudFront (https://docs.aws.amazon.com/solutions/latest/constructs/aws-cloudfront-apigateway-lambda.html). See the handler sketch after this list.
  2. Use API Gateway and have it proxy to SQS to queue the requests (https://docs.aws.amazon.com/solutions/latest/constructs/aws-apigateway-sqs.html). Then you can have an SQS trigger for your Python Lambdas, or a Docker container long-polling the SQS queue, to process the request. Same front end as #1.
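A minimal sketch of what the Lambda behind API Gateway in option 1 could look like. This assumes a Lambda proxy integration, and the `script_name` / `inputs` request fields are assumptions about how your frontend would send its data, not anything prescribed by AWS:

```python
import json

# Hypothetical registry mapping a script name to a callable;
# in practice each entry would wrap one of your five existing scripts.
SCRIPTS = {
    "script_a": lambda inputs: f"ran script_a with {inputs}",
    # ...
}

def handler(event, context):
    """Lambda proxy integration handler behind API Gateway."""
    body = json.loads(event.get("body") or "{}")
    script_name = body.get("script_name")
    inputs = body.get("inputs", [])

    if script_name not in SCRIPTS:
        return {"statusCode": 400,
                "body": json.dumps({"error": "unknown script"})}

    result = SCRIPTS[script_name](inputs)
    return {"statusCode": 200, "body": json.dumps({"result": result})}
```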

In any case, you appear to be using Redis as a queue. AWS has SQS, which is probably a better option for this use case. But I think you may be using Redis because it is a data source available to Nomad, and SQS may not be.
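If you did switch the queueing from Redis to SQS, the producer/consumer code is fairly small with boto3. A sketch, assuming standard AWS credentials are configured; the queue name and session ID below are placeholders:

```python
import boto3

sqs = boto3.resource("sqs")
queue = sqs.get_queue_by_name(QueueName="script-input-queue")  # placeholder name

# Backend side: enqueue the user's input for a given session.
queue.send_message(
    MessageBody="user input string",
    MessageAttributes={
        "session_id": {"StringValue": "abc123", "DataType": "String"},
    },
)

# Worker side (Lambda trigger or a container): long-poll for the next message.
for message in queue.receive_messages(WaitTimeSeconds=20,
                                      MaxNumberOfMessages=1,
                                      MessageAttributeNames=["All"]):
    print(message.message_attributes["session_id"]["StringValue"], message.body)
    message.delete()  # acknowledge once processed
```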

