I have an AWS Lambda function that is scheduled to run once an hour (as described here).
The function FTPs files from a data provider and copies them to S3.
I have a test environment and a production environment. For each environment, the FTP address and credentials are different.
How can I configure the Lambda function so it can be aware of which environment it’s running in, and get the FTP config accordingly?
The best way I can currently find to do that is as follows.
For the test version of the function, I am calling it `TEST-CopyFtpFilesToS3`, and for the production version I am naming the function `PRODUCTION-CopyFtpFilesToS3`. This allows me to pull out the environment name with a regular expression from the environment variable `AWS_LAMBDA_FUNCTION_NAME`.
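The extraction boils down to taking the (lowercased) prefix before the first hyphen in the function name. A minimal sketch, using the function name from the naming convention above:

```javascript
// The prefix before the first hyphen in the function name is the
// environment name; lowercase it to match the config file names.
const functionName = 'TEST-CopyFtpFilesToS3';
const environment = /^(.*?)-.*/.exec(functionName)[1].toLowerCase();
console.log(environment); // prints "test"
```

Note that `.exec()` returns `null` if the function name has no hyphen, so the naming convention has to be followed for this to work.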
Then I am storing `config/test.json` and `config/production.json` in the zip file that I upload as code for the function. This zip file is extracted into the directory referenced by `process.env.LAMBDA_TASK_ROOT` when the function runs, so I can load the appropriate file and get the config I need.
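For illustration, a hypothetical `config/test.json` might look like this (the field names here are assumptions, not part of any required schema):

```json
{
  "ftpHost": "ftp.test.example.com",
  "ftpUser": "test-user",
  "ftpPassword": "test-password",
  "s3Bucket": "my-test-bucket"
}
```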
Some people don’t like storing the config in the code zip file, which is fine – you can just load a file from S3 or use whatever strategy you like.
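The S3 variant can be sketched like this. The bucket name and the injected-client pattern are my assumptions; in Lambda you would pass in `new AWS.S3()` from the AWS SDK, whose `getObject(...).promise()` call this relies on. A stub client stands in for it here so the sketch is self-contained:

```javascript
// Sketch: read the per-environment config from S3 instead of the code
// bundle. The client is injected so it can be the real AWS SDK S3 client
// in Lambda, or a stub in local tests.
const readConfigFromS3 = (s3, bucket, environment) =>
  s3.getObject({ Bucket: bucket, Key: `config/${environment}.json` })
    .promise()
    .then((res) => JSON.parse(res.Body.toString('utf8')));

// Stub standing in for the AWS SDK client; returns a canned config body.
const stubS3 = {
  getObject: () => ({
    promise: () =>
      Promise.resolve({ Body: Buffer.from('{"ftpHost":"ftp.test.example.com"}') }),
  }),
};

readConfigFromS3(stubS3, 'my-config-bucket', 'test')
  .then((config) => console.log(config.ftpHost)); // prints "ftp.test.example.com"
```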
Code for reading the file from the zip:

```javascript
const fs = require('fs');

const readConfiguration = () => {
  return new Promise((resolve, reject) => {
    // The environment name is the lowercased prefix of the function name,
    // e.g. "TEST-CopyFtpFilesToS3" -> "test".
    const environment = /^(.*?)-.*/
      .exec(process.env.AWS_LAMBDA_FUNCTION_NAME)[1]
      .toLowerCase();
    console.log(`environment is ${environment}`);

    // Load the per-environment config file bundled in the deployment zip.
    fs.readFile(`${process.env.LAMBDA_TASK_ROOT}/config/${environment}.json`, 'utf8', (err, data) => {
      if (err) {
        reject(err);
      } else {
        const config = JSON.parse(data);
        console.log(`configuration is ${data}`);
        resolve(config);
      }
    });
  });
};
```