CloudWatch Events is your “Cloud Cron”

Many web applications need certain batch jobs run on a schedule, and cron is everybody’s favorite tool for such jobs. In small web applications, cron jobs are scheduled on the same machine where the web server runs. This works fine at the testing stage. But as the customer base grows and you move to an auto-scaled model on Amazon Web Services, running such cron jobs becomes problematic: individual web servers become transient, and it is not clear which web server instance should run the jobs. Running a separate instance just for scheduled jobs is overkill.
AWS CloudWatch comes to the rescue. Apart from monitoring and logging, CloudWatch offers a handy event scheduling mechanism. Events can be triggered on a cron-like schedule and linked to a Lambda function that actually performs the batch job. Since Lambda is billed for the compute time actually used, it is a great way to avoid the unnecessary cost of running an entire server just for the sake of batch jobs.
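A minimal sketch of wiring this up with boto3 might look like the following. The rule name, cron expression, and Lambda ARN are hypothetical examples, and this assumes the Lambda function already exists and permits invocation by CloudWatch Events:

```python
# Sketch: schedule a Lambda function via a CloudWatch Events rule.
# Rule name, cron expression, and Lambda ARN below are hypothetical.

def nightly_report_rule(rule_name, cron_expression, lambda_arn):
    """Build the parameters for events.put_rule and events.put_targets."""
    rule = {
        "Name": rule_name,
        # CloudWatch cron expressions have six fields:
        # minutes hours day-of-month month day-of-week year
        "ScheduleExpression": "cron({})".format(cron_expression),
        "State": "ENABLED",
    }
    targets = {
        "Rule": rule_name,
        "Targets": [{"Id": rule_name + "-target", "Arn": lambda_arn}],
    }
    return rule, targets

rule, targets = nightly_report_rule(
    "nightly-report",
    "0 2 * * ? *",  # every day at 02:00 UTC
    "arn:aws:lambda:us-east-1:123456789012:function:nightly-report",
)

# With AWS credentials configured, the parameters would be applied as:
#   import boto3
#   events = boto3.client("events")
#   events.put_rule(**rule)
#   events.put_targets(**targets)
print(rule["ScheduleExpression"])  # → cron(0 2 * * ? *)
```

The actual `put_rule`/`put_targets` calls are left as comments so the sketch stays runnable without credentials.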


Watch your data traffic in AWS closely

Data traffic is an important cost element on AWS. Estimating the cost you may incur for data traffic requires a clear understanding of how it is metered and how the various types of traffic are priced. Two simple rules of thumb can help you optimize your data traffic cost.


  • Design the components of your deployment so that traffic stays within a single availability zone as far as possible, to the extent that this does not hurt your high-availability goals.
  • Use private IPs for communication within a VPC as far as possible, and avoid using public IPs/Elastic IPs.
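The second rule can be sketched as a small helper that always prefers an instance’s private address when building an internal endpoint. The helper name is hypothetical, and the instance description below only mimics the field names found in boto3’s `ec2.describe_instances` response, with made-up values:

```python
# Sketch: prefer private IPs when talking to peer instances in the same VPC.
# Field names follow boto3's describe_instances response; values are examples.

def internal_endpoint(instance):
    """Return the private IP of an instance, falling back to the
    public IP only when no private address is available."""
    return instance.get("PrivateIpAddress") or instance.get("PublicIpAddress")

peer = {
    "InstanceId": "i-0abc1234def567890",
    "PrivateIpAddress": "10.0.1.25",    # stays on the VPC network
    "PublicIpAddress": "54.210.16.93",  # leaves via the public side, metered
}

print(internal_endpoint(peer))  # → 10.0.1.25
```

Connecting to 10.0.1.25 rather than 54.210.16.93 keeps the traffic on private addressing, which is the cheaper path for intra-VPC communication.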
Learn more about how pricing works on AWS at