Notes taken between: 02-08 February 2019
On-demand, with no idle-time penalty: the cost model resembles a utility rather than a commodity. Cold start: like an Uber, you must wait for it to arrive. EC2 instances hold things in state; Lambdas die after the request and can't keep state.
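One wrinkle on the statelessness point: module-level variables do survive while the same container stays warm, but a cold start wipes them, so they can't be relied on. A minimal sketch (the handler and counter names are illustrative):

```python
# Sketch of Lambda container reuse: module-level state survives only
# while the same container stays warm; a cold start resets it to 0.
invocation_count = 0  # re-initialised on every cold start

def handler(event, context):
    global invocation_count
    invocation_count += 1  # only persists across *warm* invocations
    return {"warm_invocations_so_far": invocation_count}
```

This is why caches in Lambda globals are a best-effort optimisation, never a source of truth.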
Before the cloud, data-centre costs were: employees, software for the servers to communicate, electricity, cooling, infra staff. At SEEK it took months to get new (blade) servers.
- pay as you go; you no longer pay for idle time.
A server is now an operational cost (not a depreciating asset anymore).
EC2 virtual server: you remote into it.
Use a Lambda if you don't need a server sitting there. Free tier: 400,000 GB-seconds each month. A 512 MB function run 3 million times/month at ~1 s average costs about USD $18.34/month.
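The $18.34 figure falls out of the per-GB-second rate after the free tier. A quick check (compute charge only, ignoring the separate per-request charge; rate as of these 2019 notes):

```python
def monthly_lambda_cost(memory_gb, invocations, avg_seconds,
                        free_gb_seconds=400_000,
                        price_per_gb_second=0.00001667):
    """Compute-only Lambda cost after the monthly free tier."""
    gb_seconds = memory_gb * invocations * avg_seconds
    billable = max(0, gb_seconds - free_gb_seconds)
    return billable * price_per_gb_second

# 512 MB function, 3 million invocations/month, ~1 s average:
cost = monthly_lambda_cost(0.5, 3_000_000, 1)
print(round(cost, 2))  # 18.34
```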
SEEK has ~1000 EC2 instances running. Look up: Simon Wardley; resilience engineering.
Infrastructure as code: define AWS resources as code, e.g. the Serverless Framework for Lambda.
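In the Serverless Framework, "resources as code" means declaring functions in a serverless.yml. A minimal sketch (service, handler, and runtime names are illustrative, not from the course):

```yaml
# serverless.yml -- minimal sketch, names are illustrative
service: notes-demo

provider:
  name: aws
  runtime: python3.7
  stage: ${opt:stage, 'dev'}   # default stage; overridden by --stage

functions:
  hello:
    handler: handler.hello      # handler.py -> def hello(event, context)
    events:
      - http:
          path: hello
          method: get
```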
Can run a max of 15 minutes; 1,000 concurrent invocations. TODO: build an Alexa skill.
Event-driven architecture: an architectural pattern of modularised, separated, decoupled systems (supports agile working).
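The decoupling idea in miniature: publishers and subscribers only share an event name, never a direct call. A toy in-process event bus (not an AWS service, just the pattern):

```python
from collections import defaultdict

class EventBus:
    """Toy event bus: handlers register for an event name; publishers
    never know who, if anyone, is listening."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, event_name, handler):
        self._subscribers[event_name].append(handler)

    def publish(self, event_name, payload):
        # Fan out to every subscriber; returns their results in order.
        return [h(payload) for h in self._subscribers[event_name]]

bus = EventBus()
bus.subscribe("file.uploaded", lambda p: f"virus-scan {p}")
bus.subscribe("file.uploaded", lambda p: f"index {p}")
print(bus.publish("file.uploaded", "resume.pdf"))
# ['virus-scan resume.pdf', 'index resume.pdf']
```

Swapping in SNS/SQS or S3 events gives the same shape across process boundaries.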
S3 bucket names are globally unique. Only make a bucket public if you want it to serve a site. SEEK has its own CRA, called skew.
SEEK example: quarantined S3 buckets scan résumés on upload. The upload event triggers a Lambda function to virus-check the file.
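A hedged sketch of that trigger pattern, assuming the standard S3 put-event shape and a hypothetical `scan_file` helper (this is not SEEK's actual code):

```python
import urllib.parse

def scan_file(bucket, key):
    # Hypothetical scanner: a real one would fetch the object (e.g. via
    # boto3) and run it through something like ClamAV.
    return "clean"

def handler(event, context):
    """Triggered by s3:ObjectCreated:* on the quarantine bucket."""
    results = []
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        # S3 URL-encodes keys in event payloads, e.g. spaces become %20
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
        results.append((key, scan_file(bucket, key)))
    return results
```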
Lambdas need API Gateway to receive traffic.
WebSocket: a keep-alive connection, so the client doesn't need to constantly make requests.
CORS: ticking "enable CORS" means allowing requests to come from everywhere; by default, requests from different URLs aren't allowed. The API can state which URLs may request from it.
API Gateway is the proxy in front of our Lambda.
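With API Gateway's Lambda proxy integration, the function itself shapes the HTTP response, including the CORS header. A sketch (the allowed origin is an example; restricting it beats `*`):

```python
import json

def handler(event, context):
    # Proxy integration: the Lambda returns the full HTTP response shape.
    return {
        "statusCode": 200,
        "headers": {
            # State exactly which origin may call this API, per the CORS
            # note above, rather than allowing everywhere with "*".
            "Access-Control-Allow-Origin": "https://example.com",
        },
        "body": json.dumps({"message": "hello"}),
    }
```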
You can deploy and have dev and production stages.
serverless deploy --stage prod --verbose
This sets up 2 stacks!
Use lambda layers to group your common dependencies.
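In the Serverless Framework a shared-dependency layer can be declared roughly like this (layer name and path are illustrative; the `...LambdaLayer` reference follows the framework's naming convention for generated CloudFormation resources):

```yaml
layers:
  commonDeps:
    path: layer             # directory holding the shared libraries
    compatibleRuntimes:
      - python3.7

functions:
  hello:
    handler: handler.hello
    layers:
      # CloudFormation ref the framework generates for layer "commonDeps"
      - { Ref: CommonDepsLambdaLayer }
```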