Chamber of Tech Secrets #3
Generative AI is an unprecedented force multiplier for technologists
The Chamber of Tech Secrets is open.
Generative AI will 5-10x developer productivity
Generative AI hype is at a level I can't recall seeing in all my years working with technology. I find it fascinating to observe the extreme perspectives that exist on Generative AI. I sort them into two main camps:
The AI curmudgeon camp: One extreme declares Generative AI to be a trivial parlor trick that is frequently wrong and has no real practical use. I'll include those who are completely aware of ChatGPT (or others) but haven't bothered experimenting because there's too much hype and that wouldn't be cool. Their motto: "AI has a long way to go".
The ChatGPT will eliminate poverty, extend human life to 1,000 years, and watch my baby while I go on a date night camp: The other extreme has filled Twitter with optimism about a better future, much like the crypto bros of past years. Their motto: "ChatGPT is the future of the galaxy".
Of course, the vast majority of people tend to sit in the quiet and rational middle and have a less-fun-to-talk-about but more practical perspective.
I expect Generative AI to be a massive force multiplier for software engineers in the immediate term. My hypothesis is that it will 5-10x developer productivity for those who master prompt engineering.
Generating code and tests: The most obvious application is generating code at the beginning of a project (or for a new feature). My experiments with creating simple applications have worked extremely well. Just for fun, I generated a Mountain Bike app that had a Golang CRUD API and PostgreSQL database. With minimal prompting, I got something functional. I requested unit tests, which were successful as well.
Generating configuration files: I have always found generating and tweaking configuration files tedious and time-consuming, so I asked ChatGPT to help. First I got a functional Dockerfile, along with nice reminders about the commands to run. I also created some YAML files for deploying to Kubernetes in AWS, using R53 for DNS and AWS Elastic Load Balancers via K8s controllers. Everything looks good. Now imagine how easy some of these things can get if ChatGPT has context about your environment and can generate accordingly, perhaps using your Terraform as an input?
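For flavor, here is the general shape of Dockerfile such a prompt produces for a Go service. The base images, binary name, and port are illustrative assumptions, not the exact output I received:

```dockerfile
# Hypothetical multi-stage build for a Go API like the MTB demo app.
FROM golang:1.20 AS build
WORKDIR /src
COPY . .
RUN CGO_ENABLED=0 go build -o /bin/mtb-api .

# Small runtime image with just the static binary.
FROM gcr.io/distroless/static
COPY --from=build /bin/mtb-api /mtb-api
EXPOSE 8080
ENTRYPOINT ["/mtb-api"]
```

Helpfully, the response also includes the `docker build` and `docker run` commands to go with it, which is exactly the kind of reminder I'd otherwise go look up.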
Answering questions about code: ChatGPT has been trained on open source GitHub projects, which means it's capable of answering questions about the code from open source projects (or any project it's trained on).
I asked a bunch of questions about the Kubernetes code base and got some good answers sending me to the right places. While the training data is from 2021 and therefore not super useful, you can imagine a more up-to-date public version in the near future (or you could train a private version on the code bases you care about, both OSS and your own private code). The ability to ask questions about a project and gain context quickly is definitely a force multiplier. "Where is the method that does X?" or "What does this method do?" or "I want to understand this project: where do I start?" seem like no-brainer questions when you are new to a project.
Converting X to Y: We have a lot of CloudFormation scripts at work, and I wondered how well ChatGPT could convert X to Y if we wanted to, say, migrate to Terraform. This worked really well. I also converted a simple app from Go to Rust. This is a powerful time-saver when rearchitecting. I know there are input limits on prompts that may make mass conversion of large config files or entire applications a challenge, but I expect that to get resolved in time.
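To give a sense of the mapping involved, here is a trivially small illustration of the kind of translation a conversion prompt handles, using a hypothetical S3 bucket rather than one of our actual scripts. The CloudFormation input:

```yaml
# Hypothetical CloudFormation resource (not from our real scripts)
Resources:
  AppBucket:
    Type: AWS::S3::Bucket
    Properties:
      BucketName: mtb-app-assets
```

And the Terraform a good conversion should produce:

```hcl
# Equivalent Terraform resource
resource "aws_s3_bucket" "app_bucket" {
  bucket = "mtb-app-assets"
}
```

Multiply that by hundreds of resources with cross-references and conditions, and the time savings become obvious.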
Learning how to do things you don't know how to do, and quickly: I pretended to be unfamiliar with HashiCorp Nomad and asked to deploy my MTB application there instead of K8s. ChatGPT produced a functional Nomad job. It used Consul and Fabio, and it explained what I'd need to do if I wanted to use AWS controller components instead, since it had my Kubernetes-on-AWS questions in historical context. Pretty cool. In my experience, you can follow any of these threads with follow-up questions and get good answers if you know what to ask for.
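For readers who have never seen one, a Nomad job for a containerized service is compact; the sketch below is roughly the shape of such a job, though the job name, datacenter, image, and port are my placeholder assumptions, not the generated output:

```hcl
# Minimal Nomad job sketch for a Dockerized API (illustrative names)
job "mtb-api" {
  datacenters = ["dc1"]

  group "api" {
    count = 1

    network {
      port "http" {
        to = 8080
      }
    }

    task "server" {
      driver = "docker"

      config {
        image = "mtb-api:latest"
        ports = ["http"]
      }
    }
  }
}
```

If you know Kubernetes Deployments, the concepts map over quickly, and ChatGPT is happy to narrate the differences as it goes.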
Generating functional apps from notebook paper drawings: This one really blows me away. It's rough around the edges, but the potential to co-design with an AI that can keep context and iterate based on feedback is pretty incredible. The future will be different. Skip to 16:19 if you just want to see the notebook-paper-to-application demo.
Serving as an embedded assistant: We are seeing an explosion of ChatGPT integrations in every tool (I won't even bother listing them). GitHub Copilot X is noteworthy since there is a lot of usage in enterprises and the open source community. I haven't had any hands-on experience yet, so I will reserve judgment. VS Code is a hotspot as well, with helpful but still limited capabilities around all kinds of technologies, including Snowflake. This is just going to continue, and the software developer will be a continued winner from the surrounding innovation.
Plugins!: The ability for ChatGPT to train on and execute plugins opens up a new chapter of possibilities in which AI can make things happen in the real world. It also cracks the door open on supplementing the 2021 training cutoff with up-to-date data. Right now this is fairly limited to life-changing things like sports scores and booking travel, but more will come…and fast. The productivity boost will be real, and I expect an explosion of plugins as soon as OpenAI opens the gate. I checked out the form to request access to create a plugin, and they ask a lot of questions, probably for safety reasons in the early days. The fear of the Singularity is real. For those who are allowed to build a plugin, the creation process looks straightforward: build an API that does a thing, document it using the OpenAPI spec, and create a JSON manifest file that defines the relevant metadata for interacting with the plugin. I like their architecture. This capability has lots of long-term implications. Do we need CLI tools in the future when we can just type (or say) "push this change (that we just developed together in 5 minutes) to my repo and let me know if there are any merge conflicts we need to resolve or if anything fails in the pipeline"? What if we can say "make sure there are no security vulnerabilities or breaking changes in this app and then push it to prod"? Wow.
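For reference, that manifest is a small JSON file (`ai-plugin.json`). The sketch below uses made-up names and URLs, and the field set reflects OpenAI's early plugin documentation as I understand it, so treat it as a shape rather than a spec:

```json
{
  "schema_version": "v1",
  "name_for_human": "Repo Helper",
  "name_for_model": "repo_helper",
  "description_for_human": "Push changes and check your pipeline.",
  "description_for_model": "Plugin for pushing commits and reporting CI status for the user's repositories.",
  "auth": { "type": "none" },
  "api": {
    "type": "openapi",
    "url": "https://example.com/openapi.yaml"
  },
  "logo_url": "https://example.com/logo.png",
  "contact_email": "dev@example.com",
  "legal_info_url": "https://example.com/legal"
}
```

Note the `description_for_model` field: it's effectively a prompt that tells the model when and how to call your API, which is a fascinating new kind of interface contract.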
This is going to be a fun technology conference season as everyone integrates with OpenAI everywhere on everything. I can see the keynote presentations already…
For what it's worth, Steve Yegge (author of the infamous "Google rant") agrees with me.
</hype>
Conclusions and Hypothesis
This isn't magic and it's not perfect, but if you know how to debug and make rapid changes to functional generated code, Generative AI looks like a huge force multiplier for developers. Everyone technical should be exploring the possibilities and refining their prompt engineering skills for optimal effectiveness.
Every software engineering organization should be thinking about how it is going to accommodate this tool and give it access to internal knowledge bases for training purposes.
Knowing what to ask for, and having enough technical context to know how to apply a response (or discard it and try again), is critical. If you know nothing about application and infrastructure development, deployment, or operations, you are still going to struggle to be productive.
Being a generalist will pay off when you have a code-trained specialist at your side. In the near future, I expect the most valuable skillset will be a broad technical base that makes learning new things easy and [bias alert] an architectural mindset that enables you to know what to ask Generative AI for and how to correct and assemble what you get in return.
I suspect there will be lots of future value as more developer tools adopt AI and as training on organizational knowledge improves. We do indeed have a long way to go, but it's pretty cool to have an assistant while you work… one that happens to know everything up to 2021 (and beyond via plugins). Even if it gets some math problems wrong. We're all human.
I am awaiting access to the Bard preview, so this has been ChatGPT-centric.
Secrets from the Edge… and Cloud
Leave the Cloud! Go to the Cloud! In the first article below, 37signals shares their cloud journey from their own data center to AWS ECS to Google Cloud GKE to AWS EKS, and finally away from the public cloud and Kubernetes altogether, instead embracing their own data centers again with a Docker architecture based on MRSK (which was built for Ruby, and 37signals is traditionally a Ruby shop, so no surprise). In the next article, we get the story of Uber moving away from their own data centers and to the public cloud as a result of the challenges they faced with lumpy capital expenditures and large-scale automation in the data center business.
The to-cloud journey is obviously not unique, as many organizations are still migrating to the cloud or doing new work there. The "from-cloud" or "on-prem-first" story is not completely unique either; I have read a lot of similar stories recently. If you throw in one of my favorite architecture components… Edge Computing… we have companies actively moving workloads from and to just about every possible infrastructure solution. This goes back to last week's letter, where we talked about the Compute Mesh, which I believe to be the future of infrastructure. The "right" place to run a workload depends on requirements (latency, bandwidth, legal), organizational capabilities (can you build and operate a data center? Edge?), the infrastructure vendor ecosystem (who can pull this off?), willingness to tolerate lumpy CapEx vs. smoother OpEx (in exchange for potentially higher baseline costs), interest in being different vs. being mainstream (which has talent implications), and of course organizational preference. Is anyone "wrong" for the decisions they are making?
Make everything into Terraform: I was interested in getting some real Terraform to play with (more about why another day), and I wanted something I had good context about. Enter Terraformer, an open source project created by an SRE on the Waze product at Google, which can scan many of the resources in your cloud accounts (and lots of other things) and turn them into Terraform. This is a great little tool if you are a Terraform shop (or want to be). And remember from above: if you need to convert from some other structured format, ChatGPT is your guy (or girl… or AI… whatever).
Many thanks for reading. If you have any feedback for me [especially constructive criticism], reply to this email or find me on LinkedIn. The Chamber of Tech Secrets is closed. See you next week.