At the end of March I received some unfortunate news that many have been receiving lately - the company downsized and eliminated my position. This took me somewhat by surprise, since I was under the impression that things were OK and that our company was ahead of the curve on this (they had already let some people go over the summer). But I understand it's hard to avoid pressures that not even FAANG companies are insulated from. The silver lining in this? I have 45+ more hours per week to do whatever I please with. Time that cannot be bought anywhere else.
After nearly 7 years of working, I realized that I might not get this same opportunity again. So, earlier this month I began writing some apps again. I've conceived of, implemented, and deployed 3 apps so far - one per week. They range in complexity from a simple static site with some vanilla JS/CSS to a frontend + backend with a SQLite database. It seems like an efficient pace, and since I can't predict which idea is good or will succeed, I won't lose too much time on any one project. I also have access to some A.I. tools now, which I'm experimenting with and which seem to be adding real value to my development. I'm finding that ChatGPT is actually a nice programming assistant. It definitely cannot do everything I can do, but it answers my questions pretty well and has only hallucinated badly once (imaginary imports and functions).
What will the next generation of apps based on A.I. look like? I think it's already clear that information retrieval is a huge success - I'm testing the limits of this and how much of an "expert" a language model can be. Initial results are impressive but flawed. Hallucination is again the big problem. Still, each version improves so much that I'm confident the issue won't be nearly as severe in the future. Mechanisms/workarounds for this already exist - simply tell the agent it is wrong and it attempts to correct its answer. But for a model to be used in production across a broad range of industries, hallucinations are unacceptable. Imagine if it started giving you fake news and facts, dubious medical advice, or buggy code... That could give rise to a whole new class of lawsuits.
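The "tell it it's wrong" workaround can be sketched as a simple retry loop. This is a minimal illustration, not any particular vendor's API: `ask_model` and `looks_wrong` are hypothetical stand-ins (in practice, `ask_model` would call a real chat-completion endpoint with the conversation history, and `looks_wrong` would be a real validator such as a test suite or fact checker).

```python
def ask_model(history):
    # Hypothetical stub for a chat model. Here it "hallucinates" an
    # imaginary import on the first attempt and corrects itself after
    # being told it was wrong, mimicking the behavior described above.
    return "import nonexistent_lib" if len(history) == 1 else "import json"

def looks_wrong(answer):
    # Hypothetical validator: flags the known-bad import.
    return "nonexistent_lib" in answer

def ask_with_correction(question, max_retries=2):
    """Ask, then re-prompt with a correction message if the answer fails validation."""
    history = [question]
    answer = ask_model(history)
    for _ in range(max_retries):
        if not looks_wrong(answer):
            break
        # The workaround: append the bad answer plus a correction request,
        # then ask again with the fuller history.
        history += [answer, "That answer is wrong. Please correct it."]
        answer = ask_model(history)
    return answer
```

The catch, of course, is that this only helps when something (a compiler, a test, a human) can detect the hallucination in the first place - which is exactly why it isn't a production-grade fix on its own.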