RE: I'm actually excited about the future of ForeFlight

Overview

I never thought I'd write this. While I work extensively on public projects, I rarely comment on my personal life. However, I've realized I have a unique perspective worth sharing. Will some people strongly disagree with what I say? Absolutely. Do I think it's worth sharing regardless? Without question.

Over the past four years, I worked at ForeFlight. In late 2025, private equity firm Thoma Bravo finalized its acquisition of the company. On Wednesday, January 14th, 2026, I woke up to the news that I had been laid off. Based on this Reddit post, about 40-50% of the company was let go. Was I shocked? Yes and no. Have I fully processed it? Not yet.

The Reddit Post

Later that week I came across a Reddit post titled "I'm actually excited about the future of ForeFlight." Here is the complete post:

The era of vibecoding and AI is here. I’m sorry to those who lost their jobs but private equity sees the vision that sometimes a passionate developer can’t because they’re too caught up in Aviation. Unless the app actively stops working, this was 100% the right call to outsource engineering for the future of foreflight. I’m excited!

I want to share my firsthand experience with AI in software development and explain why I believe this perspective is severely misguided.

You might be thinking, "Why respond to an anonymous Reddit post? Reddit is full of hate anyway." Fair point. However, the ideas this user shares aren't unique to Reddit. The entire industry is buzzing with similar sentiments, and companies are making life-altering decisions based on them. I think it's important to address these misconceptions head-on.

I'm extremely excited about the future of AI. I'm a moderator on the r/GithubCopilot subreddit, and I use AI tools daily. Beyond that, I'm a builder. Creating new technology brings me immense joy. Having an idea and being able to build it is an incredible feeling.

My Time at ForeFlight

One of my final projects at ForeFlight was analyzing the feasibility and effort required to migrate a foundational component of our system to a new API that Apple released in 2021. The code in question hadn't been touched in over a decade. I determined that the migration would require significant effort and would come with real risks, something that needed careful planning and couldn't be accomplished in a single sprint. I analyzed the requirements and wrote a detailed report on what the migration would entail.

However, I made one critical miscalculation. I correctly identified that a major risk was the lack of automated testing around this feature. Without tests, we'd have no reliable way to verify that the refactored code wouldn't break existing functionality. My error was assuming that AI could reliably generate automated tests for this legacy code, giving us a baseline level of confidence before we even added the project to our roadmap.
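To make concrete what I was hoping AI could produce: the standard approach here is a "characterization test," which pins down whatever the legacy code currently does so that a later refactor can be checked against it. The sketch below is purely hypothetical, written with Apple's XCTest framework; `LegacyRoutePlanner` and everything else in it are invented names for illustration, not anything from the ForeFlight codebase.

```swift
import XCTest

// Hypothetical sketch of a characterization test. A test like this
// doesn't judge whether the legacy behavior is *correct*; it records
// what the code does today so any change in behavior is detected.
final class RoutePlannerCharacterizationTests: XCTestCase {
    func testExistingBehaviorIsPreserved() {
        // Exercise the legacy API exactly the way production code does.
        let planner = LegacyRoutePlanner()
        let route = planner.plan(from: "KAUS", to: "KHOU")

        // Assert on whatever the current implementation returns,
        // right or wrong, so a refactor can be verified against it.
        XCTAssertEqual(route.waypoints.count, 4)
        XCTAssertEqual(route.waypoints.first?.identifier, "KAUS")
    }
}
```

The hard part, and the part I hoped AI could handle, is the setup: legacy code rarely lets you construct something like `LegacyRoutePlanner` in isolation, because it drags in global state, singletons, and dependencies that were never designed to be instantiated in a test.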

After all, AI excels at pattern matching and understanding code relationships. It should be trivial for AI to analyze working code and generate comprehensive tests. So I dove in using Claude Sonnet 4.5 (widely regarded as the most advanced AI model for coding at the time). I spent several days hitting one roadblock after another, not for lack of prompting skill, but because legacy codebases are simply hard. After days of effort, I realized I was wasting work time on an approach that wasn't viable. Still, I didn't want to give up. I spent evenings and time over winter break continuing the attempt. Not to refactor a feature. Not to build something new. Just to write tests for existing code.

It failed.

Legacy codebases are hard, even for the best engineers. Humans (and AI) make mistakes, and those tiny errors accumulate over time, creating increasingly difficult-to-maintain code. Add to that the constant tension between prioritizing feature development versus fixing technical debt, and even AI tools can't make sense of it all.

During my time at ForeFlight, leadership prioritized modernization, bug fixes, and foundational improvements. My team was responsible for one of the most critical components of the application. If you've used ForeFlight, the code I wrote ran for every single user, every single time they used the application.

Much of the codebase is written in Objective-C (created in the early 1980s), with Swift (introduced in 2014) representing only a fraction of the total code. While the amount of Swift has grown dramatically over the years, the sheer age and complexity of an application like ForeFlight means it remains a small portion of the overall codebase. That legacy foundation makes the code particularly challenging to work with, even as modern Swift practices have been increasingly adopted.

On top of this, AI particularly struggles with iOS development. Models increasingly lean on synthetic training data (examples the AI generates for itself), but that can't compensate for the locked-down nature of Apple's ecosystem: far less iOS code is open source for models to learn from, and sparse or incomplete Apple documentation further limits the available training data.

The Reality of AI in Software Development

As I said, I love building technology. I view AI as an incredible tool that helps me build more. Getting laid off hasn't changed my excitement about AI's future. Companies will eventually realize that AI is a tool, not a replacement. Some companies have already figured this out. Cloudflare increased their internship program by nearly 2,000% after recognizing that AI can supercharge productivity rather than replace engineers. [1] [2]

Even with rumors about AGI being just around the corner, I don't believe AI will replace human engineers anytime soon. There will always be a place for builders. The founders of ForeFlight saw the iPad as a tool and built an incredible product on top of it. AI is no different: it's a tool that builders can use to create remarkable things.

AI at ForeFlight

This isn't to say AI usage at ForeFlight was limited before the layoffs. Several teammates and I have a patent pending for an AI-powered tool that helps engineers troubleshoot customer problems faster. I was personally involved in multiple projects building AI technology. We were doing groundbreaking work with AI, even if it wasn't always visible to customers. We were on the cutting edge. Key people developing that AI technology were laid off. The notion that AI will now flourish at ForeFlight as a result of these layoffs is demonstrably false.

That said, we were deliberate and strategic with our AI usage. Not once did I feel that AI harmed our code quality. We had brilliant engineers who understood AI's limitations and used it where it made sense. We never blindly trusted AI output, and I never felt our use of AI posed any safety risk to users.

Before the next paragraph, I want to make it clear that I trust the people I worked with at ForeFlight deeply; words cannot express how amazing they were and are. Those who remain are some of the best people in the world, and they care deeply about safety. On an individual level, they should be trusted completely. The following section is purely about the organization and company as a whole. We worked as a team. Not as individuals.

Am I concerned for pilot safety as a result of these layoffs? Honestly, yes. The work we did every day saved lives. Every person at the company was keenly aware of this fact. Leadership regularly shared stories of pilots who are only alive today because of our work. I believe that eliminating up to half the workforce will negatively impact safety. Will it be obvious? No. Measurable? Probably not. But I believe the impact will be there. In an industry where every millisecond during an engine failure matters, ForeFlight must work flawlessly. There is no room for error. We treated every app crash, every freeze, every UI stutter as critical, spending countless hours reviewing reports to ensure they were fixed.

AI can't replace that level of dedication and expertise.