The Unexpected Reality
I thought being an analyst meant creating dashboards and running elegant queries on clean data. I discovered I was actually becoming a human data pipeline in a family-owned berry empire held together by Excel macros and sheer determination.
Why This Experience Mattered
I wasn't trying to become a career analyst at Driscoll's—I wanted to develop my analytical toolkit. After my MBA, I was hungry for hands-on experience with what I thought was the core of modern business: extracting insights from data. I wanted to master Excel, learn BI tools, and become what I called "a master data manipulator"—someone who could make numbers tell any story they needed to tell.
The Learning Laboratory
What I walked into was a fascinating organizational paradox. Driscoll's was a multi-billion-dollar agricultural operation run like a corner store. Because they were family-owned, they didn't face the same pressure to modernize their tech stack that venture-backed companies do. The patriarch could afford to let "that stuff wait." This created a unique environment where brilliant people were forced to push legacy systems far beyond their intended limits.
Every night, their ancient systems would churn out data in various formats. Then teams of analysts—myself included—would spend hours manually joining, transforming, and restructuring this information into reports that growers desperately needed to understand their operations. We weren't just analysts; we were the connective tissue between raw data and business decisions that affected thousands of acres and millions of dollars.
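That nightly joining-and-restructuring work is, at its core, a manual ETL step. A minimal sketch of what one pass might look like in pandas (the file layout, column names, and numbers here are all hypothetical, invented for illustration):

```python
import pandas as pd

# Hypothetical nightly exports from two legacy systems
yields = pd.DataFrame({
    "grower_id": [101, 101, 102],
    "date": ["2020-06-01", "2020-06-02", "2020-06-01"],
    "crates": [40, 35, 55],
})
growers = pd.DataFrame({
    "grower_id": [101, 102],
    "region": ["Watsonville", "Oxnard"],
})

# Join, transform, and restructure into the report growers need:
# one row per region per day
report = (
    yields.merge(growers, on="grower_id", how="left")
          .assign(date=lambda d: pd.to_datetime(d["date"]))
          .groupby(["region", "date"], as_index=False)["crates"].sum()
)
print(report)
```

The analysts were doing exactly this by hand in Excel every morning; the point of the sketch is how little code the same step takes once the flow is made explicit.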
My Systems Thinking Advantage
While other analysts were focused on mastering individual Excel functions, I became obsessed with mapping the entire information flow. I didn't just ask "how do I complete this report?" I asked "who uses this information, what decisions do they make with it, and where are the breaking points in this process?"
I started tracing problems backward through the data pipeline. When end users complained they couldn't get the insights they needed, I didn't just band-aid the final report—I followed the data upstream to understand why we were losing granularity at each transformation step.
Key Insights Gained
First insight: The data model is everything. You can have the most sophisticated analytical skills in the world, but if your underlying data structure is wrong, you're building on quicksand. I learned that spending 80% of your effort getting the data model right means the actual analysis becomes almost trivial.
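A concrete way to see this: the same facts laid out spreadsheet-style (one column per week) versus restructured into one row per observation. The column names and values here are illustrative, not from any real Driscoll's report:

```python
import pandas as pd

# Spreadsheet-style wide layout: every new week means a new column,
# and every analysis has to know about every column
wide = pd.DataFrame({
    "grower": ["A", "B"],
    "wk1": [10, 20],
    "wk2": [12, 18],
})

# Restructure to the right model: one row per (grower, week) observation
tidy = wide.melt(id_vars="grower", var_name="week", value_name="crates")

# With the model right, the analysis is a one-liner
weekly_totals = tidy.groupby("week")["crates"].sum()
print(weekly_totals)
```

The work went into the reshaping; the "analysis" at the end is a single groupby, which is the 80/20 split the insight describes.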
Second insight: Summarization is a one-way door. Once you've aggregated data, you can never disaggregate it with the same fidelity. This taught me to always preserve the most granular version possible and build summaries on top, rather than starting with summaries and trying to drill down.
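The one-way door is easy to demonstrate with a toy example (the growers and numbers are made up):

```python
import pandas as pd

# Granular source of truth: one row per grower per day
daily = pd.DataFrame({
    "grower": ["A", "A", "B", "B"],
    "day": ["Mon", "Tue", "Mon", "Tue"],
    "crates": [10, 12, 20, 18],
})

# Summaries are cheap, derived views built on top of the granular table...
weekly = daily.groupby("grower", as_index=False)["crates"].sum()

# ...but the reverse is impossible: from the weekly totals alone there is
# no way to recover the Mon/Tue split. Aggregation discarded it for good.
print(weekly)
```

If the granular `daily` table is thrown away after the report ships, no amount of analytical skill gets the day-level detail back, which is exactly what kept happening in those nightly pipelines.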
Third insight: In the age of cloud databases, the old trade-offs don't apply. Spreadsheets were invented because databases were hard to set up and maintain. But now you can spin up a PostgreSQL instance in minutes. I realized I was watching brilliant people work within artificial constraints that no longer existed.
How This Rewired My Thinking
This experience fundamentally changed how I approach any data problem. Now, when someone shows me a report or dashboard, I instinctively map the data journey backward. Where did this number come from? What transformations happened along the way? What context was lost?
I also developed what I call "data archaeology"—the ability to look at any organization's information flows and immediately spot where knowledge is getting trapped, distorted, or lost. Whether it's a startup's customer data or a Fortune 500's financial reporting, the same patterns emerge.
What I Bring Forward
I can walk into any organization and quickly diagnose their data ecosystem's health. More importantly, I understand that most "analytics problems" are actually "data engineering problems" in disguise. While others are debating visualization tools or statistical methods, I'm usually asking questions about data lineage, transformation logic, and structural integrity.
This isn't just technical knowledge—it's organizational intelligence. I've learned to see data flows as power structures, because information flow determines decision-making capability.
The Bigger Picture
The goal was never to become an Excel wizard—it was to understand how information actually moves through organizations and where it breaks down. At Driscoll's, I discovered that being an analyst isn't about having sophisticated tools; it's about being a systems thinker who can optimize the flow of information to improve decision-making quality.
Every organization, no matter how modern their tech stack appears, has some version of the "human data pipeline" problem. The companies that win are the ones that can see these invisible systems and engineer them better. That's the lens I now bring to every data challenge I encounter.