Flaire — From Product Feedback to Market Activation
At Flaire, I worked across both product usability and early-stage marketing strategy. Rather than focusing on just one lane, I approached growth holistically: understanding the product experience first, then thinking about how it could be positioned and expanded.


Researching Generative AI Adoption & Workplace Behavior
At UC Berkeley Haas, I worked as a research assistant analyzing how generative AI tools influence workplace behavior. My role combined qualitative and quantitative analysis, structured data organization, and emerging AI-assisted workflows to identify behavioral patterns across technical communities.
Research Question
The core question was behavioral:
When AI begins assisting with code review and generation, how does that change human interaction within technical communities?
We examined whether:
- Developers relied less on each other for feedback
- Comments became shorter or more transactional
- Constructive critique declined
- Engagement patterns shifted pre- vs. post-AI integration
The goal was not to evaluate the quality of AI output, but to understand its social impact.
Methodology
To investigate this, I analyzed over 50 GitHub repositories, reviewing:
- Code comments
- Pull request discussions
- Feedback exchanges
- Patterns of engagement between contributors
Because the data was largely unstructured, I:
- Built structured tracking systems in Excel
- Categorized feedback types (constructive, corrective, automated-style, etc.)
- Compared behavioral patterns across time periods
- Identified shifts in tone, depth, and interaction frequency
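The behavioral coding step can be sketched in code. Below is a minimal, hypothetical Python example using keyword heuristics for the feedback categories named above. The actual coding was done manually in Excel, and these category rules are illustrative, not the study's real coding scheme.

```python
from collections import Counter

# Hypothetical keyword heuristics for the feedback categories used in the study.
# The real qualitative coding was done by hand; these rules are illustrative only.
CATEGORY_KEYWORDS = {
    "constructive": ["consider", "suggest", "what if"],
    "corrective": ["fix", "bug", "wrong", "error"],
    "automated-style": ["lgtm", "approved", "ship it"],
}

def code_comment(text: str) -> str:
    """Assign a comment to the first matching feedback category."""
    lowered = text.lower()
    for category, keywords in CATEGORY_KEYWORDS.items():
        if any(kw in lowered for kw in keywords):
            return category
    return "other"

def category_counts(comments: list[str]) -> Counter:
    """Tally feedback categories, e.g. for one repository or time period."""
    return Counter(code_comment(c) for c in comments)

# Compare a pre- vs. post-AI-integration sample (toy data).
pre = ["Consider extracting this into a helper", "This loop is wrong, off by one"]
post = ["LGTM", "Approved, ship it"]
print(category_counts(pre))
print(category_counts(post))
```

Counting categories per time period is what makes shifts visible: a rising share of "automated-style" confirmations relative to "constructive" comments is exactly the kind of pattern the analysis tracked.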
I also used prompt engineering techniques to guide AI tools in organizing and clustering large volumes of text, while manually validating outputs to ensure reliability.
This combination of structured coding and AI-assisted organization allowed us to scale the analysis responsibly.
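One way to make "manually validating outputs" concrete is spot-checking a random sample of AI-assigned labels against human judgments and reporting an agreement rate. A minimal sketch of that check, assuming label dictionaries keyed by comment ID (a hypothetical structure, not the project's actual tooling):

```python
import random

def sample_for_review(ai_labels: dict[str, str], k: int, seed: int = 0) -> list[str]:
    """Draw a reproducible random sample of comment IDs for manual review."""
    rng = random.Random(seed)
    return rng.sample(sorted(ai_labels), k)

def agreement_rate(ai_labels: dict[str, str], human_labels: dict[str, str]) -> float:
    """Share of human-reviewed comments where the AI label matches."""
    matches = sum(ai_labels[cid] == human_labels[cid] for cid in human_labels)
    return matches / len(human_labels)

# Toy example: AI clustered four comments; a human re-coded two of them.
ai = {"c1": "constructive", "c2": "corrective", "c3": "automated-style", "c4": "other"}
human = {"c1": "constructive", "c3": "corrective"}
print(sample_for_review(ai, k=2))
print(agreement_rate(ai, human))  # 0.5: one of two reviewed labels matched
```

Fixing the random seed keeps the review sample reproducible, which matters when validation results are reported back to a research team.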
Behavioral Observations
Through the analysis, several patterns emerged:
- Feedback appeared more efficient but sometimes less elaborative post-AI integration
- Certain repositories showed reduced back-and-forth discussion
- In some cases, comments shifted toward shorter confirmations rather than collaborative explanation
The research explored whether AI tools were subtly reshaping how contributors interacted, not just how they coded.
Reporting & Synthesis
I synthesized findings into structured summaries and presented insights to a PhD researcher.
This required translating technical repository activity into broader behavioral themes, such as:
- Automation trust
- Community dependency
- Constructive critique patterns
- Engagement density over time
Clarity and pattern recognition were central to the role.
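A theme like "engagement density over time" can be operationalized as, say, average comments per pull request per period. A hypothetical sketch of that metric (the field names are assumptions, not the study's actual schema):

```python
from collections import defaultdict

def engagement_density(events: list[dict]) -> dict[str, float]:
    """Average comments per pull request, grouped by period.

    Each event is assumed to look like:
    {"period": "2023-01", "pr": 42, "comments": 3}
    """
    totals = defaultdict(int)
    prs = defaultdict(set)
    for e in events:
        totals[e["period"]] += e["comments"]
        prs[e["period"]].add(e["pr"])
    return {p: totals[p] / len(prs[p]) for p in totals}

# Toy data: discussion thins out in the later period.
events = [
    {"period": "2023-01", "pr": 1, "comments": 6},
    {"period": "2023-01", "pr": 2, "comments": 4},
    {"period": "2023-06", "pr": 3, "comments": 2},
    {"period": "2023-06", "pr": 4, "comments": 1},
]
print(engagement_density(events))  # {'2023-01': 5.0, '2023-06': 1.5}
```

A declining density across periods is one quantitative signal of the reduced back-and-forth discussion described above.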
Skills Applied
Analytical Tools & Methods
- Excel-based dataset organization
- Qualitative behavioral coding
- Prompt engineering for text clustering
- Pattern recognition across decentralized systems
- Insight reporting to academic leadership
Why This Matters for Marketing & Product
This research reinforced an important insight:
When technology automates tasks, it also reshapes human interaction patterns.
The same dynamic applies in marketing and product:
- Automation can increase efficiency but reduce emotional connection
- Trust and community norms influence adoption
- Behavioral shifts often occur subtly before they become visible
Understanding these dynamics helps inform how products are positioned, introduced, and supported in real-world communities.