My Nine-Year Career Blind Spot: Learning About Coercion in America the Hard, Painful Way. A Message to My Young Romanian Peers.
For about nine years, I had what seemed like the perfect job for understanding hidden dangers. I worked in cyber threat intelligence, where my whole day was about finding vulnerabilities—flaws in software, patterns in hacker behavior, weaknesses that could be exploited.
The funny thing is, the most important weakness I needed to understand was right in front of me the whole time, and nobody ever talked about it in a briefing: how independence and autonomy quietly get eroded here in the U.S.
I showed up to America thinking I had a head start. Growing up in Romania after communism fell, I got a pretty direct education in how coercion works. You learn how authority can be used against you, how to spot when a group is pressuring you to fall in line, how systems can chip away at your ability to choose. I knew that old playbook.
So I figured, in America—with all the talk about freedom and rights—that playbook must be useless. I thought Americans, and especially the smart people I worked with in tech, just naturally knew how to protect their own freedom and the freedom of people around them. They seemed so confident about it.
Turns out, I had overestimated them.
I confused the talk about autonomy with an actual, working system for defending it. I saw the strong ideals and the loud arguments and assumed there was a whole infrastructure built to protect individual self-determination. I didn’t realize that believing “it can’t happen here” is actually the biggest weakness of all.
For nearly a decade, in meetings about digital threats, I used my ingrained, Romanian-built understanding of coercion—my sensitivity to subtle pressure and manipulation—to analyze computer systems. I could pick apart a malware attack, but I completely missed the social engineering attack happening in my own career. The real attacks weren’t in my inbox; they were in the feedback culture, the office politics framed as “help,” and the legal processes that can slowly box you in.
My coworkers and bosses were experts in digital security, but most were pretty unaware of these human-level “TTPs” (tactics, techniques, and procedures). They could design a defense for a server, but not for a person’s right to self-direction. The coercive patterns I recognized from my past—isolation, weaponized bureaucracy, being punished for not fitting into a simple box—were just written off as “office drama” or a “personality clash.”
I was basically a perfect early-warning system for this kind of thing, but nobody had the sensors turned on for it. My lived experience with coercion was my most useful skill in that environment, but in an industry focused on technical flaws, it stayed on the shelf.
The big lesson I learned is pretty simple: Coercion is like universal malware. It doesn’t care what country you’re from or what you believe in. It runs on basic human needs—to belong, to be safe, to do well. I made the mistake of thinking America had a version of humanity that was patched against it. Really, the exploit code is just written in a different dialect here, one that sounds like “corporate culture” or “the legal process” or “how things are done.”
I came to the U.S. to study cyber threats. What I ended up studying was a much older kind of attack—and my real training for it didn’t come from my job here, but from growing up somewhere I’d already seen the playbook in use.