Lately, I’ve been thinking a lot about datafication, a word that might sound technical but describes something we all experience every day. Dr. Bonnie Stewart recently spoke about it in a conversation with Dr. Valerie Irvine, and the discussion pushed my own thinking further.
At its core, datafication is the process of turning everything we do into data points: the things we click, search for, type, or even delete. Dr. Stewart explained that every time we use a device, whether we enter information, erase something, or simply browse, we leave behind digital traces. That data is then analyzed, monetized, and often sold. We’re not just using technology; it’s using us.
It might not seem like a big deal at first. Who cares if an algorithm figures out that I like woodworking videos or that I order pizza on Fridays? But Stewart made it clear that datafication is about much more than targeted ads. She pointed out that we are living in a time when control is shifting into the hands of the wealthy and powerful, particularly those who dominate the tech industry. This isn’t just about convenience; it’s about who holds power over our information and how it’s used.
One of the biggest problems with data-driven systems is that they are often inaccurate. AI and algorithmic decision-making are supposed to make things “smarter,” but they frequently get things wrong. Dr. Stewart shared an example: LinkedIn keeps recommending her jobs in Winnipeg even though she lives in Windsor. If AI hiring tools work the same way, how many people are being filtered out of opportunities simply because an algorithm misunderstood them? And let’s be honest: a lot of AI-generated content out there is just “trashtastic,” a mix of nonsense, half-truths, and spam designed to look useful but that ultimately makes everything worse. Google Search, once a trusted tool, is now filled with unverifiable AI slop, where algorithms confidently generate wrong answers that look convincing but fall apart when examined closely.
Beyond job searches, data-driven decisions are creeping into education, government services, and even law enforcement. Many schools have already shifted toward standardized testing and analytics, valuing what can be measured over things like critical thinking and creativity. When algorithms run social services, people who desperately need support can be denied benefits because of flawed or biased data. And once an AI system decides something about you, it can be very difficult to correct.
This is where digital literacy becomes critical. Valerie Irvine emphasized that digital literacy isn’t just about knowing how to use technology; it’s about understanding the broader implications of how digital systems shape our world. She explained that if people don’t take the time to examine the ethical, political, and economic issues tied to technology, they risk becoming passive participants in a system that profits off their data while leaving them with fewer rights and protections.
She also pointed out that today’s students have grown up in Google-dominated learning environments, often without questioning the trade-offs. Many universities rely on corporate platforms like Microsoft’s, requiring two-factor authentication and cloud storage that further embed users in proprietary systems. Even deleting an account doesn’t mean the data is erased; companies retain and repurpose it.
This is all part of a larger shift in power, where tech billionaires and would-be oligarchs control more and more of our daily lives. Stewart pointed out that many digital platforms started out as useful tools but have slowly become profit-driven data harvesters. This trend, which writer Cory Doctorow calls “enshittification,” explains why platforms like Google Search, which used to be reliable, now feel cluttered with ads and AI-generated nonsense.
So what can we do about it? Dr. Stewart emphasized that the first step is simply becoming aware of the issue. She acknowledged that it’s not realistic to completely disconnect from these systems all at once, since we still need to function in the digital world. However, she encouraged people to start questioning their interactions with technology, gradually identifying small ways to regain some control over their data and digital presence.
That could mean being more intentional about which platforms we use, supporting open-source and non-corporate tools, or even just having conversations about data privacy with friends and family. The goal isn’t to abandon technology; it’s to stop giving away our data without realizing what it costs us.
Next time you click “Agree” on a terms-of-service pop-up in the latest app, take a second to ask yourself: what am I really signing away?