EDCI568-7. A Conversation with Bonnie Stewart — The Hidden Cost of Data: What We're Giving Away Without Even Noticing

Lately, I've been thinking a lot about datafication, a word that might sound technical but actually describes something we all experience every day. Dr. Bonnie Stewart recently spoke about it in a conversation with Dr. Valerie Irvine, and it really pushed my thinking further.

At its core, datafication is the process of turning everything we do into data points: the things we click, search for, type, or even delete. Dr. Stewart explained that every time we use a device, whether we enter information, erase something, or simply browse, we leave behind digital traces. This data is then analyzed, monetized, and often sold. We're not just using technology; it's using us.

It might not seem like a big deal at first. Who cares if an algorithm figures out that I like woodworking videos or that I order pizza on Fridays? But Stewart made it clear that datafication is about much more than targeted ads. She pointed out that we are living in a time when control is shifting into the hands of the wealthy and powerful, particularly those who dominate the tech industry. This isn't just about convenience; it's about who holds power over our information and how it's used.

One of the biggest problems with data-driven systems is that they are often inaccurate. AI and algorithmic decision-making are supposed to make things "smarter," but they frequently get things wrong. Dr. Stewart shared an example of how LinkedIn keeps recommending jobs in Winnipeg even though she lives in Windsor. If AI hiring tools work the same way, how many people are being filtered out of opportunities simply because an algorithm misunderstood them? And let's be honest: a lot of AI-generated content out there is just "trashtastic," a mix of nonsense, half-truths, and spam designed to look useful but that ultimately makes everything worse. Google Search, once a trusted tool, is now filled with unverifiable AI slop, where algorithms confidently generate wrong answers that look convincing but fall apart when examined closely.

Beyond job searches, data-driven decisions are creeping into education, government services, and even law enforcement. Many schools have already shifted toward standardized testing and analytics, valuing what can be measured over things like critical thinking and creativity. When algorithms run social services, people who desperately need support can be denied benefits because of flawed or biased data. And once an AI system decides something about you, it can be very difficult to correct.

This is where digital literacy becomes critical. Valerie Irvine emphasized that digital literacy isn't just about knowing how to use technology; it's about understanding the broader implications of how digital systems shape our world. She explained that if people don't take the time to examine the ethical, political, and economic issues tied to technology, they risk becoming passive participants in a system that profits off their data while leaving them with fewer rights and protections.

She also pointed out how today's students have grown up in Google-dominated learning environments, often without questioning the trade-offs. Many universities rely on corporate platforms like Microsoft, requiring two-factor authentication and cloud storage that further embed users into proprietary systems. Even deleting an account doesn't mean the data is erased; companies retain and repurpose it.

This is all part of a larger shift in power, where tech billionaires and wannabe oligarchs control more and more of our daily lives. Stewart pointed out that many digital platforms started out as useful tools but have slowly become profit-driven data harvesters. This trend, what writer Cory Doctorow calls "enshittification," explains why platforms like Google Search, which used to be reliable, now feel cluttered with ads and AI-generated nonsense.

So what can we do about it? Dr. Stewart emphasized that the first step is simply becoming aware of the issue. She acknowledged that it's not realistic to disconnect from these systems all at once, since we still need to function in the digital world. However, she encouraged people to start questioning their interactions with technology, gradually identifying small ways to regain some control over their data and digital presence.

That could mean being more intentional about which platforms we use, supporting open-source and non-corporate tools, or even just having conversations about data privacy with friends and family. The goal isn't to abandon technology; it's to stop giving away our data without realizing what it costs us.

Next time you click "Agree" on a terms-of-service pop-up in the latest app, take a second to ask yourself: what am I really signing away?