Spark 434

Why we shouldn't be afraid to take our tech apart. AI can be easily fooled and this could have serious implications. Sell your own data instead of giving it away to big tech. New stock photo collection features trans and non-binary models.

Taking things apart, selling our own data, tricking AI into doing the wrong thing, and a gender-inclusive stock photo collection.

Spark - taking things apart is our middle name. (Adam Killick)

Gender-inclusive stock photo library goes beyond clichés

Stock photography isn't something most of us think much about. But those seemingly innocuous images we see, the photos that accompany news articles or ads, have the power to shape our perceptions. Spark host Nora Young speaks with Zackary Drucker, the photographer behind The Gender Spectrum Collection, a new stock photo database featuring trans and non-binary models.

Selling our own data could cut out middlemen like Facebook et al.

In the last few decades, tech companies have made billions off the data we've given them for free. Now, companies like Datacoup allow users to sell their data directly. Spark host Nora Young speaks with Datacoup founder Matt Hogan. She also speaks with Bart Custers, an associate professor of law and digital technology at Leiden University in the Netherlands, who studies the value of our personal data to tech companies like Google and Facebook.

Taking tech apart

Canadian photographer Todd McLellan likes to disassemble tech objects to reveal their inner workings. He talks to Nora Young about his new book, Things Come Apart 2.0, which features 50 everyday items, from a BlackBerry to an Amazon Echo, with each component laid out with precision.

Attacking automated systems by 'tricking' AI

Say you're riding along in your autonomous car. You see a slightly defaced stop sign ahead, but to your car's image recognition system, it looks like a speed limit sign, and the car sails right through the intersection. Spark host Nora Young speaks with cybersecurity researcher Dawn Song, who has demonstrated that attackers can feed AI systems malicious images or other inputs, with potentially dangerous results.
