Oh no, the problem is already happening, and the bad parts are more dystopian than you probably want to hear about lol
From the behaviorism side, yes, it’s incredibly easy to manipulate people via tech, and as you say, it’s not always done on purpose. But taken as a whole it’s frequently insomnia-inducing.
Your point about knowing your weakness and preparing is spot on!
For the UX side of this, look up Harry Brignull and Dark Patterns. (His work has been solid for 10+ years; to my knowledge he was the first to call out some real BS that went un-called-out for most of the 2010s.)
The Juul lawsuit is another good one if you’re interested in advertising ethics.
Look up “A/B testing media headlines outrage addiction”. (There’s a rough sketch of how that mechanic works at the end of this list of pointers.)
If you want to send your brain permanently to a new dimension, look up the IRA (Internet Research Agency) propaganda advertising dataset.
For disinformation: “Calling Bullshit”. There’s a full course and materials online from two professors (Carl Bergstrom and Jevin West at the University of Washington) who just popped off one day.
Want to read about the historical perils of metric optimization and have a huge moral/existential crisis? Read about Robert McNamara.
For actual solutions on a nonacademic, consumer level (!!) -- the Data Detox Kit and Tactical Tech, the nonprofit that runs that page. So excellent.
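Since I keep pointing at the headline A/B testing thing, here’s roughly what that loop looks like, as a toy Python sketch. To be clear, the headlines, names, and click rates below are all made up by me for illustration, not pulled from any real platform or dataset: show each variant to a slice of readers, measure click-through rate, ship the winner, repeat forever.

```python
import random

# Toy sketch of headline A/B testing. Every headline and click rate here is
# invented for illustration; real systems run this loop continuously at scale.
HEADLINES = {
    "neutral": "City council votes on new zoning rules",
    "outrage": "Council's zoning vote will ruin YOUR street and nobody is stopping it",
}

# Hypothetical "true" click probabilities, used only to drive the simulation.
TRUE_CTR = {"neutral": 0.02, "outrage": 0.05}


def run_test(impressions_per_variant: int = 10_000) -> str:
    """Show each variant to N simulated readers, return the one with the higher CTR."""
    observed = {}
    for name in HEADLINES:
        clicks = sum(random.random() < TRUE_CTR[name] for _ in range(impressions_per_variant))
        observed[name] = clicks / impressions_per_variant
        print(f"{name:8s} observed CTR: {observed[name]:.2%}")
    # The variant that got more clicks gets shipped to everyone.
    return max(observed, key=observed.get)


if __name__ == "__main__":
    winner = run_test()
    print(f"Shipping to everyone: {HEADLINES[winner]!r}")
```

The loop itself is neutral; the problem is that it rewards whatever gets the click, and outrage gets the click.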
The problem isn’t so much the manipulation. Isn’t that what all marketing has been, forever, a mix of creativity and manipulation of attention and desire?
A long time ago someone realized we respond to color: we eat more when we see red, we feel calmer when we see blue. Were they manipulative? Yes. Is it industry knowledge now? Yes. Did they maybe just feel like making it blue for no reason at first, and now everyone does it because it works? Also yes.
That’s the nature of it.
But now the SPEED at which manipulative techniques can be researched, fine-tuned, learned, deployed, and scaled up is unheard of.
There’s no time for people, or psychology, to keep up. I think it’s a public health risk, and a risk to our democracy, that we aren’t receiving more public education on how to handle it.
When subliminal advertising had its run back in the 1950s, the US at least cracked down on it for being shady asf.
Since then, we haven’t really done a lot else. New manipulation techniques develop too fast and too frequently now. And they’re often black boxes.
Now the solutions for the problems that tech causes are usually folk knowledge, disseminated long before education, psychology, or policy catches up. We should be bloody faster.
Instead the younger generation grows up with the stuff and absorbs it. Gets it all mixed up in their identity. And has to reverse engineer themselves for years to get it back out.
Didn’t we all do that? Sure, at a slower pace.
What about Gen Alpha?
Are they ever going to get to rest?
Will they ever be able to separate themselves from the algorithms that raised them?
Great questions to ask!
Frankly, Gen Z is already smarter and faster at navigating this new world than we are. That’s scary, because it means we’re helpless to help them a lot of the time.
Some of it we can’t even conduct relevant research on, because the ethics review board thinks the treatment is too unethical. *See: porn addiction studies.
Knowledge is power. But power is knowledge.
And it’s tightly guarded. Watch how people high up in tech regulate technology use with their children.
The general resistance to addressing the core of the issue, and the features that keep the car driving in this direction... that’s valuable information in itself.
How do we balance this with the economy as a whole, and the fact that the machine seems to eat the weak to keep spinning... I don’t know! Someone else please figure out that answer, thank you.
But one of the most helpful things I think we can do is provide education. Behaviorism and emotion are powerful, and you can use them on yourself, too. You are your own Pavlov and your own dog.
Sometimes other people will be Pavlov. It’s best if you’re consciously aware of it when that happens and you’re ok with it.
The other thing is preserving the right to live low tech. (I hope unions are up on this already.)
Biometric tracking is nice and helpful sometimes. And sometimes, it’s not.
As always, if you can’t outrun them, confuse them.
If something in this comment is incorrect, please correct me. I was freeballing it.