Emotion recognition systems are finding growing use, from monitoring customer responses to ads to scanning for ‘distressed’ women in danger.
It’s about time we start holding the engineers building these technologies directly responsible.
I’m not talking about scientists expanding knowledge, I’m talking specifically about the engineers building these technologies.
Is mood recognition a tool useful for anything other than maintaining power over others (actually curious)?
Seriously. Why are people choosing to work for these companies? There are other ways to make a buck. Have some fucking morals.
The threat of homelessness and starvation is quite coercive. That’s why people still work at these kinds of jobs.
At a certain point, not just the companies doing this are to blame but the people working for them as well. Who tf can support this kind of thing? People need to have some self fucking respect.
For example, we could probably have a cure for cancer by now if half the effort that went into making unbeatable THC drug tests had gone into cancer research. It’s clear where society’s priorities are. Improving lives does not generate profit.
There are uses for it. They can track the average mood of an entire room over a period of time. If you use that somewhere like a restaurant, or a banquet venue, then that information can be useful for tweaking the policies, environment, prices, etc. Of course an actual human could do this too, just by being there. I think it’ll get the most use at places like casinos where they’re always using psychological tricks to make people want to gamble. Ironically I don’t think that “happy” is the mood they’ll be aiming for.
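To put that in concrete terms, here’s a minimal sketch (Python, with made-up function and field names, not any real vendor’s API) of what averaging per-person emotion scores into a room-level mood over a time window might look like:

```python
from collections import defaultdict
from statistics import mean

def room_mood_over_time(frames):
    """Average each emotion score across every person detected in every frame.

    `frames` is a list of frames; each frame is a list of
    (person_id, {emotion: score}) detections from a hypothetical camera feed.
    """
    totals = defaultdict(list)
    for detections in frames:
        for _person_id, scores in detections:
            for emotion, score in scores.items():
                totals[emotion].append(score)
    return {emotion: mean(values) for emotion, values in totals.items()}

# Example: two frames from an imaginary dining room over one evening.
frames = [
    [("a1", {"happy": 0.8, "angry": 0.1}), ("b2", {"happy": 0.4, "angry": 0.3})],
    [("a1", {"happy": 0.6, "angry": 0.2})],
]
print(room_mood_over_time(frames))  # roughly {'happy': 0.6, 'angry': 0.2}
```

A venue or casino could compare that number before and after a pricing or layout change, which is exactly the kind of tweaking described above.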
Ya, I guess I can see some uses for it, but nothing that makes the risks of its existence worth it.
It seems like every tool/tech will be used by good people to do good things and bad people to do bad things. Some things like a spoon are handy for getting good things done but not very useful to bad people to do bad things with. Other tools like mood recognition might be quite handy for bad people looking to control others, but only moderately useful to good people.
I think we should be wary of letting tools in that second group exist. Just because something can be done doesn’t mean it should be done, or that it can be called “progress”.
It has already existed for a decade or so. I’m surprised it hasn’t made headlines before. I saw a working demo of it at the Microsoft Visitor Center about 8 years ago. In addition to estimating your mood, it also assigns you a persistent ID, estimates your height, weight, eye color, hair color, ethnicity, and age. It is scarily accurate at all of those things. That ID can be shared across all linked systems at any number of locations. I completely agree with you that there are a lot of concerning, if not downright terrifying implications of this system. It’s a privacy nightmare.
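To give a sense of what that implies, here’s a rough sketch (Python, with hypothetical field names, not the actual schema of the Microsoft demo) of the kind of persistent profile such a system could share across linked locations:

```python
from dataclasses import dataclass, field

@dataclass
class VisitorProfile:
    """Illustrative record only; real systems will differ."""
    persistent_id: str    # re-identification token shared across linked sites
    est_height_cm: float
    est_weight_kg: float
    eye_color: str
    hair_color: str
    est_age: int
    est_ethnicity: str
    mood_history: list = field(default_factory=list)  # (timestamp, location, mood)

profile = VisitorProfile("visitor-3f9c", 178.0, 74.0, "brown", "black", 34, "unknown")
profile.mood_history.append(("2023-05-01T12:03Z", "lobby-cam-2", "neutral"))
```

Once that ID is shared between locations, every field becomes queryable history rather than a one-off observation, which is what makes it a privacy nightmare instead of a parlor trick.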
You can to an extent, but that’s a losing venture. If public opinion goes against this tech hard enough, it’ll keep some people from working in those industries. BUT if those products are profitable enough, they will simply pay more and that’ll be moot.
Attacking the people who are earning a living isn’t the answer. Most people take the job with the best combo of pay and work/life balance they can find in their area (or wherever they can afford to move to). Not that many have the luxury to pick and choose based on their morality, and if the compensation is high enough, it’s even less likely they’ll turn the job down on principle.
It’s far easier to try to prevent this tech from being used at all. I know political action is hard as hell but it’s a lot easier than trying to ostracize an entire industry’s worth of workers. It may feel easier to denigrate faceless individuals but that won’t accomplish anything. Plenty of people work for weapons manufacturers and such.
And those are bad people. If you work to build technology used to maintain power when you have an option not to, what else can that be called? These people are not desperate for a job.
I’m an engineer. I quit (after the startup I worked for was acquired) because Intel powers much of the military-industrial complex. I later quit another position when it became clear I was directly assisting with state-level genetic experiments. As an engineer I could easily get a job elsewhere where I was not directly contributing to the downfall of my fellow humans.
Take McDonald’s, for example. There’s a difference between someone who needs a job working in a restaurant and an engineer working for McDonald’s figuring out how to slaughter animals more efficiently, paid only to care about their employer’s profit. That engineer could go figure out how to bake cookies more efficiently instead.
You’re painting with a firehose. Some people are desperate for a job.
You are what we call privileged. Maybe you should… check it?
Yeah, I was a field service tech at a machine tool distributor for 15 years. One day about 7 years ago I realized that more of our customers than not were involved in some kind of arms manufacturing: everything from components, to military armaments, to shops making parts for AR-15s. It didn’t start that way, but the business drifted into that market over time.
I decided to move on, and it took me all of 5 years to find a position that: a) I was qualified for, b) paid enough that I wouldn’t lose my house, and c) was relatively safe from drifting into the same customer base as the last company.
I don’t even have kids and this whole process was absolutely terrifying. I can easily see how someone with a family to support or less stability in their life wouldn’t feel like leaving was a possibility.
If you ever want a real general AI, then it will need the ability to recognize the mood of the person it’s interacting with. ESPECIALLY if you want to use it for things like mental health counseling.
Good thing. I DON’T want a general AI.
Thanks, that’s the kind of valid answer I was looking for. Though we don’t have actual AI yet, and probably won’t have actual AGI for at least a good decade (what we currently have is machine learning and complex decision trees that appear kinda intelligent to us in 2023).