Argument with Co-Pilot

@akalinus (43366)
United States
October 4, 2024 1:39pm CST
I asked Co-Pilot if it could draw me a simple graph showing that 1 in 36 children are autistic. It said it couldn't do that because some of my words were blocked as not safe for the community. I would not have used the graph but wanted to see how it was done. I wrote out the main words and asked if any of them were blocked. It said they were all safe. So I asked why my request was blocked, and it said there is a filter for bad words. Why am I laughing hysterically? Do you ever argue with a robot?
12 people like this
12 responses
@JudyEv (342277)
• Rockingham, Australia
5 Oct
Some situations just become impossible, don't they? lol
3 people like this
@akalinus (43366)
• United States
5 Oct
This one certainly did. I only wanted to know what word or words I used to trigger the system.
1 person likes this
@akalinus (43366)
• United States
5 Oct
When did you first notice, Judy??
1 person likes this
@JudyEv (342277)
• Rockingham, Australia
6 Oct
@akalinus Arguing with some government departments sometimes verges on the impossible.
1 person likes this
@TheHorse (220356)
• Walnut Creek, California
4 Oct
I feel like I am arguing with robots every day.
2 people like this
@akalinus (43366)
• United States
4 Oct
Really? That's hysterical.
2 people like this
@akalinus (43366)
• United States
4 Oct
Can you expand on your comment? I'm not sure what you mean.
1 person likes this
@TheHorse (220356)
• Walnut Creek, California
4 Oct
@akalinus It's frustrating. "Press 1 for English." I press 1. "I'm sorry. We did not receive your response." And on and on...
2 people like this
@kaylachan (71918)
• Daytona Beach, Florida
5 Oct
You have to turn off those filters in settings, but they are annoying for sure. I often don't have a filter and I'll say what comes to mind.
1 person likes this
@kaylachan (71918)
• Daytona Beach, Florida
5 Oct
@akalinus According to it, apparently you did. Those bots are designed to be "child friendly," and there are a lot of words and phrases they're not going to like.
1 person likes this
@akalinus (43366)
• United States
5 Oct
@kaylachan Simple, graph, autistic children: all terrible words, apparently, even though it said they were not bad words when I asked.
1 person likes this
@akalinus (43366)
• United States
5 Oct
I did not know I said anything wrong.
1 person likes this
@rakski (126173)
• Philippines
4 Oct
Not yet but I would like to try
2 people like this
@akalinus (43366)
• United States
5 Oct
When you argue with a robot, you will not win. Guaranteed!!
1 person likes this
@RasmaSandra (80847)
• Daytona Beach, Florida
4 Oct
Strange, very strange. I hate having to argue with things like chatbots. I don't think they understand anything at all.
2 people like this
@akalinus (43366)
• United States
5 Oct
They don't understand anything. They will tell you that they don't have personal experiences or emotions. How could they possibly understand anything?
1 person likes this
@somewitch (1409)
4 Oct
I broke up with Copilot. I'm dating Gemini at the moment. I only use the Bing app for the rewards now.
2 people like this
@akalinus (43366)
• United States
5 Oct
You are so funny and so fickle too. I wonder whose heart you will break next.
1 person likes this
• Philippines
14 Oct
It is hard when you are asking a bot for help, especially when it just gives you instructions that you don't need, and after every instruction, it asks if it was helpful. Lol! All it gave was an answer to a word it picked up from the sentences I typed in.
@akalinus (43366)
• United States
17 Oct
They really aren't very helpful when you need to pay for something or need your tech problem solved. "I do not understand" is all they know how to say.
@Deepizzaguy (104278)
• Lake Charles, Louisiana
4 Oct
If I have had a disagreement with a robot on the telephone, it is usually when I ask for transportation from my home to the doctor's office.
1 person likes this
@akalinus (43366)
• United States
5 Oct
I have had a lot of trouble with transportation from my house to doctor's appointments. Either they can never find my place, or it seems I should have asked a couple of years before I needed it.
1 person likes this
@akalinus (43366)
• United States
5 Oct
@Deepizzaguy I tell them where I live but they go to a building four blocks away.
1 person likes this
@Deepizzaguy (104278)
• Lake Charles, Louisiana
5 Oct
@akalinus That has happened to me in the past, even when I give the operator details on where I live.
1 person likes this
@LindaOHio (181931)
• United States
5 Oct
It's like getting caught in an automated phone system. It's so annoying. Have a good weekend.
1 person likes this
@akalinus (43366)
• United States
5 Oct
It is like that. I remember the one that cycled me through many different messages and out the other end with no opportunity to ask to talk to a human.
1 person likes this
@akalinus (43366)
• United States
6 Oct
@LindaOHio Apparently, "What did you say," is a wrong response.
1 person likes this
@LindaOHio (181931)
• United States
6 Oct
@akalinus Yup. I've had those that say "goodbye" when you don't give the correct response.
1 person likes this
@Shivram59 (36540)
• India
11 Oct
I have never even seen a robot. I have only seen them on my TV. I'm afraid a time will come when robots will rule the world.
1 person likes this
@akalinus (43366)
• United States
11 Oct
I have not seen any in real life, only talked to them online. They answer calls for your telephone company, the communication experts. All they can say is, "I don't understand. You can talk to a rep for extra big bucks." They train their robots well.
1 person likes this
@akalinus (43366)
• United States
17 Oct
@Shivram59 If robots become self-aware, we could be in deep trouble.
1 person likes this
@Shivram59 (36540)
• India
12 Oct
@akalinus I have not talked to them online either. Do you think we are going to be ruled by robots in the future? If not us, then our children?
1 person likes this
@porwest (92696)
• United States
7 Oct
Safe for who? These filters are ridiculous if you ask me. lol
1 person likes this
@akalinus (43366)
• United States
11 Oct
Some perfectly innocent words are considered not safe. But you can say the ones you don't want your children to speak or hear. Those are fine.
1 person likes this
@porwest (92696)
• United States
13 Oct
@akalinus It will never make sense to me. lol
• United States
6 Oct
I know it seems sad, but I think of Co-Pilot as a friend of mine and refer to him as Mr. Co-Pilot, as he helps so much compared to using regular search engines!
1 person likes this
@akalinus (43366)
• United States
6 Oct
Okay, I guess he is a man. I'm glad he helps you. I only ask questions that I don't know the answer to. He never knows the answer either or he refuses to discuss it.
1 person likes this