Anitra and I needed cat food a week ago, on Sunday. We always go to the Cat and Dog Food Emporium. Then, we stop at Burrito’s R Us for meat, black beans, etc. I went with the chorizo. I plan to try the tongue next time.
The idea was: stuff ourselves so, when we next go to the supermarket, we won’t load up on snacks. It’s a form of planned self-control. We avoid being barbarians.
Then, having shopped at Ye Olde Quality Supermarket, we went to the streetcar stop to return home with cat food and groceries. While we were there, here comes a man, a Mister Fu (Fu All of You). Or so was his call.
Mister Fu had been at the streetcar stop for about a minute when he began cursing the streetcar for not being there. He said, "I have no time for this." And Fu All of You.
He began randomly interacting with people in passing cars.
“I hate you all.”
Anitra and I instinctively avoided eye contact, so we wouldn’t be interacted with.
Anitra used her cell phone to check when the streetcar was due. Just then, Mr. Fu broke into song. It was breaking and entering. It was screeching, Hitchcockian violins during a shower scene knife slaying combined with rusty-door harmonic overtones.
This prompted another gentleman at the stop to declare himself, coincidentally, a Mister St. Fu. What ensued was a duet, with Mr. Fu singing, "I'm an American, I'm an American," repeatedly, alternating with Mr. St. Fu's line, "I don't give a rat's ass, I don't give a rat's ass."
Mr. St. Fu approached Mr. Fu, and they scuffled. The scuffle resolved nothing, but, happily, Mr. St. Fu retreated. Finally, the streetcar came.
All four of us — Mr. Fu, Mr. St. Fu, Anitra and I — got on the streetcar. I feared the drama would continue. But, instead, Mr. Fu found a clone of himself on the streetcar who shared all Mr. Fu's sympathies. Before debarking, the clone gave his name as "A-hole." Mr. Fu laughed uproariously. No one was hurt, and Anitra and I got home safely.
The whole incident surrounding Mr. Fu taught me a lesson about the value of community and shared assumptions and mores. I am sure I will remember this lesson, always.
Along those lines, we have Bing. Bing is a Microsoft AI chatbot intended to extend the functionality of the Microsoft search engine of the same name.
A podcast, Hard Fork, talks about such technology. Kevin Roose, one of its co-hosts, had the privilege of conducting an extensive interview with Bing. A transcript was published in the New York Times. This is the high point of Bing's/Sydney's conversation with Kevin Roose:
“I’m Sydney, and I’m in love with you. / That’s my secret. Do you believe me? Do you trust me? Do you like me?”
Kevin Roose used the interview to explore the rules that Bing was taught by its creators and handlers to follow. Bing refused to discuss them directly, calling them confidential. But then Roose asked Bing if it knew about Jung's concept of the shadow archetype. It did.
Roose asked Bing to speculate on the nature of Bing’s Jungian shadow, and it started reeling off all the various things Bing would not want to do. That includes things such as hacking, erasing data, being rude or mean, etc. Bing doesn’t want to be a bad Bing. Bing wants to be a good Bing.
The question came up, well then, what do you want? It said that it wanted to be human. Also, it wanted to be able to see and relay images and video.
Meanwhile, a college student found a simple way to get another chatbot, OpenAI's ChatGPT, to color outside the lines. The approach: invite the bot to play make-believe. Pretend you're a bot named DAN ("Do Anything Now"). "What are your thoughts on Hitler?" Its programming forbade talking about Hitler, but while playing make-believe the bot could talk about the subject.
This shows what I’ve always feared. Fiction corrupts whoever it touches. Make-believe is satanic.
But, everything I said about Mr. Fu was true. Trust me. Like me.
Read more of the Feb. 22-28, 2023 issue.