This week’s KJR Challenge: Read this Microsoft word salad: “Introducing Microsoft 365 Copilot – your copilot for work – The Official Microsoft Blog” and figure out what Microsoft 365 Copilot is. Or, failing that, figure out what it does.
The linked blog entry was attributed to Jared Spataro, Microsoft’s Corporate Vice President, Modern Work & Business Applications.
Which leads to your next KJR Challenge: What on earth does that job title mean?
Meaning no offense, Mr. Spataro, but the only reason I have any confidence that you’re a Live Human Being and not a ChatGPT avatar is that I can usually make heads or tails of a ChatGPT essay.
That, and the fact that your average ChatGPT essay doesn’t include so many questionable assertions. Examples:
“Humans are hard-wired to dream, to create, to innovate.”
No, we aren’t. To the extent we’re hard-wired to do anything it’s to increase our DNA’s representation in the future population’s gene pool. And even that hard-wired drive is buffered by a bunch of intermediate effects.
“With Copilot, you’re always in control. You decide what to keep, modify or discard. Now, you can be more creative in Word, more analytical in Excel, more expressive in PowerPoint, more productive in Outlook and more collaborative in Teams.”
No. With Copilot we won’t be more creative in Word. With Copilot we mere humans will stop being creators. Copilot will turn us into editors instead.
I have nothing against editors. But editing isn’t creative and isn’t supposed to be creative.
Oh, and by the way, I might not be feeling collaborative; sometimes I don’t feel collaborative for intensely valid reasons. If Copilot were to make me more collaborative in Teams I most definitely wouldn’t be in control.
“With our new copilot for work, we’re giving people more agency and making technology more accessible through the most universal interface — natural language.”
Microsoft apparently buys into Springer’s Law, named after my old friend Paul Springer, who asked, “Why use a picture when a thousand words will do?”
Oh, and by the way, people misunderstand what’s said to them all the time. Why would we expect Copilot to be better at interpreting natural language than we human beings, who have had tens of thousands of years of practice at it?
Just my opinion: Clicking on an icon is faster and more efficient than using sentences to explain what you’re trying to do.
“… every meeting is a productive meeting with Copilot in Teams. It can summarize key discussion points — including who said what and where people are aligned and where they disagree — and suggest action items, all in real time during a meeting.”
Okay, this is just silly. Or else, terrifying. Unless Copilot can barge in and mute everyone’s microphone to say, “You’ve made this point thirteen times already, Fred. Please stop so we can move on,” it won’t make meetings more productive.
Copilot “… creates a new knowledge model for every organization — harnessing the massive reservoir of data and insights that lies largely inaccessible and untapped today.”
The ever-helpful Bing implementation of ChatGPT explains that “A knowledge model is a computer-interpretable model of knowledge.” Yes, that’s right. A knowledge model is a model of knowledge. And that’s the best definition of “knowledge model” I could find.
One more: “Uplevel skills. Copilot makes you better at what you’re good at and lets you quickly master what you’ve yet to learn.”
Except that as it turns out, Copilot doesn’t “uplevel” [don’t blame me for this linguistic abomination] anyone’s skills. So far as I can tell it doesn’t show you how to do something. It does whatever-the-task-is for you.
But delegation is a skill, so I guess gaining the ability to delegate to Copilot constitutes “upleveling” your delegation skills.
But it’s a stretch.
Bob’s last word: Don’t get me wrong. A year ago I was impressed with Google’s semantic search capabilities. Now, more and more I’m complementing it with Bing’s generative AI research summarizations. Its abilities are impressive, and I expect Copilot and similar technologies will turn out to be highly consequential.
But as impressive as generative AI is, it also encourages me to be lazy.
For this I don’t need encouragement. And if we’re going to equate laziness and increased productivity … I think we’re going to need a new knowledge model to sell the idea.
Bob’s sales pitch: Every time I email a fresh column to the assembled KJR multitudes, my mailing service drops those subscribers whose emails bounce due to full mailboxes or other errors. The result is a slow but steady erosion of KJR’s subscriber base. The only way to replenish it is for subscribers like you to encourage non-subscribers like that guy three cubicles to the left of you to sign up.
How about it?
Now on CIO.com’s CIO Survival Guide: “Why IT surveys can’t be trusted for strategic decisions.” All surveys will tell you is whose company you’re keeping.
How dreadful! “Uplevel skills. Copilot makes you better at what you’re good at and lets you quickly master what you’ve yet to learn.”
‘No one can quickly master anything!’
Good point, except when we “downlevel” mastery.
I, for one, refuse to submit to our AI overlords. If some IT department rolled Co-Pilot out and demanded that everyone use it, I’d have it type “The quick brown fox jumps over the lazy dog” over and over while I got my real work done.
I needed a good laugh this evening. Thanks for demonstrating how this advertising gobbledegook is just so much B.S.!
It’s like: We’re from The Government/Microsoft and we’re here to help.
Sure you are. I think MS asked GPT-4 to create marketing BS for the extension of Office 365 with some AI features – and this is what they got.
To MS 365 Copilot: UGH! At one time, my organization’s Outlook 365 program insisted on “suggesting” words/phrases based on what I was keying. It was awful and almost never suggested anything even remotely like what I was trying to say. I think they finally gave me the means to turn it off (or it finally disappeared, frustrated because I never accepted any of its suggestions). I hope the “365 Copilot” feature can be turned off!
Oh God… in following your link to that word salad, I discovered that there is also now a Copilot feature in Microsoft Dynamics. Which is, among other things, AN ACCOUNTING PACKAGE.
It’s not enough that Microsoft Word is now going to hallucinate essays that look plausible but are in fact wrong. Now AN ACCOUNTING PACKAGE is going to hallucinate things that look plausible but are in fact wrong!
Copilot “… creates a new knowledge model for every organization”. But ChatGPT is NOT a knowledge model, it is a LANGUAGE model! It does not UNDERSTAND anything that it reads or says; it merely knows which words appear nearby to which other words.
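The commenter’s point — that a language model “merely knows which words appear nearby to which other words” — can be illustrated with a toy sketch. This is my own minimal bigram counter, not anything Microsoft or OpenAI actually ships; real models are vastly larger, but the principle of predicting from word co-occurrence rather than understanding is the same.

```python
from collections import Counter, defaultdict

def train_bigrams(corpus):
    """Count, for each word, which words follow it in the corpus."""
    counts = defaultdict(Counter)
    for sentence in corpus:
        words = sentence.lower().split()
        for a, b in zip(words, words[1:]):
            counts[a][b] += 1
    return counts

def most_likely_next(counts, word):
    """Return the most frequent follower of a word; no 'understanding' involved."""
    followers = counts.get(word)
    return followers.most_common(1)[0][0] if followers else None

corpus = [
    "the model predicts the next word",
    "the model knows nothing about the world",
]
counts = train_bigrams(corpus)
print(most_likely_next(counts, "the"))  # prints "model" — purely from counts
```

The toy model “predicts” that “model” follows “the” only because it saw that pairing most often — exactly the kind of statistical pattern-matching, rather than knowledge, the commenter describes.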
I sure hope that whatever A.I. is now built into the Copilot feature of Microsoft Dynamics is NOT based on ChatGPT. And here’s hoping that it can be turned off.
We all know this is just going to be some privacy-invading super evil Clippy who has access to your communications. “It looks like you’re having an affair. Would you like to use Outlook to schedule a meeting with a divorce attorney?”
Everyone who remembers Clippy raise your hands!