Microsoft faces scrutiny over AI spending as Copilot adoption lags

midian182

In brief: Microsoft will reveal its quarterly financial results on Wednesday, and it might not be a high point for the Redmond giant. Analysts expect Microsoft to report its slowest quarterly revenue growth in a year, and once again there are concerns about how much money the company is investing in AI when the demand and returns aren't justifying its outlay.

Microsoft has invested around $13 billion in ChatGPT-maker OpenAI since 2019, but we've been hearing reports since April that investors are concerned that reaping the financial rewards is taking longer than expected.

Reuters reports that Morgan Stanley analysts say there is a "wall of worry" around Microsoft's earnings due to "ramping capital expenditures, margin compression, lack of evidence on AI returns, and messiness post a financial resegmentation."

One of the biggest AI-related disappointments for Microsoft is Copilot. The company has been pushing its AI tools incredibly hard, going so far as to add a dedicated Copilot button to its latest laptops, but most people are apathetic toward the assistant, with the most common complaint reportedly being that it's not as good as ChatGPT.

In September, Salesforce CEO Marc Benioff said Copilot is basically the new Microsoft Clippy, and that customers had not gotten value from it.

A survey of 152 information technology companies carried out by research firm Gartner in August found that the majority of them had not progressed their Copilot initiatives past the pilot stage.

Microsoft could boost Copilot uptake through an enterprise tool it unveiled last week. Microsoft Copilot Studio lets clients build AI agents that can automate internal tasks, fulfilling the sorts of administrative roles usually performed by employees. That's obviously brought a lot of criticism about AI replacing human workers, though Microsoft claims it will automate tedious tasks, freeing employees to focus on other, more important things, like looking for a job before they're replaced, probably.

Microsoft's stock price is up 14% this year, but it has risen only around 1% since late July, underperforming the benchmark S&P 500. The Azure cloud-computing unit likely grew by 33% in the fiscal first quarter, in line with company expectations, but that would be slower than its growth in the fourth quarter.

Microsoft's total revenue is expected to have risen 14.1% to $64.51 billion, and the company says spending on AI technology will remain high.

It was reported last week that Microsoft CEO Satya Nadella's take-home pay increased by 63% compared to last year, despite his request that his compensation be reduced. While Nadella's salary was cut by 50%, other forms of his compensation increased significantly.

 
Copilot is a worthless security and privacy nightmare. MS is not going to see returns because it is not what people want.

I love that the market is running scared and demanding actual returns now. All these moonshot spending projects are objective wastes of capital.
 
So it has begun. We're moving from speculating on how revolutionary AI will be to asking: what's the profit?

I guess the enshittification will start: prices go up whilst free functionality becomes less and less, and more and more ads get added.

Seems a bit early for Microsoft to put the brakes on but they have a history of that (just look at Windows Phone for the best example).
 
The need for privacy is fundamental and has deep roots in human psychology (it provides autonomy, security, control, freedom, creativity), and its disruption creates a series of negative emotions such as anxiety, distraction, distrust, shame, etc.

When artificial intelligence runs locally in a restricted environment (like ChatGPT in a browser application), even if it sends data remotely, it does not violate privacy, because the rules are clear and the user knows that the specific data entered into that environment will leave it for the cloud.

However, when AI is integrated everywhere in the operating system, there is no user control, and we have a privacy violation. It's like hitting a nerve that causes pain and therefore avoidance.

Either AI models run 100% locally and users are 100% convinced that no data leaves their system, or the project will fail. There is no chance, not even one in a million, that operating system users will allow third parties to access their personal files in exchange for better organization. Even without further argument, I think that's obvious.
 
Copilot is just terrible... at least from a software developer point of view, which is all I've really used it for.

It's outright wrong about most things beyond very basic queries (which would be quicker to get an answer for with a quick web search).

But what's most infuriating is how often it tells you something is the answer to your query, you try it, tell it that it's wrong and it replies "Oh yes, sorry for the confusion, here's the correct answer". So it has got the correct data to give you accurate answers, but for some reason it just doesn't.
 
Copilot is just terrible... at least from a software developer point of view, which is all I've really used it for.

It's outright wrong about most things beyond very basic queries (which would be quicker to get an answer for with a quick web search).

But what's most infuriating is how often it tells you something is the answer to your query, you try it, tell it that it's wrong and it replies "Oh yes, sorry for the confusion, here's the correct answer". So it has got the correct data to give you accurate answers, but for some reason it just doesn't.
Sounds like every management team ever.
 
Maybe I missed it, but what is there to even adopt? Until an AI can help perform tasks that I would be doing myself, I'm not interested. There seems to be a lot of "cart before the horse" with this AI stuff.
 
"After months of people complaining about how much they don't want a metric S***Ton of privacy violating AI features, M$ is surprised at how low adoption is"

This is what happens when management and the marketing department tell the engineers what to do.
 
Copilot is just terrible... at least from a software developer point of view, which is all I've really used it for.

It's outright wrong about most things beyond very basic queries (which would be quicker to get an answer for with a quick web search).

But what's most infuriating is how often it tells you something is the answer to your query, you try it, tell it that it's wrong and it replies "Oh yes, sorry for the confusion, here's the correct answer". So it has got the correct data to give you accurate answers, but for some reason it just doesn't.
One thing I've found it useful for is when I'm not sure what I'm looking for.
I can't form the right question, so I end up asking bizarre questions trying to find what it is I'm after.
As for the rest, I too don't have a lot of uses for it.
 
Copilot is just terrible... at least from a software developer point of view, which is all I've really used it for.

It's outright wrong about most things beyond very basic queries (which would be quicker to get an answer for with a quick web search).

But what's most infuriating is how often it tells you something is the answer to your query, you try it, tell it that it's wrong and it replies "Oh yes, sorry for the confusion, here's the correct answer". So it has got the correct data to give you accurate answers, but for some reason it just doesn't.
It really doesn't have any correct answers. It calculates probabilities of word combinations (tokens) and spits one out. When corrected, it's often right the second time. That's just artificial stupidity.
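
As a rough illustration of that point, here's a minimal Python sketch of what "calculating probabilities of tokens and spitting one out" amounts to, assuming a generic softmax over made-up scores. The sample_next_token function and the example values are purely hypothetical, not anything taken from Copilot or OpenAI:

import math
import random

def sample_next_token(logits, temperature=1.0):
    """Pick the next token from a softmax over hypothetical model scores."""
    # Softmax: convert raw scores into a probability distribution.
    scaled = {tok: score / temperature for tok, score in logits.items()}
    max_score = max(scaled.values())
    exps = {tok: math.exp(s - max_score) for tok, s in scaled.items()}
    total = sum(exps.values())
    probs = {tok: e / total for tok, e in exps.items()}
    # There is no "look up the correct answer" step -- just a weighted draw.
    return random.choices(list(probs), weights=list(probs.values()), k=1)[0]

# Made-up scores for the word following "The capital of France is"
print(sample_next_token({"Paris": 6.0, "Lyon": 2.0, "London": 1.5}))

Most of the time the weighted draw lands on the likeliest token, but nothing in the process checks whether it's true, which is why a wrong answer followed by a correct one on the second try is entirely consistent with how these models work.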
 
Tried paid Copilot and ChatGPT. Copilot's responses were annoying, filled with emojis, embedded web links, and inaccuracies. As a developer, it was a nightmare to work with compared to ChatGPT. For a normal person, Copilot is probably fine, but I wouldn't pay for it if I wasn't using it heavily for dev work (ChatGPT either).
 