For roughly four decades, since the 1980s, pointing and clicking has been the primary method of using a personal computer.
Of course, the traditional computer mouse isn’t going away. But what if it could be augmented more often by simply typing what you want, rather than hunting through your computer for a setting or some other solution to whatever you’re trying to do?
That’s one promise of AI. Rather than remembering the keyboard shortcut to take a screenshot, or the setting to shift the screen into dark mode, for example, you could just tell the computer what you want it to do in natural language.
That, at least, is what Microsoft is promising with the development of Windows Copilot. It’s an adaptation of the company’s OpenAI-powered Bing search chatbot, integrated directly into the operating system and appearing as a persistent sidebar when users activate it via a new taskbar button.
This week, the company started to give users in the Dev channel of the Windows Insider preview program an early look at Windows Copilot. It’s rolling out slowly, with limited features, but it offers a glimpse of where the company is headed. The company hasn’t yet provided a timeline for a broader rollout to Windows users.
In addition to letting users interact with Windows in a new way, Windows Copilot will integrate with third-party apps via plugins.
So what does all this mean for the future of Windows and computing? On this episode of the GeekWire Podcast, we’re featuring a conversation with Aaron Woodman, the Windows vice president of marketing, recorded shortly after Windows Copilot was unveiled a few weeks ago.
Listen above, or subscribe to GeekWire in Apple Podcasts, Google Podcasts, Spotify or wherever you listen.