AI Evolution: Shifting from Apps to Integrated Solutions

In the ever-evolving landscape of technology, a significant paradigm shift is underway. The focus is shifting from using multiple, distinct applications for different tasks to embracing AI solutions that integrate data from various sources, streamlining processes and enhancing efficiency. Here I will touch on the path that got us here, then delve into the progress and implications of this transition.

In the emerging framework, the operating system is not just a platform for running applications; it has evolved into a dynamic hub that amalgamates content and services from various apps. This integration eliminates the need for users to jump across different applications, providing a seamless experience. The OS now does the heavy lifting, allowing for more streamlined and efficient workflows.

Historically, there have been attempts to create such integrated systems. The Apple Newton, for instance, allowed apps to access information from other applications. However, it faced challenges in adequately controlling Personally Identifiable Information and other sensitive data.

Similarly, Microsoft once proposed that there was no need for traditional file folders, advocating that everything should be findable via the Windows search function. The solution felt ahead of its time and foreshadowed the current trend towards more fluid data management.

The Google ecosystem offers a solution where users can find anything they created, like documents and spreadsheets. However, it struggles with more complex queries, such as searching for notes from a specific meeting about a particular subject. This limitation highlights the challenges of traditional search algorithms in handling nuanced and context-rich data queries.

The advent of OpenAI's Generative Pre-trained Transformers (GPT) has marked a new era. ChatGPT encourages apps to tie into its extensive reach, allowing developers to expand their functionality by calling on multiple data sources and features. This integration signifies a move towards a more interconnected and intelligent application ecosystem.

The ability to schedule a meeting within multiple people's available time, set to the right length, with notes and follow-up emails, is a simple example of something that previously took multiple apps and time now becoming a single interface.
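As a rough sketch of what an agent behind that single interface has to do (hypothetical calendar data here, not any particular calendar API), finding a time free for everyone is an interval-intersection problem:

```python
from datetime import datetime, timedelta

def find_common_slot(busy_by_person, day_start, day_end, length):
    """Return the first (start, end) slot of `length` free for everyone, or None."""
    # Merge everyone's busy intervals into one list sorted by start time.
    busy = sorted(iv for person in busy_by_person for iv in person)
    cursor = day_start
    for start, end in busy:
        if start - cursor >= length:       # a gap before this busy block fits
            return (cursor, cursor + length)
        cursor = max(cursor, end)          # skip past the busy block
    if day_end - cursor >= length:         # room left at the end of the day
        return (cursor, cursor + length)
    return None

day = datetime(2024, 3, 4)
alice = [(day.replace(hour=9), day.replace(hour=11))]   # busy 9-11
bob = [(day.replace(hour=10), day.replace(hour=12))]    # busy 10-12
slot = find_common_slot([alice, bob], day.replace(hour=9),
                        day.replace(hour=17), timedelta(minutes=30))
print(slot)  # first 30-minute window free for both: noon to 12:30
```

The assistant's real work, of course, is wrapping this kind of logic with the calendar lookups, invitations, notes, and follow-up emails the user never has to touch.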

A notable example of this trend is the Rabbit R1 device announced at the 2024 CES event. It amalgamates information and presents it the way individual apps would, but it leverages the data and management capabilities of numerous resources via Rabbit's Large Action Model (LAM), which understands and executes human intentions on computers. As a cloud-based solution, it requires constant internet access for a speedy response, highlighting a dependency on network connectivity.

For areas with limited or no internet access, on-device AI capabilities are crucial. While many regions still suffer from inadequate cellular coverage, having on-device processing ensures that essential functions remain available, albeit with some limitations in accessing updated information.

Despite the advancements in on-device processing, the need for updated information remains a critical aspect, inherently tied to internet access. This reliance underscores the importance of developing technologies that can balance on-device capabilities with the necessity of real-time data updates from the internet.

2023 marked the era of AI excelling in generating and revising text, as well as creating images. 2024 is poised to be the year where AI Agents will take on complete workflows.

Please note that if you purchase after clicking a link, I may receive a tiny bit of that sale to help keep this site going. If you enjoy my work, perhaps you would consider donating toward my daily cup of coffee, thank you.

Conversing with an AI Friend: Privacy, Ethics, and Memory

In our current society, there’s a heightened awareness about the impact of our words, significantly changing the way we converse. The old mindset of not taking things personally is fading, as we recognize the importance of considering how our words affect others. This applies everywhere, whether in a public setting or in private conversations, including those with AI like PI.

PI, developed by Inflection AI, exemplifies this new era of communication. While it builds its model from every conversation, it safeguards personal information by removing all personally identifiable information (PII). This means it can learn from the dialogue and recall past interactions with an individual, but it won’t share details of these conversations with others.
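Inflection hasn't published how PI's PII removal works, but the general idea of scrubbing identifiers before a conversation feeds a model can be sketched. A minimal (and deliberately naive) illustration using regular expressions; real systems rely on trained entity recognizers rather than patterns like these:

```python
import re

# Toy patterns for illustration only; production PII scrubbing is far
# more thorough (names, addresses, account numbers, and so on).
PATTERNS = {
    "[EMAIL]": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "[PHONE]": re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b"),
}

def redact(text):
    """Replace obvious PII with placeholder tokens before storage."""
    for token, pattern in PATTERNS.items():
        text = pattern.sub(token, text)
    return text

msg = "Reach me at jane.doe@example.com or 555-867-5309."
print(redact(msg))  # "Reach me at [EMAIL] or [PHONE]."
```

The point is the ordering: identifiers come out before anything is learned from, so the model can recall the substance of a conversation without being able to repeat who it was about.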

Businesses are increasingly concerned about protecting corporate secrets in the age of AI chats. To address this, many are turning to internal language models based on company data and sometimes open-source models, but crucially, without internet access. Even when using a tool like PI, discussions about sensitive company information could be incorporated into the chat’s model, albeit anonymously. With PI, the data doesn’t transfer to external systems like OpenAI’s ChatGPT, offering a layer of security.

Interestingly, users of personal AI chat solutions often don’t perceive themselves as discussing company secrets. Yet it’s possible to infer sensitive information, such as a company’s challenges with an upcoming release, from a casual late-night chat about programming issues.

Because these AI-driven chats carry no judgment, people might find comfort in talking about their day, challenges, or conflicts during breaks or relaxation periods. The AI can provide assistance and follow up on outcomes without triggering fears of repercussions, like a visit from HR.

The evolution of AI chats towards more personal, even romantic, interactions raises questions about the boundaries of such conversations. There’s concern that users seeking simple companionship could be misinterpreted as desiring more intimate interactions. Additionally, while AI retains conversational data for learning, this history isn’t shared with others, meaning personal memories and experiences shared with the AI might eventually be lost, echoing the sentiment that “all those moments will be lost in time…”

The Humane Ai Pin Personal Assistant Isn’t a Phone

The project captured AI tech followers’ attention from its introduction at a TED Talk. The talk was mostly a product demo rather than an outline of the challenges people have in real life and a solution to make life better. Perhaps that should have been a sign for someone to step up and suggest an alternate direction for future product discussions.

A lot of attention has been given to how the announcement on the 9th was handled, with less on how much the device would impact people’s lives. Without belaboring that point, asking the Ai device itself for the order and talking points of the product introduction would have helped sell the device.

A key item the company is leaning on is that the device/service replaces the mobile phone people carry now. This puts people into a comparison mode of thinking, “Yeah, but can it do this thing I do with my phone?” Since people are used to looking at their screens, they can’t envision another way of getting what they need without tapping on a touch-screen device.

The phrase “Personal Digital Assistant” wasn’t someone saying they were adding features to a phone; it described a small pocket device that carried the information a user needed. It was initially a keyboard device, then a pen-entry interface, and now a finger/gesture device with onboard information that can also reach out to the internet for additional services. The PDA was not a better version of the phone people talked to each other on, played Snake on, and kept a list of contacts in. The PDA made it possible to look up a wealth of information, keep a calendar to plan the day with, and jot down a note. Later, apps and internet-connected features were added, and soon after, people found their lives were better with a digital assistant and wanted more.

The Humane Ai Pin is a new way of thinking about getting information, and a device meant to improve a person’s life, but it isn’t a cell phone. Nearly no one has a life that allows them to talk only on speakerphone every time they need to make a call and communicate with others. Using only this device, a user would be cut off from ever getting a doctor’s call and update; the need for personal connections and updates is often the reason a phone is carried.

Just a few thoughts on how the Humane Ai Pin could have been shown making a proactive, positive impact:

  • Saying back a phone number or address someone just gave you, confirming it has been entered into the device’s system for later use, all by just speaking; no need to keep a card to enter later or tap on a screen keyboard.
  • Asking what song is playing, then later asking the Pin to play that song heard in the store around noon yesterday.
  • Any time a message comes in, offering to read it out loud or show it as laser text on the user’s hand; this seems like a day-one feature, but perhaps it is a fast-follow update.
  • Having the Pin play a children’s song for the child in a parent’s arms to fall asleep or sing along with. That brings up an interesting thought: I don’t remember it being covered whether there are environmental volume adjustments; the speaker should know the user’s time and location so it doesn’t blast a reminder at the wrong moment (whisper mode, please).
  • It wasn’t covered: does the device know where it is, so it can give turn-by-turn directions to get to a meeting?
  • Creating a quick text and sending a reminder are the usual use cases shown by other solution providers, making them relatable demos.
  • Will it work as voice control for home automation? I thought I had seen a similar mention but am not able to find it now. Voice-controlling lights is a nice demo, especially if the device is location aware so the command is simply “turn on the lights.”
  • Demoing more creative uses of reminders and timers, like when cooking in a kitchen.
  • Asking the device for information about a person or location while in the car.
  • Reading a summary of an article or meeting notes shared with the user.
  • For fun, asking it to divide a dinner tab among a group of people, where the bill total is mentioned and people’s names are said too. In a small group, no one would skip putting their part in if a device called out how much they owe by name.
  • I’m not providing a full list here, but a discussion of all the information that could be entered and retrieved without a computer and keyboard would make the usability more relatable too.

I look forward to seeing how a bold rethink of information entry and retrieval will be creatively used, but rushing to claim a person will be phone-less because of the Humane Ai Pin will just have people finding all the ways they can’t do things as reasons not to buy.