Do you use ChatGPT a lot? A study suggests you might be lonelier for it.

Imagine venting about your awful day to a chatbot, only to feel even more isolated afterward. While most people use ChatGPT for practical tasks, a small minority of heavy users may be trading instant comfort for growing loneliness or even emotional dependence, according to new research from OpenAI and the Massachusetts Institute of Technology (MIT).
First study: Behavioural analysis
The results come from two investigations. The first was an observational analysis of around 40 million ChatGPT interactions, in which researchers used automated classifiers to look for signs of emotional engagement in conversations. By evaluating more than 4 million chats and running targeted user surveys, the team was able to relate the kinds of interactions users had with the chatbot to their self-reported attitudes.
This large-scale study offered a broad view of ChatGPT usage and showed that affective, or emotionally expressive, interactions are relatively uncommon. However, a small fraction of heavy users—especially those using ChatGPT’s Advanced Voice Mode—showed notable emotional attachment, with some even calling the AI a friend.
Second study: Controlled trial
The second study was a four-week randomized controlled trial with about 1,000 participants. This Institutional Review Board-approved study set out to determine how different ChatGPT features, such as text versus voice conversations, might affect users’ wellbeing. Participants were randomly assigned to different model configurations so the researchers could measure effects on social relationships, loneliness, emotional dependence, and potentially problematic technology use.
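As a rough illustration of the random-assignment step (not the study’s actual protocol), balanced allocation to configurations might look something like the sketch below; the condition names are placeholders standing in for whatever configurations the trial actually used.

```python
import random

# Placeholder condition names; the trial's real configurations aren't
# specified here beyond "text versus voice conversations".
CONDITIONS = ["text", "neutral_voice", "engaging_voice"]

def assign_conditions(participant_ids, seed=0):
    """Shuffle participants with a fixed seed, then deal them out round-robin
    so the groups end up the same size."""
    rng = random.Random(seed)
    shuffled = list(participant_ids)
    rng.shuffle(shuffled)
    return {pid: CONDITIONS[i % len(CONDITIONS)] for i, pid in enumerate(shuffled)}

if __name__ == "__main__":
    participants = [f"P{n:04d}" for n in range(1000)]  # about 1,000, per the study
    groups = assign_conditions(participants)
    print(groups["P0000"], list(groups.values()).count("text"))
```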
The trial found that while short-term voice-mode use was associated with better emotional wellbeing, heavy use over an extended period could have the opposite effect. Notably, personal conversations—those in which users and the AI engaged in more emotionally charged dialogue—were linked to higher levels of loneliness than neutral, task-focused interactions.
Emotional dependence on the chatbot
Why the mixed outcomes? Well, that depends on you. According to the research, non-personal conversations may unintentionally increase emotional dependence on the chatbot, especially among heavy users. Meanwhile, people who were already prone to attachment in relationships and who thought of the AI as a friend were more likely to experience negative effects. This suggests that individual characteristics, such as emotional needs and baseline loneliness, strongly shape how AI interactions affect wellbeing.
The majority of users, notably, are fine: ChatGPT is typically asked for spreadsheet help rather than introspection. Still, the research identifies a specific group at risk: frequent voice users who rely on the bot for emotional support. Think late-night voice conversations about existential dread, or oversharing of personal drama. Though rare, these users showed the strongest signs of dependence and loneliness.
To evaluate these patterns, the researchers built an automated classifier suite called EmoClassifiersV1. Powered by a large language model, these tools were designed to detect specific affective cues in conversations. The classifiers were organized in two tiers: top-level classifiers that flagged broad themes such as loneliness and problematic use, and sub-classifiers that focused on finer-grained aspects of both user and AI messages. This two-tier approach let the team evaluate millions of interactions quickly while preserving user privacy and still picking up subtle behavioral signals.
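The paper’s actual prompts and labels aren’t reproduced in this article, but a two-tier, LLM-backed labeling pipeline can be sketched roughly as follows. The theme names, prompts, and the ask_llm() stand-in are illustrative assumptions, and running sub-classifiers only when a top-level theme fires is just one plausible way to keep the number of model calls down.

```python
# Minimal sketch of a two-tier, LLM-backed conversation classifier in the spirit
# of what the paper describes. Prompts, labels, and ask_llm() are hypothetical;
# the actual EmoClassifiersV1 prompts are not reproduced here.

TOP_LEVEL_PROMPTS = {
    "loneliness": "Does the user express loneliness in this conversation?",
    "problematic_use": "Does this conversation show signs of compulsive or problematic use?",
}

SUB_PROMPTS = {
    "loneliness": {
        "companionship": "Does the user treat the assistant as a companion or friend?",
        "self_disclosure": "Does the user share intimate personal details?",
    },
    "problematic_use": {
        "preoccupation": "Does the user describe thinking about the chatbot when not using it?",
    },
}

def ask_llm(prompt: str, conversation: str) -> bool:
    """Stand-in for a yes/no judgment from a large language model. A crude
    keyword check is used here only so the sketch runs end to end."""
    keywords = {"lonely", "alone", "my only friend", "can't stop"}
    return any(k in conversation.lower() for k in keywords)

def classify_conversation(conversation: str) -> dict:
    """Run the top-level classifiers first, then run sub-classifiers only for
    themes that were flagged, keeping the number of model calls small."""
    results = {}
    for theme, prompt in TOP_LEVEL_PROMPTS.items():
        flagged = ask_llm(prompt, conversation)
        results[theme] = {"flagged": flagged, "sub": {}}
        if flagged:
            for name, sub_prompt in SUB_PROMPTS[theme].items():
                results[theme]["sub"][name] = ask_llm(sub_prompt, conversation)
    return results

if __name__ == "__main__":
    print(classify_conversation("honestly you feel like my only friend lately"))
```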
The takeaway
The study thus raises questions about how “human-like” AI ought to behave. ChatGPT’s voice mode, designed to be engaging, walks a fine line: lively banter might improve a 10-minute chat, while the same energy over marathon conversations could backfire. According to the study, these findings are being used to improve the models, with the aim of blending helpfulness with nudges toward healthier usage.
But the lesson here isn’t simply “AI bad.” Most people still treat ChatGPT as a tool rather than a therapist. The real takeaway? Boundaries matter. If your nightly ritual involves hours of philosophizing with a chatbot, it might be time to call a human friend. As the study soberly points out, AI isn’t meant to replace relationships, and if you let it, don’t expect a happy ending.
The paper stresses that this is preliminary research, not a final verdict. The teams plan further studies to clarify how AI interactions shape, or distort, our social lives. For now, their suggestion is straightforward: treat ChatGPT as a Swiss Army knife, not a diary. And if you catch yourself calling it “mate”? Perhaps log out and call a real person.
Apple iOS 18: Features, Release Date, Leaks, and Compatible Devices (2025 Update)

With a strong emphasis on personalization, privacy features, and AI integration, Apple’s iOS 18 is expected to be the most sophisticated iPhone operating system to date. Leaks and reports have begun to paint a picture of what iOS 18 might deliver as WWDC 2025 draws near.
In this post, we’ll go over the most recent information: leaked features, the anticipated release date, and the devices expected to be compatible with iOS 18.
iOS 18 Expected Release Date
Apple typically announces the next major version of iOS each June at its Worldwide Developers Conference (WWDC).
WWDC 2025 (anticipated): June 10–14
iOS 18 developer beta: June 2025
Public beta: July 2025
Final release: September 2025 (alongside the iPhone 17)
iOS 18’s Top Leaked & Anticipated Features
1. AI-Powered Siri Update
Siri is expected to gain ChatGPT-like intelligence.
Natural dialogue and improved context management
2. Smarter Notifications
AI will summarize and organize notifications.
Quiet hours and priority alerts
3. Interactive Home Screen Widgets
Users can finally control apps directly from widgets.
4. A New Control Center
More personalization choices and a revamped design
5. Enhanced Privacy Dashboard
See what data apps use and get weekly reports.
6. AI-Powered Smarter Auto-Reply
Auto-generated responses for emails, texts, and phone calls
7. Context-Based App Suggestions
Based on time, location, or usage patterns
8. New Lock Screen Options
More on-screen information and deeper customization
9. Improved Battery Management
Finer control over background activity
Devices Supported by iOS 18 (Anticipated)
If the reports are to be believed, iOS 18 will support every iPhone that can run iOS 17. These include:
iPhone 15 series
iPhone 14 series
iPhone 13 series
iPhone 12 series
iPhone 11 series
iPhone SE (2nd generation and later)
Support for older models, such as the iPhone X or iPhone 8, may be discontinued.
How to Access the Early iOS 18 Beta
1. Register at https://beta.apple.com for Apple’s Beta Software Program.
2. Set up the configuration profile.
3. Back up your iPhone.
4. Await the June/July 2025 beta release.
—
With its emphasis on artificial intelligence, user control, and data protection, Apple iOS 18 is shaping up to be a major step forward. Whether you’re a developer, a tech enthusiast, or an everyday user, iOS 18 is worth watching.
As the WWDC 2025 keynote approaches, bookmark this page for the most recent information.
WhatsApp Is Balancing Privacy and AI Features

WhatsApp’s AI tools will use a new “Private Processing” mechanism to tap the cloud without exposing end-to-end encrypted messages to Meta or anyone else. Experts still see risks, though.
In the coming weeks, WhatsApp, the end-to-end encrypted communication app used by roughly 3 billion people worldwide, will introduce cloud-based AI capabilities. These features are meant to give users tools for composing and summarizing messages while preserving WhatsApp’s core security and privacy guarantees.
Meta has been integrating generative AI features into its services, built on Llama, its open source large language model. WhatsApp already includes a light blue circle that gives users access to the Meta AI assistant. Many users have objected to this addition, however, because interactions with the AI assistant are not shielded from Meta the way end-to-end encrypted WhatsApp messages are. The new feature, called Private Processing, is intended to allay these concerns with what the company says is a carefully designed, purpose-built platform dedicated to processing data for AI tasks without the information being accessible to Meta, WhatsApp, or any other party.
Although researchers’ initial assessments of the scheme’s integrity have been favorable, some caution that the shift toward AI features could start WhatsApp down a slippery slope.
WhatsApp is targeted and scrutinized by many different researchers and threat actors, which means its threat model is well understood internally, says Chris Rohlf, director of security engineering at Meta. “This wasn’t just about managing the expansion of that threat model and making sure the expectations for privacy and security were met—it was about carefully considering the user experience and making this opt-in because there’s also an existing set of privacy expectations from users.”
Only the sender and the recipient, or the participants in a group chat, can access end-to-end encrypted conversations. By design, the service provider—in this case, WhatsApp and its parent company Meta—is locked out and cannot access users’ calls or messages. This setup is incompatible with typical generative AI platforms, which need access to users’ requests and data in order to run large language models on cloud servers. The goal of Private Processing is to build an alternative framework that enables AI while preserving the privacy and security assurances of end-to-end encrypted communication.
Use of WhatsApp’s AI features is optional, and by turning on a new WhatsApp setting called “Advanced Chat Privacy,” users can also stop others from using the AI features in shared chats.
WhatsApp stated in a blog post last week that “you can block others from exporting chats, auto-downloading media to their phone, and using messages for AI features when the setting is on.” As with disappearing messages, anyone in a chat can toggle Advanced Chat Privacy on or off, and the change is logged for everyone to see, so participants just need to stay aware of the setting.
Private Processing uses specialized hardware to isolate sensitive data in a Trusted Execution Environment (TEE), a locked-down, siloed region of a processor. The system is built to process and store data for the shortest time possible, and it is designed to grind to a halt and issue alerts if it detects any tampering or modification. WhatsApp is already inviting third parties to audit components of the system and will include Private Processing in the Meta bug bounty program to encourage the security community to report bugs and potential vulnerabilities. Meta says it ultimately plans to open source the Private Processing components, both to make it easier for others to build comparable services and to allow further validation of its security and privacy assurances.
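Meta has not published client code for this flow, but the description above implies a pattern roughly like the sketch below: check the opt-in setting, verify the enclave’s attestation, and encrypt the request with an ephemeral key so only the attested environment can read it. Every name and step here is an illustrative assumption, not WhatsApp’s implementation.

```python
# Illustrative sketch of an opt-in, enclave-gated AI request flow. This is NOT
# WhatsApp's implementation; every function and step is a hypothetical stand-in
# based on the public description (opt-in, attestation, ephemeral processing).
import os

def fetch_attestation() -> bytes:
    """Stand-in for fetching the enclave's signed attestation document."""
    return b"attestation-document"

def verify_attestation(attestation_doc: bytes) -> bool:
    """Stand-in for verifying that the remote environment matches a known-good
    TEE measurement; a real client would check cryptographic signatures."""
    return attestation_doc == b"attestation-document"

def encrypt_for_enclave(session_key: bytes, plaintext: str) -> bytes:
    """Stand-in for encrypting the request so only the attested enclave can
    read it; a real system would use an authenticated encryption scheme."""
    data = plaintext.encode()
    return bytes(b ^ session_key[i % len(session_key)] for i, b in enumerate(data))

def request_summary(messages: list[str], user_opted_in: bool) -> bytes | None:
    """Only send data off-device if the user opted in and attestation passes."""
    if not user_opted_in:
        return None  # AI features are opt-in; nothing leaves the device
    if not verify_attestation(fetch_attestation()):
        raise RuntimeError("Enclave attestation failed; refusing to send data")
    session_key = os.urandom(32)  # ephemeral key, discarded after the request
    payload = encrypt_for_enclave(session_key, "\n".join(messages))
    return payload  # would be sent to the enclave endpoint and processed there

if __name__ == "__main__":
    print(request_summary(["hey, running late", "be there in 10"], user_opted_in=True))
```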
Apple introduced a comparable scheme last year, called Private Cloud Compute, for its Apple Intelligence AI platform. Users can also enable the service in Messages, Apple’s end-to-end encrypted communication app, to generate “Smart Reply” suggestions and message summaries on Macs and iPhones.
Comparing Private Processing with Private Cloud Compute is, well, comparing apples and oranges. Apple’s Private Cloud Compute underpins all of Apple Intelligence. Private Processing, by contrast, was developed specifically for WhatsApp and does not support Meta’s AI capabilities more broadly. Apple Intelligence is also designed to do as much AI processing as possible on-device and to reach out to the Private Cloud Compute infrastructure only when necessary. Because such “on-device” or “local” processing requires powerful hardware, Apple built Apple Intelligence to run only on its most recent generations of mobile hardware; older iPhones and iPads will never support it.
Apple is a maker of high-end smartphones and other hardware, while Meta is a software company with roughly 3 billion users who own all sorts of smartphones, including low-end and older models. According to Rohlf and Colin Clemmons, one of the lead engineers on Private Processing, it was not practical to design AI features for WhatsApp that could run locally on the range of devices WhatsApp serves. Instead, WhatsApp focused on making Private Processing as worthless as possible to attackers in the event that it is ever compromised.
“The design is one of risk minimization,” says Clemmons. “The value of compromising the system should be minimized.”
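To make the architectural contrast above concrete, here is a deliberately simplified sketch (not Apple’s or Meta’s actual code) of the two policies: local-first inference with a hardened-cloud fallback, versus opt-in, cloud-only processing for a fleet of widely varying devices.

```python
# Simplified illustration of the two routing policies described above.
# Neither function reflects real Apple or Meta code; all names are hypothetical.

def apple_style_route(prompt: str, device_can_run_model: bool) -> str:
    """Local-first: run on-device when the hardware allows it, and fall back to
    a hardened cloud environment (Private Cloud Compute) only when necessary."""
    if device_can_run_model:
        return f"on-device result for: {prompt}"
    return f"private-cloud result for: {prompt}"

def whatsapp_style_route(prompt: str, user_opted_in: bool) -> str | None:
    """Cloud-only: the installed base of devices is too varied for local
    inference, so every opted-in request takes the hardened cloud path
    (Private Processing); nothing is sent unless the user opts in."""
    if not user_opted_in:
        return None
    return f"private-processing result for: {prompt}"

if __name__ == "__main__":
    print(apple_style_route("summarize my notes", device_can_run_model=False))
    print(whatsapp_style_route("summarize this chat", user_opted_in=True))
```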
The entire endeavor, though, raises the more fundamental question of whether a secure messaging app like WhatsApp should offer AI features at all. Meta’s position is that users now expect these features and will go wherever they have to in order to get them.
“A lot of people want to use AI tools to help them when they are messaging,” WhatsApp head Will Cathcart told WIRED in an email. “We believe that creating a private method to accomplish that is crucial, as users shouldn’t have to move to a less private platform in order to access the features they require.”
Any end-to-end encrypted system that makes use of off-device AI inference, though, is going to be riskier than a purely end-to-end system, says Matt Green, a cryptographer at Johns Hopkins University who has not evaluated the entire system but has reviewed some of Private Processing’s privacy promises. “You’re sending data to a computer in a data center, and that machine sees your private texts,” he says. “When WhatsApp claims that they have made this as safe as possible and that they are unable to read your texts, I believe them. However, I believe there are dangers involved as well. More private information will leave the device, and hackers and nation-state adversaries will target the computers that handle this data.”
WhatsApp also says Private Processing could lay the groundwork for more complex AI features down the line, ones that go beyond simple text summarization and writing suggestions and would require processing, and perhaps storing, more data.
According to Green, “any and all of this will make the Private Processing computers into a very big target given all the crazy things people use secure messengers for.”
GTA 6 delayed: Rockstar reveals the new release date.

Grand Theft Auto VI, one of the most anticipated video game releases in years, will now debut on May 26, 2026, according to Rockstar Games. GTA 6 had previously been expected to arrive in 2025, after the first official trailer was released in 2023.
The studio publicly addressed the delay in a statement released on May 2 and thanked fans for their continued patience and support. Although Rockstar admitted that some people might be disappointed by the revised release window, the company maintained that the extra time is required to live up to the anticipation for the upcoming installment in its flagship franchise.
“We are very sorry that this is later than you expected,” the company wrote. “We hope you understand that we need this extra time to deliver at the level of quality you expect and deserve.”
The statement comes after months with few updates since the game’s initial trailer, which had fueled growing online speculation about a possible delay. Grand Theft Auto VI will be available on PlayStation 5 and Xbox Series X/S, according to Rockstar.
GTA 6 is set in the fictional state of Leonida, widely seen as a reimagined Florida. Rockstar calls it “the biggest, most immersive evolution of the Grand Theft Auto series yet.”
Since Grand Theft Auto V’s historic success—it has sold over 210 million copies since its 2013 release—anticipation for GTA VI has remained high. The long-lasting appeal of that game contributed to the belief that its follow-up would establish a new standard for open-world gaming. Although Rockstar has not yet disclosed any GTA VI gameplay specifics, the studio promised to do so “soon.”