Forget generative AI — this is how the Galaxy S24 could better use AI to help me lead a healthier life
- Nishadil
- January 09, 2024
I had my annual physical last week, and while I thought I had been doing better of late with my health and fitness, the blood work told a different story. The results indicated that I have high blood pressure, which is a shocking first for me. I intend to fix it, stat, before my next checkup. That's where the upcoming Galaxy S24 could help me out.
Since artificial intelligence is poised to be the defining feature for all three Galaxy S24 models, everyone (including myself) expects it to offer AI-assisted features similar to those found in the Pixel 8, and perhaps much more. But rather than gaining some sort of generative AI feature for editing photos and video, I'd much rather Samsung focus its Galaxy AI efforts on health and wellness.
It's an area that's underserved, quite frankly, and the Galaxy S24 series could genuinely help people like me live healthier lives. With the next Samsung Galaxy Unpacked right around the corner, it won't be long before every AI-powered feature is detailed. Here's how the Galaxy S24 could make me healthier this year with AI.
If you're acquainted with image-search services like Google Lens, then you understand how the Galaxy S24 could lean on artificial intelligence to identify what's in images captured by its cameras. It would be fantastic if I could take a photo of my meal and everything I'm eating, which the phone could then analyze to give me the nutritional information.
That way, I can tell if the foods I'm eating are bad for my high blood pressure. But you know what? I want such a feature to be smarter about how it identifies what I'm eating, especially when those cravings hit and I'm prowling the neighborhood fast food joints. Not only would I want an AI-powered tool that analyzes what I'm about to scarf down, but the real power of artificial intelligence would come from knowing whether I'm having the grilled chicken sandwich from Chick-fil-A or from Wendy's, something it should be able to work out from the phone's GPS coordinates.
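To make that idea concrete, here is a rough sketch of what the location step could look like. Everything in it is hypothetical: the restaurant names, coordinates, menu items, and sodium figures are placeholder data I made up for illustration, and a real implementation would query a live places and menu service instead of a hard-coded dictionary.

```python
from math import radians, sin, cos, asin, sqrt

# Hypothetical menu database: restaurant -> location plus {menu item: sodium in mg}.
# All coordinates and nutrition values below are placeholders for illustration.
RESTAURANTS = {
    "Chick-fil-A (Main St)": {
        "location": (40.7130, -74.0060),
        "menu": {"grilled chicken sandwich": 680, "waffle fries": 240},
    },
    "Wendy's (5th Ave)": {
        "location": (40.7200, -74.0010),
        "menu": {"grilled chicken sandwich": 890, "baked potato": 40},
    },
}

def haversine_km(a, b):
    """Great-circle distance between two (lat, lon) points, in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (*a, *b))
    h = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(h))

def lookup_meal(phone_coords, recognized_food):
    """Pick the nearest known restaurant and pull nutrition info for the recognized food."""
    name, info = min(
        RESTAURANTS.items(),
        key=lambda kv: haversine_km(phone_coords, kv[1]["location"]),
    )
    sodium_mg = info["menu"].get(recognized_food)
    return name, sodium_mg

if __name__ == "__main__":
    place, sodium = lookup_meal((40.7128, -74.0059), "grilled chicken sandwich")
    print(f"Nearest match: {place}, sodium: {sodium} mg")
```

The point is simply that once the phone knows which restaurant I'm standing in, the list of foods the camera has to choose between shrinks from "everything" to one menu.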
Either way, I shouldn't have to dial in those details myself; AI should be smart enough to do it for me. While all of this would be helpful for me, the interesting part is that taking photos of meals and foods isn't new at all. In fact, there are a handful of fitness and calorie-counting apps that already let you take snapshots of your food in order to pull up all the nutritional information you're looking for.
For example, the MyFitnessPal app does exactly this, but there's a caveat: the feature is locked behind a paywall that costs $19.99 per month or $79.99 per year. There's a one-month free trial if you're so inclined, but I can't justify paying for yet another subscription. With the Galaxy S24, rumors hint at the possibility that certain AI features could require a subscription.
I hope that's not the case, mainly because it would set a precedent for other smartphone makers to do the same with their respective AI plans. Subscription aside, the other problem with MyFitnessPal's meal-scanning feature is that while it can identify foods, you still need to enter the portion size yourself. Again, I feel that AI could be smarter about how it handles this.
Don't you hate how advertisements make some foods look much bigger than what you actually get? Take any fruit, which can come in all shapes and sizes. A photo alone isn't enough for AI to determine an accurate portion or size, and while I'd love to carry my digital scale around with me to weigh my food, it's simply not practical.
There could be several ways to help AI determine portion sizes, but the most likely would be tied to the Galaxy S24 Ultra's S Pen. If the stylus were placed in the photo alongside your food, Galaxy AI could determine not only what you're eating and all the accompanying nutritional information, but potentially the portion size as well.
Size does matter, so having the S Pen in the snapshot as a reference point could deliver a better readout of what you're eating, down to quantity and volume. Now that I'm at the age when I need to be more mindful about what I eat, I can't be bothered with second-guessing whether this or that food puts me over my recommended sodium intake.
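As a back-of-the-envelope illustration of why a known-size reference object helps, here is a minimal sketch of the math involved. It assumes the S Pen is roughly 105 mm long (my approximation), that the pen and the food sit at about the same distance from the camera, and that the food is roughly spherical; the pixel measurements and density figure are made-up example numbers, not anything Samsung has announced.

```python
from math import pi

S_PEN_LENGTH_MM = 105.0  # assumed real-world length of the S Pen (approximate)

def mm_per_pixel(pen_length_px):
    """Derive the image scale from the S Pen's apparent length in pixels."""
    return S_PEN_LENGTH_MM / pen_length_px

def estimate_portion_grams(food_diameter_px, pen_length_px, density_g_per_cm3=0.9):
    """Estimate the weight of a roughly spherical food item (e.g. an apple).

    Converts the food's pixel diameter to millimeters using the S Pen as a
    scale reference, computes the volume of a sphere, then multiplies by an
    assumed density. Perspective and depth are ignored, so this is only a
    rough estimate.
    """
    diameter_mm = food_diameter_px * mm_per_pixel(pen_length_px)
    radius_cm = diameter_mm / 2 / 10
    volume_cm3 = (4 / 3) * pi * radius_cm ** 3
    return volume_cm3 * density_g_per_cm3

if __name__ == "__main__":
    # Example: the pen spans 840 px and the apple spans 600 px in the same photo.
    grams = estimate_portion_grams(food_diameter_px=600, pen_length_px=840)
    print(f"Estimated portion: {grams:.0f} g")
```

From there, the same nutrition lookup described earlier could scale its sodium and calorie figures to the estimated weight instead of assuming a generic serving size.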
If Samsung is really serious about AI, then it would be best served helping all of us lead healthier lives by making it super simple and intuitive to know what we're eating.