https://www.youtube.com/watch?v=9eB7ciQnvb0
An urban legend tells the tale of Google promising a cool feature for its Photos application: the magic ability to turn your old black-and-white photos into colorful pics. Google's magnificent dataset of photos would help its AI engine figure out what color a certain shade of grey originally was. So your grandpa's blue shirt would become blue again, and the tree behind him would return to its green shade.
Sadly, Google has yet to release this feature for its Photos application (it briefly tested it, then pulled it back). In the meantime, Samsung has already taken the leap and added a Colorize option as part of its Galaxy AI features on the Galaxy S24 series. It's incredibly easy to use, but is it any good, for real? Can you rely on Samsung to turn your black-and-white photos into colorful snaps? I did some extensive testing.
You can only colorize what Samsung lets you colorize
Colorizing photos is an option Samsung reserves for itself, in a way. It's not part of the generative edit magic button, and you can only access it in Samsung's Gallery app. It's one of the smart suggestions that show up as a chip (on the bottom left of the photo) when you swipe up on a pic to see its details.
Once you hit the Colorize button, it only takes a few seconds for the Galaxy S24 to pop up the result. Samsung shows a before/after slider, so you can check the result before saving it (or saving it as a copy). The best part is that sometimes it understands that there are focal points in your photo, like in this collage of two of my photos. It gave me the option to zoom in on both faces to see the before/after transformation. Cool.
However, since this is a smart suggestion, I can't force it on photos that Samsung doesn't deem worthy of colorizing. I tested the feature with about 30 black-and-white photos, and the Colorize suggestion popped up on all of them except the two below. Both are sushi photos, funnily enough. Since I didn't get the suggestion here, there's no other way to force-colorize these pics. So my sushi platters will remain colorless for now.
Colorizing people and pets
I started my tests with photos of those we love the most — people and pets. Speaking from a personal perspective, the photos I’d want to colorize would be of my grandmas and grandpas, and the early childhood pics of my parents. We have dozens of albums’ worth of black-and-white snaps that could use an extra sparkle like this.
But since I don't know the real colors of those photos, and since I wanted to test Samsung's ability to recreate colors, I decided to run my test on photos where I have the originals. So I took some photos from my library, dropped the saturation down to zero to make them black-and-white, saved the results as new photos, and transferred them to the Galaxy S24.
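If you want to recreate that prep step yourself, it boils down to something like this quick Python sketch (assuming the Pillow imaging library; the file names are just placeholders):

```python
from PIL import Image  # Pillow

# Open an original color photo (placeholder file name)
img = Image.open("original.jpg")

# Convert to single-channel greyscale (the same effect as dragging the
# saturation slider to zero), then back to RGB so it saves like a normal photo
bw = img.convert("L").convert("RGB")

# Save as a new copy, leaving the original untouched
bw.save("original_bw.jpg", quality=95)
```

Transfer the copies to the phone, and the Gallery app treats them like any other black-and-white photos.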
In the first samples below, you’ll see how Samsung deals with people and pets. In general, it nailed what should be a skin tone (though the side of my face in that collage remained blue), the pink hue of lips, and the color of grass in the background of the dog’s photo.
I would say my husband's photo above is the most realistic colorization of the bunch. Yes, it failed at getting the colorful ceiling panels above him, but the face and shirt look good, if a little cold. I could add a bit of warmth to that photo and you wouldn't be able to tell it started as a black-and-white photo.
In general, Samsung nailed the colors of skin, hair, lips, and tongues.
The other two photos above exhibit a filter-like effect. The saturation is too weak, my indigo sweater looks almost black, and the grass beyond the dog is too pale. Given Samsung’s propensity to over-saturate its own camera photos, it’s weird to see it go the other way in this colorization exercise.
My biggest disappointment is the Samoyed's colorization. A proper AI should recognize the dog's breed and know Samoyeds are white, not pink-purple. This shouldn't be a question.
Bringing landscapes back to life
Moving on to landscapes, I had high hopes. This is the exact opposite of people: given an extensive dataset of photos and locations, colorizing a greyscale landscape should be child's play for an AI engine.
The Galaxy S24 disappointed me a bit here. The first two landscapes, of the Swiss mountains and Lake Bled respectively, turned out too warm and vintage-looking.
The third photo is the best colorization result I’ve seen among all 30+ images I tested. It looks nearly perfect, and both the blue and green hues are quite close to the original pic’s colors.
Adding some color to flowers
My disappointment continued with photos of flowers. Once again, given a proper dataset, the AI should know a flower's exact species and figure out what color it should be. In my tests, that wasn't the case.
I expected Samsung's AI to know the flower's exact species and colorize it accordingly. It didn't.
The first colorization is fine, until you notice the pink center of the hibiscus is nowhere to be found. But the other two are nothing short of crimes against botany. Gone is the blue-purple of the globe thistle, replaced by a warm yellow-green shade. The orange and yellow of the Peruvian lily are barely colorized, replaced by a faint blue-green tint. These flowers don't come in many colors, so Samsung shouldn't get them wrong.
Colorizing food is hit-and-miss
If you were already thinking that there isn’t a lot of extra “intelligence” going on behind this feature, this should seal the deal. While the burger and fries photo is fine (and probably the second best after the green and blue landscape above), the fruit bowl is an absolute disgrace if you ask me.
Samsung turned strawberries, blueberries, and bananas into three shades of brown.
Even in the black and white photo, you can tell these are bananas, blueberries, and strawberries. You can’t turn them into three shades of brown!
The same is true of the pizza photo, where the pink of the ham is brought back as a boring light brown mush. I’d forgive missing the red of the platter because there’s nothing to hint at it in the original photo, but the pink ham should be an easy one.
From simple scenes to challenging colorful scenes
I knew I was hitting a wall with this feature, but I decided to push it a bit more, first with a few simple scenes, which it handled rather well. It didn't choose the right color tone in the first two photos (the underground and the cellar), but the results are realistic. I'll also give it some extra points for recognizing the white and red of the lighthouse.
Things became tougher as I went on. Photos with multiple colors result in a boring single-hue colorization. I don’t blame Samsung for not knowing the exact colors here, but it’s proof that the AI has limits. It sees shades of grey and figures out a middle hue value that makes sense. It isn’t using geolocation or a huge public dataset to colorize photos.
Samsung isn't using geolocation or a huge public photo dataset to colorize photos.
Case in point: Bilbao's building (first photo) and Le Havre's Catène de Containers (third pic) become an orange mush and whatever that blue-green-ish shade on the shipping containers is supposed to be, respectively.
Pushing further, I tested the Colorize option on extremely busy photos. The results were a bit rough in the first two samples (too vintage-feeling and desaturated), while the third one nailed the colors of the trees and lights quite well.
Red is the toughest color
You might have noticed that in many of the examples above, the Galaxy S24's colorization engine often veers away from reds. The only times it really added some reddish hues were for pink lips and the lighthouse. For the flowers, strawberries, pizza ham, and many other red elements, it just ignored the red color. Here's another example: the red lighthouse is turned into a bluish grey.
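One plausible explanation (my hypothesis, not anything Samsung has confirmed): once a photo is reduced to greyscale, wildly different colors collapse onto nearly the same grey value, and a saturated red in particular lands in the same mid-dark range as browns and blue-greys. A quick back-of-the-envelope check with the BT.601 luma formula (the weighting Pillow's greyscale conversion uses, for instance; the sample colors below are made up) shows the problem:

```python
# BT.601 luma: a common weighting for RGB-to-greyscale conversion
def luma(r, g, b):
    return 0.299 * r + 0.587 * g + 0.114 * b

# A saturated red, a plain grey, a blue-grey, and a dull brown (example values)
samples = {
    "pure red":   (255, 0, 0),
    "mid grey":   (76, 76, 76),
    "blue-grey":  (70, 76, 90),
    "dull brown": (120, 60, 20),
}

for name, (r, g, b) in samples.items():
    print(f"{name:10s} -> luma {luma(r, g, b):.0f}")

# All four land in the 73-76 range: with the color gone, nothing in the pixel
# values themselves tells a red surface apart from a grey or brown one.
```

If that's roughly what the engine is working from, defaulting to a safe, muted middle hue instead of betting on red is an understandable choice, even if the results disappoint.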
Verdict: Can Samsung really colorize your black-and-white photos?
For photos of people, I'm convinced that Samsung will do a good job of bringing those old black-and-white snaps to life. All of the examples I tested (including the ones I didn't share here for privacy reasons) got a good skin tone, a proper pink for the lips, and decent enough hair coloring. If you have red hair or blue eyes, though, lower your expectations; the Galaxy S24 might not guess that.
For all other photos, the results are mixed. I expected Samsung to use different factors (the subject of the photo, its location, and Samsung's own image dataset) to get some color back into photos. But as shown by the examples of the dog, the flowers, the various landscapes, and those brown strawberries and blueberries (sorry, I just can't forget those!), that isn't the case.
I suspect that Samsung's AI, in search of speed, is dividing photos into specific areas, and sometimes lumping the entire pic into one.
Plus, I have a suspicion that Samsung's AI engine, in search of speed, attempts to divide a photo into specific areas before it colorizes it. Most of the time, the whole photo ends up lumped together and gets the same treatment.
You'll see this in colorized photos that are basically just a single tone (the sea port lighthouse above, the Bilbao buildings, the fruit bowl, the pizza, the flower photos, most of the landscapes, and the dog photo). And that tone is often too warm or too cold, not saturated enough, and definitely too reminiscent of a vintage filter.
In some photos, though, you get two or three different zones, with a different tone for each. These are the ones that come closest to reality and to the original photo. This is true for the first lighthouse photo, the green and blue river landscape, and the burger. Those are my favorites of the bunch.
For old photos of people, you don't know what you're missing out on, but you know what you're gaining. The feature is a win, then.
In conclusion, I'd personally try this feature on photos of people, and I'd expect it to get things mostly right. If I'm colorizing super old photos, I won't know what colors I'm missing out on, but I'll know what I'm gaining. So it's a win. For anything else, it's a coin toss, and I wouldn't waste my time. Keep the vintage black-and-white look instead of a bad yellow or blue filter.