Swapping genes can help fruit flies regenerate cells

While humans won’t be regenerating entire limbs like sea stars, new genetic work with fruit flies has yielded surprising results. A team from the University of Tokyo found that certain genes that help simple organisms regenerate body parts and tissues can be transferred into other animals. In fruit flies, these transferred genes suppressed age-related intestinal decline and could potentially reveal new mechanisms for rejuvenation in more complex organisms. The findings are detailed in a study published August 1 in the journal BMC Biology.

[Related: These fingernail-sized jellyfish can regenerate tentacles—but how?]

Some animals, including jellyfish and flatworms, can regenerate their whole bodies. Scientists still don’t fully understand how, but there may be specific genes that enable this regeneration. These same genes may also maintain long-term stem cell function.

Stem cells can divide and renew themselves over long periods, and they work a bit like a skeleton key: while they aren’t specialized themselves, they can become more specialized cells, including blood cells and brain cells, over time. Mammals and insects, which have very limited regenerative abilities, may have lost these genes over the course of evolution.

“It is unclear whether reintroducing these regeneration-associated genes in low regenerative animals could affect their regeneration and aging processes,” study co-author and University of Tokyo Graduate School of Pharmaceutical Sciences biologist Yuichiro Nakajima said in a statement.

In this new study, Nakajima and the team focused on a group of genes unique to animals with high regenerative capacity, such as flatworms. These genes are called HRJDs, or highly regenerative species-specific JmjC domain-encoding genes. The researchers transferred the HRJDs into the fruit fly (Drosophila melanogaster) and tracked the flies’ intestinal health with a blue dye; thanks to the resulting hue, they nicknamed the flies Smurf.

Researchers track the intestinal health of fruit flies with a blue dye, hence the nickname Smurf. Fruit fly intestines damaged by aging leak the blue dye; this image shows an HRJD-modified fly on the left and an unmodified fly of the same age on the right. CREDIT: ©2024 Hiroki Nagai CC-BY-ND.

Initially, they hoped that these HRJD-boosted fruit flies would regenerate tissue if injured. This didn’t happen. However, the team had a fruit fly intestine expert, Hiroki Nagai, on board, who noticed something else: some novel phenotypes, or observable characteristics (like eye color or hair color) that come from specific genes.

“HRJDs promoted greater intestinal stem cell division, whilst also suppressing intestinal cells that were mis-differentiating, or going wrong in aged flies,” said Nakajima. 

This is different from antibiotics, which can suppress the mis-differentiated intestinal cells but also suppress intestinal stem cell division.

[Related: Hydras can regrow their heads. Scientists want to know how they do it.]

“For this reason, HRJDs had a measurable effect on the lifespans of fruit flies, which opens the door, or at least provides clues, for the development of new anti-aging strategies,” said Nakajima. “After all, human and insect intestines have surprisingly much in common on a cellular level.”

Fruit flies are famous test subjects in biological research. They share 75 percent of the genes that cause diseases in humans, reproduce quickly, and have a genetic code that is fairly easy to change. However, even with their relatively short lives and rapid reproduction and maturation rates, it still took about two months to study their full aging process.

The two images on the left show intestinal proteins disrupted by aging, and those on the right show the same proteins better preserved against age-related disruption thanks to the HRJD genes. CREDIT: ©2024 Hiroki Nagai CC-BY-ND.

In future studies, the team would like to take a closer look at how HRJDs work on a molecular level.

“Details of the molecular workings of HRJDs are still unresolved. And it’s unclear whether they work alone or in combination with some other component,” said Nakajima. “Therefore, this is just the start of the journey, but we know now that our modified fruit flies can serve as a valuable resource to uncover unprecedented mechanisms of stem cell rejuvenation in the future. In humans, intestinal stem cells decrease in activity with age, so this research is a promising avenue for stem cell-based therapies.”


Best apps for photo editing on Windows in 2024

Capturing and saving memories is easier now than ever. Most modern smartphones have very capable cameras and are small enough to fit into a pocket. But as good as these cameras are, there’s always something you might want to tweak, crop, or change entirely, and photo editing apps let you do just that. While there are editing apps for smartphones, computers still offer more capable experiences most of the time, so we’ve rounded up some of the best apps you can use for photo editing on Windows.

The Galaxy S24 colorized my black-and-white photos and made strawberries brown

https://www.youtube.com/watch?v=9eB7ciQnvb0

An urban legend tells the tale of Google promising a cool feature for its Photos application: the magic ability to turn your old black-and-white photos into colorful pics. Google’s magnificent dataset of photos would help its AI engine figure out what a certain shade of grey should have been, originally. So your grandpa’s blue shirt would become blue again, and the tree behind him would return to its green shade.

Sadly, Google has yet to release this feature for its Photos application (it briefly tested it then took it back). In the meantime, Samsung has already taken the leap and added a Colorize option as part of its Galaxy AI features on the Galaxy S24 series. And it’s incredibly easy to use, but is it any good, for real? Can you rely on Samsung to turn your black-and-white photos into a colorful snap? I did some extensive testing.

You can only colorize what Samsung lets you colorize

Colorizing photos is an option Samsung gatekeeps, in a way. It’s not part of the generative edit magic button, and you can only access it in Samsung’s Gallery app. It’s one of the smart suggestions that show up as a chip (on the bottom left of the photo) when you swipe up on a pic to see its details.

Once you hit the Colorize button, it only takes a few seconds for the Galaxy S24 to pop up the result. Samsung shows a before/after slider, so you can check the result before saving it (or saving it as a copy). The best part is that sometimes it understands that there are focal points in your photo, like in this collage of two of my photos. It gave me the option to zoom in on both faces to see the before/after transformation. Cool.

However, since this is a smart suggestion, I can’t force it on photos that Samsung doesn’t deem worthy of colorizing. I tested the feature with about 30 black-and-white photos, and the Colorize suggestion popped up on all of them except the two below. Both were sushi photos, funnily enough. Since I didn’t get the option here, I can’t force-colorize these pics in any other way. So my sushi platters will remain colorless for now.

Colorizing people and pets

I started my tests with photos of those we love the most — people and pets. Speaking from a personal perspective, the photos I’d want to colorize would be of my grandmas and grandpas, and the early childhood pics of my parents. We have dozens of albums’ worth of black-and-white snaps that could use an extra sparkle like this.

But since I don’t know the real colors of those photos, and since I wanted to test Samsung’s ability to recreate them, I decided to run my test on photos where I have the originals. So I took some photos from my library, dropped the saturation down to zero to make them black-and-white, saved the results as new photos, and transferred them to the Galaxy S24.
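
If you want to reproduce that prep step in bulk on a computer, the sketch below shows one way to do it with Python and the Pillow library. It is not from the article, and the folder names and output naming are placeholder assumptions.

    # Minimal sketch: drop the saturation of every JPEG in a folder to zero and
    # save black-and-white copies to transfer to the phone.
    # Assumes Pillow is installed (pip install Pillow); paths are placeholders.
    from pathlib import Path

    from PIL import Image, ImageEnhance

    SRC_DIR = Path("originals")    # full-color source photos
    DST_DIR = Path("desaturated")  # black-and-white copies to send to the phone
    DST_DIR.mkdir(exist_ok=True)

    for src in SRC_DIR.glob("*.jpg"):
        img = Image.open(src).convert("RGB")
        # A color enhancement factor of 0.0 removes all saturation,
        # i.e. "dropped the saturation down to zero."
        bw = ImageEnhance.Color(img).enhance(0.0)
        bw.save(DST_DIR / f"{src.stem}_bw.jpg", quality=95)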

In the first samples below, you’ll see how Samsung deals with people and pets. In general, it nailed what should be a skin tone (though the side of my face in that collage remained blue), the pink hue of lips, and the color of grass in the background of the dog’s photo.

I would say my husband’s photo above is the most realistic colorization of the bunch. Yes, it failed at getting the colorful ceiling panels above him, but the face and shirt look good, if a little cold. I could add a bit of warmth to that photo and you wouldn’t be able to tell it started as a black-and-white photo.

In general, Samsung nailed the colors of skin, hair, lips, and tongues.

The other two photos above exhibit a filter-like effect. The saturation is too weak, my indigo sweater looks almost black, and the grass beyond the dog is too pale. Given Samsung’s propensity to over-saturate its own camera photos, it’s weird to see it go the other way in this colorization exercise.

My biggest disappointment is the Samoyed’s colorization. A proper AI should recognize the dog’s breed and know Samoyeds are white, not pink-purple. This shouldn’t be a question.

Bringing landscapes back to life

Moving on to landscapes, I had high hopes. This is the exact opposite of the people problem: given an extensive dataset of photos and locations, colorizing a greyscale landscape should be child’s play for an AI engine.

The Galaxy S24 disappointed me a bit here. The first two landscapes, of the Swiss mountains and Lake Bled respectively, turned out too warm and vintage-looking.

The third photo is the best colorization result I’ve seen among all 30+ images I tested. It looks nearly perfect, and both the blue and green hues are quite close to the original pic’s colors.

Adding some colors to flowers

My disappointment continued with photos of flowers. Once again, given a proper dataset, the AI should know the flower’s exact species and figure out exactly what color it should be. In my tests, that wasn’t the case.

I expected Samsung's AI to know the flower's exact species and colorize it accordingly. It didn't.

The first colorization is fine, until you notice the pink center of the hibiscus is nowhere to be found. But the other two are a crime against botany. Gone is the blue-purple of the globe thistle, replaced by a warm yellow-green shade. The orange and yellow of the Peruvian lily are barely colorized, left with a faint blue-green tint. These flowers don’t come in many colors, so Samsung shouldn’t get them wrong.

Colorizing food is hit-and-miss

If you were already thinking that there isn’t a lot of extra “intelligence” going on behind this feature, this should seal the deal. While the burger and fries photo is fine (and probably the second best after the green and blue landscape above), the fruit bowl is an absolute disgrace if you ask me.

Samsung turned strawberries, blueberries and bananas into three shades of brown.

Even in the black and white photo, you can tell these are bananas, blueberries, and strawberries. You can’t turn them into three shades of brown!

The same is true of the pizza photo, where the pink of the ham is brought back as a boring light brown mush. I’d forgive missing the red of the platter because there’s nothing to hint at it in the original photo, but the pink ham should be an easy one.

From simple scenes to challenging colorful scenes

I knew I was hitting a wall with this feature, but I decided to push it a bit more, starting with a few simple scenes that it handled rather well. It didn’t choose the right color tone in the first two underground and cellar photos, but the results are realistic. I’ll also give it some extra points for recognizing the white and red of the lighthouse.

Things became tougher as I went on. Photos with multiple colors result in a boring single-hue colorization. I don’t blame Samsung for not knowing the exact colors here, but it’s proof that the AI has limits. It sees shades of grey and figures out a middle hue value that makes sense. It isn’t using geolocation or a huge public dataset to colorize photos.

Samsung isn't using geolocation or a huge public photo dataset to colorize photos.

Case in point: Bilbao’s building (first photo) becomes an orange mush, and Le Havre’s Catène de Containers (third pic) gets whatever that blue-greenish shade is on the shipping containers.

Pushing further, I tested the Colorize option on extremely busy photos. The results were rather poor in the first two samples (too vintage-feeling and desaturated), while the third one nailed the colors of the trees and lights quite well.

Red is the toughest color

You might have noticed that in many of the examples above, the Galaxy S24’s colorization engine often chooses to veer away from reds. The only times it really added some reddish hues were for pink lips and the lighthouse. For the flowers, strawberries, pizza ham, and many other red elements, it simply ignored the red. Here’s another example: the red lighthouse is turned into a blueish grey.

Verdict: Can Samsung really colorize your black-and-white photos?

Credit: Rita El Khoury / Android Authority

For photos of people, I’m convinced that Samsung will do a good job of bringing those old black-and-white snaps to life. All of the examples I tested (including the ones I didn’t share here for privacy purposes) got a good skin tone, a proper pink for the lips, and decent enough hair coloring. If you have red hair or blue eyes, though, lower your expectations. The Galaxy S24 might not guess that.

For all other photos, the results are mixed. I expected Samsung to use different factors (the subject of the photo, its location, and Samsung’s own image dataset) to get some color back into photos. But as shown by the examples of the dog, the flowers, the various landscapes, and those brown strawberries and blueberries (sorry, I just can’t forget those!), that isn’t the case.

I suspect that Samsung's AI, in search of speed, is dividing photos into specific areas, and sometimes lumping the entire pic into one.

Plus, I suspect that Samsung’s AI engine, in search of speed, attempts to divide each photo into specific areas before colorizing it. Most of the time, though, the whole photo ends up lumped together and gets the same treatment.

You’ll see this in colorized photos that are basically just a single tone (the seaport lighthouse above, the Bilbao buildings, the fruit bowl, the pizza, the flower photos, most of the landscapes, and the dog photo). And that tone is often too warm or too cold, not saturated enough, and definitely too vintage filter-like.

In some photos, though, you get two or three different zones, with a different tone in each. These are the ones that come closest to reality and the original photo. This is true for the first lighthouse photo, the green and blue river landscape, and the burger. Those are my favorites of the bunch.

For old photos of people, you don’t know what you’re missing out on, but you know what you’re gaining. The feature is a win, then.

In conclusion, I’d personally try this feature on photos of people, and I’d expect it to get things mostly right. If I’m colorizing super old photos, I won’t know what colors I’m missing out on, but I’ll know what I’m gaining. So it’s a win. For anything else, it’s a coin toss, and I wouldn’t waste my time. Keep the vintage black-and-white look rather than settling for a bad yellow or blue filter.
