Curio Cabinet / Daily Curio
-
STEM Daily Curio #3098
These dinosaurs might have been impressive to look at, but their table manners were awful. While most animals have to chew their food thoroughly, it seems that wasn’t the case for sauropods, some of the largest dinosaurs ever to walk the Earth. Based on a recently discovered fossil, scientists now believe that sauropods hardly chewed their food at all.
Sauropods were members of Sauropoda, a clade of enormous, long-necked, vegetarian dinosaurs. Yet, for a long time, scientists didn’t know many specifics about sauropod diets. Paleontologists assumed that they ate plants based on two factors: they had flat teeth, which are well suited to processing plant matter, and they were so huge that, much like large herbivores today, they couldn’t feasibly have depended on anything other than plants. Besides, their gigantic bodies, long necks, and long tails would have made them clumsy hunters. Now, not only do we have confirmation that sauropods ate plants, we also know quite a bit about how they did it.
Researchers discovered a cololite—fossilized intestinal contents—that belonged to Diamantinasaurus matildae, a species of sauropod that lived around 100 million years ago. By performing a CT scan on the cololite, they found that the remains were composed entirely of plant matter. The leaves of the fern-like plant were largely intact, suggesting that the sauropod barely chewed them before swallowing. This means that sauropods were probably bulk feeders, ingesting as much plant matter as possible and relying on the natural fermentation process inside their digestive systems to break down their food. It’s a more extreme version of what many herbivores do today. Cows and other ruminants rely on fermentation to digest their food, and they also spend much of their time ruminating, which means they regurgitate their food to chew it again. You really needed a strong stomach to live in the Cretaceous period.
[Image description: A black-and-white illustration of a long-necked sauropod dinosaur.] Credit & copyright: Pearson Scott Foresman, Wikimedia Commons. This work has been released into the public domain by its author, Pearson Scott Foresman. This applies worldwide.
-
Music Appreciation Daily Curio #3097
You’ll probably never hear someone sing it at a karaoke bar, but it’s still the most frequently sung song in English. Happy Birthday is an indispensable part of birthday celebrations around the world, and the composer of its melody, Mildred J. Hill, was born this month in 1859 in Louisville, Kentucky. Hill came up with the now-famous tune in 1893, and the lyrics were written by her sister Patty, but the song they wrote wasn’t actually Happy Birthday. Instead, it was called Good Morning to All, and it was meant to be sung by a teacher and their classroom. Patty was a pioneer in early childhood education. In fact, she is credited as the inventor of the modern concept of kindergarten, and she sang Good Morning to All in her own classroom as a daily greeting.
The Hill sisters published Good Morning to All and other compositions in 1893’s Song Stories for the Kindergarten. Soon, the melody took on a life of its own. No one knows exactly how it happened, but the tune began to be used to wish people a happy birthday. One credible account credits the Hill sisters themselves, who are said to have changed the lyrics during a birthday get-together they were attending. Regardless of how it happened, Happy Birthday began to spread. By the early 20th century, the song appeared in movies, plays, and even other songbooks without crediting the Hill sisters. Mildred passed away in 1916 and Patty in 1946, with neither credited as an originator of Happy Birthday. Their youngest sister, Jessica Hill, took it upon herself to copyright the song and have the publisher of Song Stories for the Kindergarten re-release it in 1935. The rights eventually went to another publishing company and remained privately held for decades, which is why movies had to pay royalties to use the song, and why restaurants wishing their patrons a happy birthday had to sing a proprietary or royalty-free song instead. Then, in 2013, the publishing company was taken to court with claims that the copyright to Happy Birthday had expired years earlier. Finally, in 2016, the song entered the public domain. It’s a short and simple ditty, but its story is anything but.
[Image description: A birthday cake with lit candles in a dark setting.] Credit & copyright: Fancibaer, Wikimedia Commons. This file is made available under the Creative Commons CC0 1.0 Universal Public Domain Dedication.
-
Engineering Daily Curio #3096
When it comes to engineering, there are always new uses for old standbys. Putting ice in your drink is a pretty rudimentary way to keep cool when it’s hot out, but Manhattan is putting a new twist on it by using ice to cool an entire building. Most modern air conditioners are a double-edged sword because, while they keep people comfortable and safe from extreme heat, they also consume a lot of electricity. As average global temperatures continue to rise, that puts more and more strain on cities’ power grids, especially during peak daytime hours. The cooling system at New York City’s iconic Eleven Madison building is different. It does most of its work at night, when the city’s energy grid isn’t nearly as taxed.
Created by Trane Technologies, the system is called an ice battery. Every night, it uses electricity to freeze water into around 500,000 pounds of ice. During the day, the ice is used to cool the air being pushed through the building’s vents. Since electricity costs more to produce during peak hours, the system can lower energy bills by as much as 40 percent. The ice battery also drastically reduces the overall amount of energy used to cool the building, which is good news for the grid and the environment as a whole. If more buildings adopt ice batteries in the near future, it could reduce the need for more power plants to be built, even as the climate continues to warm. That’s less land and fewer resources that will have to be devoted to cooling buildings.
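To get a feel for the economics, here is a minimal back-of-the-envelope sketch in Python. The rates, cooling load, and efficiency figures below are illustrative assumptions, not numbers from Trane Technologies or the Eleven Madison system; the point is only that shifting the same cooling work to cheaper off-peak hours can plausibly cut costs on the order of the 40 percent cited above.

```python
# Back-of-the-envelope comparison: daytime chilling vs. an overnight "ice battery."
# All figures below are illustrative assumptions, not measurements from any real building.

COOLING_LOAD_KWH_THERMAL = 10_000  # heat to remove per day, in thermal kWh (assumed)
COP_DAYTIME_CHILLER = 4.0          # cooling delivered per unit of electricity, daytime (assumed)
COP_ICE_MAKING = 3.0               # making ice overnight is somewhat less efficient (assumed)
PEAK_RATE = 0.25                   # $ per kWh during peak daytime hours (assumed)
OFF_PEAK_RATE = 0.10               # $ per kWh overnight (assumed)

# Conventional system: all cooling electricity is bought at peak daytime rates.
daytime_kwh = COOLING_LOAD_KWH_THERMAL / COP_DAYTIME_CHILLER
daytime_cost = daytime_kwh * PEAK_RATE

# Ice battery: the same thermal load is banked as ice overnight at off-peak rates.
ice_kwh = COOLING_LOAD_KWH_THERMAL / COP_ICE_MAKING
ice_cost = ice_kwh * OFF_PEAK_RATE

savings = 1 - ice_cost / daytime_cost
print(f"Conventional daytime cooling: ${daytime_cost:,.0f}/day")
print(f"Overnight ice battery:        ${ice_cost:,.0f}/day")
print(f"Estimated bill savings:       {savings:.0%}")
```

Under these assumed numbers the savings come out to around 47 percent, even though freezing ice is a little less efficient than chilling air directly, because every kilowatt-hour is purchased at the cheaper overnight rate.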
Of course, it still takes quite a bit of electricity to freeze ice, even at night. Research is already underway to see if chilled but unfrozen water might be a viable alternative. If enough buildings and homes are able to use such thermal energy storage systems to replace traditional HVAC systems, the environmental impact would be enormous, even though the new systems aren’t entirely carbon neutral. A step in the right direction is always better than a step back.
[Image description: A piece of clear ice with a jagged edge on top.] Credit & copyright: Dāvis Mosāns from Salaspils, Latvia. Flickr, Wikimedia Commons. This file is made available under the Creative Commons CC0 1.0 Universal Public Domain Dedication.
-
Biology Daily Curio #3095
What's the matter, cat got your head? Burmese pythons and other invasive species have been wreaking havoc in the Florida Everglades for years, but it seems the local wildlife is starting to fight back. Burmese pythons are a particularly big problem in Florida. The snakes have no natural predators once fully grown, and they multiply prolifically. State officials have tried everything to get rid of the reptilian invaders, including declaring open season on the snakes and rewarding hunters for every one they bring in, but it seems that nothing can wipe them out completely. Meanwhile, pythons are capable of eating anything that can fit inside their surprisingly stretchy jaws, including other, native predators like alligators. For years, scientists have been keeping a keen eye on the state’s python population, and part of that includes strapping radio trackers on male pythons during breeding season. The males lead researchers to nests, so that eggs and female pythons can be removed.
Yet, when scientists rolled up to the location of one of these radio-tracked pythons recently, they didn't find a cozy love nest. Instead, they found the snake’s decapitated body, which weighed a whopping 52 pounds. After setting up a trail camera near the corpse, they found the culprit—a common bobcat happily munching away on the remains. This marks the first time that a bobcat has been known to take down a python, and it's all the more shocking considering the python's size. Until now, bobcats had never been known to hunt and eat pythons, though the snakes have been found with bobcat claws still inside them, which led scientists to believe that bobcats were unable to defend themselves against the snakes. On paper, it's obvious why—adult bobcats weigh around 30 to 40 pounds, while Burmese pythons can weigh around 200 pounds. Maybe nature has simply had enough, or maybe this cat was just particularly skilled at punching (or clawing) above its weight.
[Image description: A bobcat in tall grass from the chest up.] Credit & copyright: National Park Service, Asset ID: 8859334f-c426-41db-9049-96e7d5dd5779. Public domain: Full Granting Rights.
-
Mind + Body Daily Curio
Would you like some sandwich with those fries? For anyone enjoying a horseshoe sandwich, it’s a fair question. Invented in Springfield, Illinois, horseshoe sandwiches are a spectacle to behold and a point of Midwestern pride. These open-faced, oversized sandwiches have been around since the 1920s, yet they haven’t spread far beyond the city where they were first concocted.
A horseshoe sandwich is an open-faced sandwich on thick toast, also known as Texas toast. It most commonly features a beef burger patty, though a slice of thick ham is sometimes used instead. On top of the meat is a tall pile of french fries drenched in cheese sauce. Though some modern horseshoe sandwiches use nacho cheese, traditionally the cheese sauce is inspired by Welsh rarebit, a dish of sharp cheddar cheese mixed with mustard, ale, or Worcestershire sauce served on toast.
Welsh rarebit played an important role in the formation of the horseshoe sandwich. Supposedly, in 1928, the swanky Leland Hotel in downtown Springfield, Illinois was trying to attract new customers. Management asked hotel chef Joe Schweska to come up with a new, intriguing menu item. Schweska asked his wife, who had Welsh heritage, what she thought he should put on the menu. She suggested a spin on Welsh rarebit, so Schweska added french fries and a slice of thick-cut ham to the dish. The rest is history.
Except it’s difficult to know if Schweska was truly the first to make the sandwich. Some say that it was a different Leland chef, Steve Tomko, who actually invented the sandwich, since he later went on to serve it at the Red Coach Inn. Other Springfield restaurants soon had their own versions too, with several crediting themselves as the originators. No need to argue—there’s plenty of credit (and fries) to go around.
[Image description: A white plate with a hamburger patty covered in fries and white cheese sauce.] Credit & copyright: Dirtmound, Wikimedia Commons. This work has been released into the public domain by its author, Dirtmound at English Wikipedia. This applies worldwide.
-
World History Daily Curio #3094
What's smooth and shiny enough for jewelry but dangerous enough for battle? Obsidian, of course. The Aztecs used obsidian for everything from necklaces to weapons of war. Now, archaeologists know where and how they sourced much of the volcanic rock. Obsidian is formed in the scorching crucible of volcanoes. As a naturally occurring glass, it is hard, brittle, and comes in a variety of colors depending on its particular mineral composition, though it's usually black. Its most striking quality, though, is that it forms extremely sharp edges when chipped. The Aztecs and other Mesoamerican cultures took advantage of this and created intricate weapons using the glassy rock.
While stone weapons might sound primitive, their production and distribution was anything but. A recent study that looked at almost 800 obsidian pieces from the Aztec capital of Tenochtitlán has revealed that the versatile rock was brought there through an intricate trade network from far away. The researchers behind the study used portable X-ray fluorescence, which can identify the unique chemical composition of a given piece of obsidian, to figure out where each piece came from. Most of the obsidian used by the Aztecs appears to have been sourced from Sierra de Pachuca, a mountain range around 60 miles from their capital and beyond their borders. This implies that the Aztecs were willing to engage in long-distance trade to obtain the precious resource. For the Aztecs and other Mesoamerican cultures, obsidian wasn't just a material for weapons, but also for precious jewelry. Obsidian with green and gold coloration was particularly valued, and was known as "obsidian of the masters." In the hands of expert craftsmen, the dangerous rock could be transformed into delicate pieces worn by high-ranking individuals to show off their status. Obsidian was also used as inlays in sculptures and ceremonial weapons, with some pieces left as offerings to be buried with the dead. At least the dead won't have to worry about accidentally cutting themselves.
[Image description: A piece of black obsidian on a wooden surface.] Credit & copyright: Ziongarage, Wikimedia Commons. This file is made available under the Creative Commons CC0 1.0 Universal Public Domain Dedication.
-
Science Daily Curio #3093
You know things are bad when one natural disaster is just the beginning. A village in Switzerland has been left devastated after a landslide, and it could be the first of many to come. On May 19, the small village of Blatten was evacuated after geologists warned of impending danger. Blatten, home to 300 residents, is located in an alpine valley overlooked by glaciers. According to the geologists, one of those glaciers was coming apart rapidly. Indeed, in just a matter of days, the Birch Glacier completely disintegrated, sending chunks of ice and rock down the valley. Most of the village was destroyed directly by the landslide, and the rest was flooded soon after.
Landslides can happen for all sorts of reasons like heavy rain, snowmelt, and erosion, but this one was caused entirely by the glacier's collapse. In turn, the glacier’s destruction was brought on by climate change, and similar catastrophes may await other alpine communities. In fact, another village, Brienz, was evacuated in 2023 as a precaution, and residents have only been allowed to return on a limited basis. Back in 2017, another village called Bondo was devastated by a similar landslide which claimed eight lives. While most of the residents of Blatten were able to make their way to safety with just one individual unaccounted for, it may be too soon to breathe a sigh of relief. The debris from the landslide could still cause flooding, further devastating the area. Scientists estimate that all of Switzerland's glaciers will disappear by the end of the century, but they're unlikely to go quietly—and that's the optimistic outlook. More and more climate experts are beginning to believe that the glacial thaw will only accelerate in coming years. The term "glacial pace" might need to be redefined.
[Image description: A red train traveling between mountains in Switzerland.] Credit & copyright: Wikimedia Commons, Sikander Iqbal (Siqbal). This work has been released into the public domain by its author, Siqbal, at the English Wikipedia project. This applies worldwide.
-
Science Daily Curio #3092
Its communications are regular, but its location is awfully unusual. A newly discovered cosmic object has astronomers puzzled, but finding out its identity might reveal new insights about the universe. Back in 2022, scientists coined the term long-period transient (LPT) for cosmic objects that emit light pulses on a regular basis. Since then, 10 more LPTs have been discovered, including ASKAP J1832-0911, perhaps the most unusual among them. Recently discovered by a team of astronomers from Curtin University working at the Australian Square Kilometre Array Pathfinder (ASKAP), ASKAP J1832-0911 appears to be emitting both radio waves and X-rays every 44 minutes for two minutes at a time. The team studying the cosmic object discovered this phenomenon by happenstance while using NASA's Chandra X-ray telescope. Unlike ASKAP, which surveys a large swath of the sky at a time, Chandra only looks at a small portion. As luck would have it, Chandra just happened to be pointed at ASKAP J1832-0911 at the right time, when it was emitting its X-rays.
For now, astronomers aren't sure just what this oddity is. According to the team of researchers, the object might be a magnetar, which is the core of a dead star known for its powerful magnetic fields. Another possibility is that it's a white dwarf, or a white dwarf and another type of object paired as a binary star system. Yet, the team admits that even these possibilities don't account for the unusual behavior of ASKAP J1832-0911. As the lead researcher, Andy Wang, put it, "This discovery could indicate a new type of physics or new models of stellar evolution." In space, not knowing something is sometimes more exciting than having all the answers.
[Image description: A starry night sky with a line of dark trees below.] Credit & copyright: tommy haugsveen, Pexels
-
Work Daily Curio #3091
Sharpen your pencils and loosen your wrists—the blue book is back in school. With AI-based apps like ChatGPT allowing less-than-scrupulous students to prompt their way through exams and assignments, old-fashioned blue books (blue notebooks with lined paper that were once popular at colleges) are making a comeback. Most students today have never taken a hand-written exam, in which answers are meticulously jotted down as the clock ticks away. With the advent of word processors and affordable laptops, many institutions have moved their exams to the digital space, allowing students to type their answers much faster than they could scribble on paper. That would have been that, but in recent years AI has become equally accessible, and some educators fear that it’s impacting students’ ability to think for themselves. Now, those educators are going back to the old ways. For the last hundred years or so before the advent of laptops, hand-written exams were largely done on lined, bound paper booklets known as "blue books." Sales of blue books were actually declining until recently, but are now seeing an uptick.
Blue books are thought to have originated at Indiana’s Butler University in the 1920s, and were colored blue after the school’s color. Since then, the blue book format has been replicated by several manufacturers. However, the origins of standardized booklets in exams might date back even further. In the 1800s, Harvard University reportedly had its own booklets, though they weren't blue. Of course, not everyone is a fan of the modern blue book renaissance. Some educators believe that hurriedly scribbled answers made under time constraints don't necessarily represent a student's understanding of a subject. Regardless of their pedagogical value, blue books may be here to stay, at least for a while. Pencils down!
[Image description: A dark blue pencil against a light blue background.] Credit & copyright: Author’s own illustration. Public domain.
-
Mind + Body Daily Curio
That’s a lot of zip for raw fish! Ceviche is one of the world’s best warm-weather dishes, and the perfect food to examine as summer approaches. Made with raw fish, ceviche hails from Peru, where it is considered the national dish and was even mentioned in the country’s first national anthem.
Ceviche is made from raw, chilled fish and shellfish marinated in lemon, lime, or sour orange juice. The juice also contains seasonings like chili, cilantro, and sliced onions. Ceviche is often served on a large lettuce leaf and topped with tomato slices or seaweed. It may be surrounded by boiled potatoes, yucca, chickpeas, or corn. Unlike sushi or sashimi, the fish and shellfish in ceviche taste as if they have been cooked, since the citrus marinade breaks down proteins in the meat.
Ceviche has ancient Peruvian roots. Evidence suggests that the Caral civilization, which existed around 5,000 years ago and is the oldest known civilization in the Americas, ate raw anchovies with various seasonings. Around 2,000 years ago, a group of coastal Peruvians called the Moche used fermented banana passionfruit juice to marinate raw fish. The famed Incan Empire also served raw fish marinated in fermented juices. Modern ceviche didn’t develop until at least the 16th century, when Spanish and Portuguese traders brought onions, lemons, and limes to the region—all of which are used in the modern version of the dish. For some time, ceviche was found mostly in coastal Peruvian towns and cities. As faster means of travel and better refrigeration techniques were developed, however, the dish's popularity surged throughout the entire country. By 1820, ceviche had become so common that it was even mentioned in La Chicha, a song considered to be Peru’s first national anthem.
In 2004, ceviche was declared a Cultural Heritage of Peru. Just four years later, the country’s Ministry of Production designated June 28th as Ceviche Day. It’s celebrated the day before Peru’s annual Fisherman’s Day, honoring those who make the nation’s thriving seafood culture possible. They’re sourcing national pride while being a source of it themselves.
[Image description: A white plate of ceviche surrounded by corn and other veggies.] Credit & copyright: Dtarazona, Wikimedia Commons. The copyright holder of this work has released it into the public domain. This applies worldwide.
-
Humanities Daily Curio #3090
Be careful calling someone a Neanderthal as an insult—you might actually be complimenting them. A team of Spanish archaeologists has announced the discovery of a fingerprint that suggests that Neanderthals were more artistically inclined than previously thought. At around 43,000 years old, the fingerprint in question was left by a Neanderthal on an unassuming granite pebble. The rock was originally discovered in 2022 at the San Lázaro rock shelter near Segovia, and at first, it wasn't clear just what the small, red dot on it was. After consulting geologists, the team found that the red color on the rock came from a pigment made of iron oxide and clay, while police forensics experts confirmed that the mark itself came from the tip of someone's finger.
Although it doesn't look like much at a glance, the fingerprinted rock caught the team's attention for a number of reasons. Firstly, there was nothing else at the site that also had the red pigment on it, suggesting it was placed there deliberately after being sourced from another location. Secondly, the rock vaguely resembles a human face, and the dot just so happens to be where the nose should be. Thus, the archaeologists believe that whoever marked the rock did so to complete the face. It may sound far-fetched that a Neanderthal could make such a deliberate artistic statement, but more and more evidence suggests that they were capable of more artistic and symbolic expression than they used to be given credit for. As much as it may hurt the pride of their successors (Homo sapiens, also known as human beings), the Neanderthals may have beaten us to the punch when it comes to developing culture. Regardless of whether or not the red dot was an intentional creation, it is now officially the oldest human fingerprint ever found. How about a round of applause for the Paleolithic Picasso?
[Image description: A painting of a Neanderthal family by a cave, with a man holding a spear out front.] Credit & copyright: Neanderthal Flintworkers, Le Moustier Cavern, Dordogne, France, Charles Robert Knight (1874–1953). American Museum of Natural History, Public Domain.
-
US History Daily Curio #3089
What happens when you take the "mutually" out of "mutually assured destruction"? The answer, surprisingly, is a problem. The newly announced missile defense system dubbed the "Golden Dome" is drawing comparisons to President Ronald Reagan's Strategic Defense Initiative (SDI). While SDI was similar to the Golden Dome in many ways, the circumstances of its conception gave rise to a distinctly different set of issues.
As far as most Americans in the 1980s were concerned, the Cold War was a conflict without end. The U.S. and the Soviet Union were engaged in a morbid and seemingly inescapable mandate—that of mutually assured destruction (MAD). Both sides were armed with thousands of nuclear weapons ready to strike, set to launch in kind should either party decide to use them. In 1983, President Reagan proposed a way for the U.S. to finally gain the elusive upper hand. The plan was called the Strategic Defense Initiative (SDI), and would have used satellites in space equipped with laser weaponry to shoot down any intercontinental ballistic missiles (ICBM) launched by the Soviet Union.
Critics judged the plan to be infeasible and unrealistic, calling it "Star Wars" after the movie franchise of the same name. Indeed, the technology to make such a defense system didn’t exist yet. Even today, laser weaponry is mostly experimental in nature. Reagan’s plan also had the potential to be a foreign policy disaster. Whereas MAD had made the use of nuclear weapons forbidden by default, by announcing the SDI, the U.S. was signaling that it was essentially ready to take the "mutually" out of MAD. Thus, the very existence of the plan was seen as a sign of aggression, though the infeasible nature of the technology soon eased those concerns. There were also fears that successfully rendering nuclear weapons useless for one side would simply encourage an arms race of another kind. Ultimately, the SDI was scrapped by the 1990s, as the end of the Cold War reduced the incentive to develop it. We did end up getting more Star Wars movies though, so that's something.
[Image description: A blue sky with a single, white cloud.] Credit & copyright: Dinkum, Wikimedia Commons. Creative Commons Zero, Public Domain Dedication.
-
Biology Daily Curio #3088
The birds, they are a-changin’. New research shows that hummingbird feeders are not only helping hummingbirds expand their range, but driving them to evolve as well. Millions of Americans enjoy leaving out feeders full of sugar water for hummingbirds, simply to catch a glimpse of the tiny, colorful creatures. Such feeders became popular after WWII, though they've been around even longer. Homemade feeders and instructions on how to make them existed for decades before a patent was filed for a mass-produced version in 1947. In the western U.S., Anna's hummingbirds (Calypte anna) have been able to greatly expand their range thanks to the charity of their admirers. More specifically, they've been able to go further north, out of their usual Southern California range. Part of their expansion has to do with the eucalyptus trees that were planted throughout California in the 19th century, but the feeders are mostly responsible.
There's also something subtler going on with the SoCal natives thanks to those feeders. Their beaks have been changing over the last few generations, probably to be more efficient at drawing the nectar from feeders as opposed to flowers. According to researchers, Anna's hummingbirds’ beaks have been getting longer and more tapered, showing that the feeders have become more than a supplementary source of sustenance for the birds—they’re now central to their diet. The birds are even prioritizing manmade feeders over flowers in some areas. Researchers believe that hummingbirds have come to prefer them since the feeders are practically inexhaustible sources of “nectar” compared to flowers. Birds may even be competing for who gets to stay at them the longest. Those flitting balls of feathers are ready to throw down for some good sugar water.
[Image description: A blue hummingbird sipping at a red feeder.] Credit & copyright: Someguy1221, Wikimedia Commons. This work has been released into the public domain by its author, Someguy1221. This applies worldwide.
-
Science Daily Curio #3087
Where there's smoke, there's fire, and where there's green, there's bound to be lava. At least, that's what scientists are beginning to believe after looking at satellite images of trees growing near volcanoes. As destructive as volcanic eruptions can be, there's never been a reliable way to predict them. That's a huge problem for the many communities around the world that live near active volcanoes. Sure, not all eruptions are cataclysmic events filled with pyroclastic blasts, but lava is dangerous no matter how you look at it. Until now, scientists have been able to gauge the risk of a volcanic eruption by measuring seismic waves and even the rise of the ground level around a volcano, but such data can't show exactly when the eruption will occur. Yet, there may be hope of accurately forecasting eruptions in the future.
For a long time, scientists have noticed that trees near volcanoes get greener before eruptions. Apparently, as magma builds up under the Earth's crust, it creates pressure that forces carbon dioxide up through the surface, which in turn feeds the trees and helps them grow. It might seem simple enough, then, to measure that increase in carbon dioxide directly, but the amount that seeps up through the ground to jazz up the greenery is too small to detect against the carbon dioxide already in the atmosphere. However, using satellite images provided by NASA's Orbiting Carbon Observatory-2, volcanologists are figuring out how to measure these carbon dioxide changes indirectly by tracking the surrounding vegetation instead. The process still requires more data to get a better understanding of the correlation between volcanoes and changes in the plants around them, and it won't help with volcanoes located in environments without vegetation, but it might one day help protect the 10 percent of the world's population who live near active volcanoes. Until then, may cooler eruptions prevail.
[Image description: A cluster of oak leaves against a green background.] Credit & copyright: W.carter, Wikimedia Commons. This file is made available under the Creative Commons CC0 1.0 Universal Public Domain Dedication. -
FREEMind + Body Daily CurioFree1 CQ
Sweet, gooey, spicy…nice! Cinnamon rolls are beloved throughout much of the world for their unique softness and interesting shape. Although they're not as heavily associated with their country of origin as macarons are with France or cannoli are with Italy, cinnamon rolls were almost certainly invented in Sweden. The country even celebrates its native pastry on a special day each year.
Cinnamon rolls are made from yeast-leavened, enriched dough. This dough adds butter, sugar, and eggs to the usual flour and milk, which helps make it soft and puffy. The dough is then spread out, buttered, sprinkled with cinnamon, sugar, and sometimes toppings like raisins or nuts, and rolled up. After baking, cinnamon rolls are often drizzled with thick icing.
Ancient Romans began using cinnamon from Sri Lanka centuries before it became common in other European countries. Besides food, the Romans used cinnamon in perfumes, religious incense, and medicine. It was likely the Romans who introduced Sweden to cinnamon. The first record of its use there is a 14th-century recipe for mulled beer, but it wasn't long before it made its way into Swedish pastries. By the 17th century, cinnamon was common throughout Europe, and various European desserts called for it, but none were as similar to modern cinnamon rolls as Swedish kanelbullar, or "cinnamon buns." There are some differences, though. Kanelbulle dough usually contains cardamom, for one thing. The buns are also not usually iced, and are instead topped with pearl sugar.
A population boom coupled with a difficult economy caused millions of Swedes to immigrate to the U.S. starting in the early 19th century. They brought their pastries with them, and cinnamon roll hotspots began popping up across the country. The rolls became particularly popular in Philadelphia, where German immigrants made them even sweeter (and gooier) by adding molasses and brown sugar. At some point, probably after World War II, icing became a common staple of American cinnamon rolls, taking the soft pastries' sweetness to a new level. Count on the U.S. to find new ways to add even more sugar to its snacks.
[Image description: A plate of cinnamon rolls with white icing.] Credit & copyright: Alcinoe, Wikimedia Commons. The copyright holder of this work has released it into the public domain. This applies worldwide. -
FREEEngineering Daily Curio #3086Free1 CQ
Nothing's worse than having your fate up in the air when you're up in the air. A passenger aircraft belonging to the German airline Lufthansa recently reached its destination safely despite neither pilot being at the controls for a time, and it was all thanks to the plane's autopilot system. Autopilot is a tremendous help to modern-day pilots, but it wasn't always as reliable as it is today.
Last year, a Lufthansa plane carrying 199 passengers from Frankfurt, Germany, to Seville, Spain, encountered a potential disaster. While the captain was away from the flight deck, the first officer became unconscious. When the captain attempted to return to the cockpit, he found himself locked out without any response from his colleague. Fortunately, the first officer regained consciousness within a matter of minutes, but for a time—however brief—no human pilot was at the controls of the plane. The incident only recently came to light, after an investigation was completed.
In the early days of aviation, such a situation would almost certainly have led to tragedy. Older aircraft required the constant and meticulous attention of their pilots, who had to make minute adjustments to keep their planes aloft. The first autopilot system was invented in the early 1910s by Lawrence Sperry, whose gyroscopic automatic pilot (nicknamed "George") kept planes in balance on its own. The first digital autopilot systems were developed in the 1970s in response to data showing that most crashes occurred due to human error. Today, autopilot systems are usually integrated into a plane's flight management system, and most of the small adjustments are taken care of by onboard computers. Contrary to popular belief, autopilot systems can't fully control a plane over the entire course of its journey. Pilots fully control aircraft during takeoff and landing, which are the most difficult parts of most flights. They also maintain communication with ground crews so that they can change course in case of emergencies, stay clear of other aircraft, and let airports know exactly when they'll be landing. Autopilots mainly maintain a plane's course and altitude, including in emergencies. It's a life-saving invention for sure, but most people would probably still prefer that their pilot be conscious.
[Image description: A blue sky with a single, white cloud.] Credit & copyright: Dinkum, Wikimedia Commons. Creative Commons Zero, Public Domain Dedication. -
FREEBiology Daily Curio #3085Free1 CQ
Orange you glad they've solved this kitty mystery? After years of puzzlement, scientists now know what gives ginger cats their distinct orange coloration, and why so many orange cats are male. Owners of orange cats have long posited that there's something special about them. They tend to have sillier, more laid-back personalities than average cats, and that temperament has earned them devoted followers. Among them is Professor Hiroyuki Sasaki, a geneticist at Kyushu University in Japan who crowdfunded his effort to unravel the secret behind the cats' unique fur.
Scientists knew that there had to be a genetic link between fur color and sex, since orange cats are overwhelmingly male just as calico cats are overwhelmingly female. Sasaki and his colleagues raised around $70,000 to perform their research, mostly from fellow cat lovers. Their efforts paid off, and the culprit was identified: the ARHGAP36 gene, or rather, a mutation on the X chromosome that deleted a section of it in some cats. ARHGAP36 is responsible for pheomelanin, the type of melanin behind red, orange, and yellow pigments in mammals, as opposed to eumelanin, which controls brown to black pigment. In cats with the mutation, the gene goes haywire, producing much more pheomelanin than it normally would. That the mutation occurs on the X chromosome also explains the skewed sex ratio. Since male cats have only one X chromosome, a single copy of the mutation is enough to make them orange. Female cats, meanwhile, need the mutation on both of their X chromosomes to be fully orange; if only one is affected, they end up as calicos.
But there might be more to the gene than meets the eye. ARHGAP36 might also play a role in orange cats' personalities. The gene is involved in other functions in the brain and hormone-producing glands, so it's possible that it produces unique inclinations and behaviors. Now if only we knew why we love cats even when they're indifferent to us.
[Image description: An orange tabby cat lying on gray carpet.] Credit & copyright: Brian Adler, Wikimedia Commons. -
FREEPhysics Daily Curio #3084Free1 CQ
When it comes to the end date of the universe, what's a few orders of magnitude? Scientists at Radboud University in Nijmegen, Netherlands, have found that the universe might end much earlier than expected—but it's still a very, very long way away.
Estimating the universe's remaining days might seem like an unfathomably difficult task, but physicists have come up with a few ways to figure it out. One method involves calculating how long it takes for stars to die. Larger stars collapse in on themselves, cause supernovas, and become black holes. Smaller stars leave behind a nebula when they die, as well as a hot, dense core called a white dwarf. Most stars in the universe will become white dwarfs in about 17 trillion years, but the story doesn't end there. Both white dwarfs and black holes decay over time, and they do so at an astronomically glacial pace. Their decay releases Hawking radiation, named after the late astrophysicist Stephen Hawking, who first predicted the process.
Hawking only ever posited that black holes would decay in this way, but the scientists at Radboud University believe that white dwarfs decay in a similar manner. Because white dwarfs were thought to linger on much longer, it was previously estimated that it would take around 10 to the power of 1,100 years for the last remaining stars to die out for good. However, if they decay similarly to black holes, that number comes way down. That's not to say that it will happen anytime soon. It will still be another 10 to the power of 78 years, or one quinvigintillion years. By then, there certainly won't be anyone left to say, "Lights out!"
[Image description: A starry sky above a line of dark trees.] Credit & copyright: tommy haugsveen, Pexels. -
FREEUS History Daily Curio #3083Free1 CQ
Here’s something from the lost-and-found bin of history. The lost colony of Roanoke is one of the most enduring mysteries in American history, but one self-proclaimed amateur archaeologist now says that he’s solved it. Either way, the story of Roanoke is equal parts intriguing and tragic.
Before Jamestown, the first successful English colony in America, attempts were made to establish a colony on Roanoke Island (located in what is now North Carolina). The colony was meant to serve as England's foothold in the "New World" as it competed against the Spanish, and would have served as a base of operations for English privateers. However, the first attempt in 1585 by Ralph Lane ended in disaster, especially after relations with the nearby Algonquians soured. The second attempt, which began in 1587, lasted just a few months before one of the colonists, John White, had to return to England to raise supplies and funding. White left behind his wife, daughter, and granddaughter, the first English child to be born in America. When he returned three years later, however, White's family was nowhere to be found. Carved into nearby trees was "CROATOAN," referring to the Native American tribe who lived on Hatteras Island. Tragically, dangerous weather kept White from reaching the island, and he was forced to return to an England that had lost interest in the colony.
White died in 1606, never having found his family, but there have been some clues and hoaxes regarding their ultimate fate. Artifacts known as the Dare Stones, inscribed with writing that supposedly tells the story of the survivors, are one example, though their authenticity isn't widely accepted. Archaeologists have found traces of settlements nearby that may have belonged to Roanoke colonists who scattered around the area. Now, Scott Dawson, the president of the Croatoan Archaeological Society, claims to have found remnants of hammerscale—flakes of iron left over from the forging process—on nearby Hatteras Island. Dawson claims that the hammerscale proves that the English colonists who once inhabited Roanoke Island must have fled there, since Native Americans at the time didn't have the means to forge iron. His evidence is compelling, but it might be too late to definitively solve a mystery that happened so long ago. At least by being lost, the settlers of Roanoke will never be forgotten.
[Image description: A map from 1590 showing an area spanning from Cape Fear to Chesapeake Bay, including the area in which the colony of Roanoke stood.] Credit & copyright: Library of Congress, Geography and Map Division. 1590. Public Domain. -
FREEMind + Body Daily CurioFree1 CQ
The pot of gold at the end of the rainbow might actually be a ramekin of crème brûlée! This beautiful, golden-brown dessert is one of France’s most famous dishes. Yet, England and Spain also claim to have invented it.
Crème brûlée is made from custard which is baked in a water bath. The custard itself is made with heavy cream, egg yolks, sugar, and, usually, vanilla. The dessert is served in the same small ramekins in which it is baked, and it's topped with sugar that is caramelized using a blowtorch or broiler. The crust is sometimes doused with liqueur and set on fire during serving to give it a more intense flavor.
While crème brûlée is heavily associated with France (the dish's name means "burnt cream" in French), no one knows exactly where it was first made. In England, custard desserts have been eaten since at least the Middle Ages. In the 17th century, Trinity College, Cambridge began serving a custard dessert with a sugar crust called Trinity cream, with the college's crest burned into the crust. This doesn't necessarily mean that England was the first to invent crème brûlée, since recipes for the French version appeared around the same time as recipes for Trinity cream.
Spain also claims to have invented crème brûlée. Since the Middle Ages, a dish called crema catalana, flavored with lemon or orange zest, has been served throughout the country. Milk is usually used instead of cream, and cinnamon is often added to the sugar crust.
Of course, France is best known as the birthplace of crème brûlée, as one of the oldest written recipes for the dessert can be traced to France in 1691. At the time, the dessert was popular at the Palace of Versailles, and thus gained an elegant reputation. As cookbooks became more common, the dessert made its way from the noble classes to everyday people, and today it’s served in French restaurants all over the world. Its recipe is largely unchanged from the 1691 version. If the sugar crust isn’t broken, don’t fix it!
[Image description: A white ramekin of crème brûlée on a white plate with silverware in the background.] Credit & copyright: Romainbehar, Wikimedia Commons. This file is made available under the Creative Commons CC0 1.0 Universal Public Domain Dedication.