Curio Cabinet
July 10, 2025
- Work Business Curio (8 min)
From the BBC World Service: Lesotho has declared a national state of disaster that will last for two years. The government says soaring youth unemployment an...
- Science Nerdy Curio
These mountains look cool, but they can be real hotheads. Researchers from the University of Wisconsin–Madison have presented a study at the Goldschmidt Conference in Prague suggesting that dormant volcanoes around the world may become more active as a result of melting glaciers. First, some clarification: there are three main volcano classifications based on their level of activity. “Active” means that the volcano has erupted during the Holocene epoch (the last 11,650 years or so) and has the potential to erupt again in the future. “Extinct” means that, as far as anyone can tell, the volcano is unlikely to ever erupt again (though even extinct volcanoes surprise us from time to time). “Dormant,” on the other hand, means “potentially active”: the volcano is active (the first classification) but just not erupting presently, as opposed to “actively erupting,” which means magma is currently coming out of the ground.
A lot of factors contribute to a volcano’s dormancy, and scientists have found that glaciers are one of them. Researchers tracked volcanic activity by measuring the radioactive decay of argon in crystals formed in magmatic rock. They then compared that to the level of ice cover during the peak of the last ice age. What the data seems to suggest is that the ice cover acted as a lid, inhibiting eruptions. As the ice melted, volcanoes became more active. Currently, there are an estimated 245 dormant volcanoes buried under three miles of ice, and many of them are in Antarctica. Once these begin to erupt due to the reduction in ice cover, it may create a feedback loop as the eruptions themselves further melt the ice. It seems there will be an icy reception before things really heat up.
[Image description: A portion of the Andes mountain range between Chile and Argentina, photographed from far above.] Credit & copyright: Jorge Morales Piderit, Wikimedia Commons. The copyright holder of this work has released it into the public domain. This applies worldwide.
- World History Daily Curio #3114
These are some not-so-fresh kicks. Archaeologists in England have unearthed 2,000-year-old pairs of Roman shoes, and they’re some of the best-preserved footwear from the era. The researchers were working at the Magna Roman Fort in Northumberland, located near another ancient Roman fort called Vindolanda, when they made the discovery. Many famous artifacts have been unearthed at Vindolanda, including wooden writing tablets and around 5,000 pairs of ancient Roman shoes. The Magna site, it seems, is literally following in those footsteps, with 32 shoes found so far preserved in the fort’s “ankle-breaker” trenches. Originally designed to trip and injure attackers, the trenches ended up being a perfect anaerobic environment for preserving the shoes.
Roman shoes were made with hand-stitched leather, and many were closed-toed as opposed to the sandals often portrayed in popular media (in fact, sandals were only worn indoors). The ancient Romans were actually expert shoemakers, and their footwear contributed greatly to their military success. Most Roman soldiers wore caligae, leather boots consisting of an outer shell cut into many strips that allowed them to be laced up tightly. Replaceable iron hobnails on the soles helped the boots last longer and provided traction on soft surfaces. These boots were eventually replaced with completely enclosed ones called calcei, but the caligae have left a greater impression on the perception of Roman culture. That’s probably thanks to Caligula, the infamous Roman emperor whose real name was Gaius. When Gaius was a child, he accompanied his father on campaign in a set of kid-sized legionary gear, including the caligae. The soldiers then started calling him “Caligula,” which means “little boots.” Unfortunate, since he had some big shoes to fill as the third emperor of Rome.
[Image description: A detailed, black-and-white illustration of two elaborately-dressed ancient Roman soldiers looking at one another.] Credit & copyright: The Metropolitan Museum of Art, Two Roman Soldiers, Giovanni Francesco Venturini, 17th century. Bequest of Phyllis Massar, 2011. Public Domain.
July 9, 2025
- Work Business Curio (9 min)
The high court has cleared the way for the Trump administration to plan out mass layoffs across the government. It's somewhat of a confounding decision, thou...
- Humanities Word Curio (1 min)
Word of the Day with Merriam-Webster: July 9, 2025
simulacrum \sim-yuh-LAK-rum\ noun
What It Means: A simulacrum is a superficial likeness of something, usually as an imitati...
- Biology Nerdy Curio
These critters are as American as apple pie, but a whole lot bigger! North American bison, also called buffalo, are the largest land animals in North America and some of the most historically significant. Yet, we almost lost them altogether: overhunted throughout the 19th century, bison numbered fewer than 600 in the U.S. by 1889. Today, their numbers have recovered dramatically, but these gentle giants still have a long way to go.
There are two species of bison: the North American bison and the European bison. North American bison are often called buffalo, but they aren’t actually buffalo at all. Real buffalo, like Cape buffalo and anoa, live in Africa and Asia. However, bison are closely related to buffalo, since both are bovines—members of the subfamily Bovinae within the family Bovidae. As such, they share many attributes with buffalo, including their large size, horns, and hooves, as well as behavioral traits like living in herds. Bison are famous for their fluffy winter coats, which help them survive harsh, blizzardy winters in places like the Northern Great Plains. That’s not to say that bison are sweet and cuddly, though. They are massive, powerful animals; males can stand up to six feet tall and weigh up to 2,000 pounds. Like any wild animal, they can become aggressive if approached, especially during mating and calving season. It’s a fact that tourists sometimes learn the hard way when they don’t obey rules in places like Yellowstone National Park, where the largest bison population in North America roams free.
Bison first appeared in North America during the late Middle Pleistocene epoch, between 195,000 and 135,000 years ago. Before European colonists began settling in North America en masse in the late 15th century, there were around 30 million bison roaming in what is now the United States. Many native tribes relied on bison meat and hides, with some, like the Plains Indians, centering many parts of their lives around the movements of bison herds. However, as colonist aggression toward native tribes increased and native peoples lost control of more and more land, the bison population dwindled. During the American Indian Wars of the 17th, 18th, and 19th centuries, bison were deliberately killed by colonists as a means of harming native peoples and to feed colonial soldiers. By the 1880s, there were as few as 300 bison left in what is now the United States. The species was on the brink of extinction.
Luckily, private organizations and ranchers stepped in to save North American buffalo, keeping herds on private land where they couldn’t be hunted. In 1902, 21 bison from private owners were placed in a designated area at Yellowstone National Park. Eventually, they were reintroduced to the wild, and began breeding with Yellowstone’s existing wild population. In 1905, the American Bison Society started a bison breeding program that also helped spread awareness about the importance of wild bison. Theodore Roosevelt aligned himself closely with the organization and even served as its honorary president for a time. Today, thanks to over a century of conservation efforts, there are roughly 31,000 wild bison in the United States. It’s a far cry from the millions that once roamed here, but it’s a whole lot better than extinction, and that’s no bison hockey!
[Image description: An adult and baby bison standing on a shrubby plain.] Credit & copyright: Anna Weyers Blades/USFWS. Public Domain.
- Mind + Body Daily Curio #3113
It’s not always good to go out with a bang. Heart attacks were once the number one cause of death in the world, but a recent study shows that the tides are changing. In the last half-century or so, the number of heart attacks has been in sharp decline. Consider the following statistic from Stanford Medicine researchers: a person over the age of 65 admitted to a hospital in 1970 had just a 60 percent chance of leaving alive, and the most likely cause of death would have been an acute myocardial infarction, otherwise known as a heart attack. Since then, the numbers have shifted drastically. Heart disease used to account for 41 percent of all deaths in the U.S., but that number is now down to 24 percent. Deaths from heart attacks, specifically, have fallen by an astonishing 90 percent. There are a few reasons for this change, the first being that medical technology has simply advanced, giving doctors better tools with which to help their patients, including better drugs. Another reason is that more people have become health-conscious, eating better, exercising more, and smoking less. Younger Americans are also drinking less alcohol, which might continue to improve the nation’s overall heart health. More people know how to perform CPR now too, and those who don’t can easily look it up within seconds thanks to smartphones. This makes cardiac arrest itself less deadly than it once was. Nowadays, instead of heart attacks, more people are dying from chronic heart conditions. That might not sound like a good thing, but it’s ultimately a positive sign. As the lead author of the study, Sara King, said in a statement, “People now are surviving these acute events, so they have the opportunity to develop these other heart conditions.” Is it really a trade-off if the cost of not dying younger is dying older?
[Image description: A digital illustration of a cartoon heart with a break down the center. The heart is maroon, the background is red.] Credit & copyright: Author-created image. Public domain.
July 8, 2025
- Work Business Curio (8 min)
From the BBC World Service: 14 countries received a letter from the White House saying a pause on tariffs due to expire Wednesday will now be extended to Aug...
- Humanities Word Curio (1 min)
Word of the Day with Merriam-Webster: July 8, 2025
exemplary \ig-ZEM-pluh-ree\ adjective
What It Means: Something described as exemplary is extremely good and deserves to be...
- Music Appreciation Song Curio
This here’s one rootin’, tootin’, high-falootin’ musical. On this day in 1958, the soundtrack for Rodgers and Hammerstein’s musical Oklahoma! was awarded the Recording Industry Association of America’s (RIAA) first-ever Gold Album. That’s not the only way in which Oklahoma! was the first of its kind—it was also the first musical that Rodgers and Hammerstein ever worked on together, and their first major hit. The album’s titular track is, fittingly, one of the best-loved songs of the entire show. Sung mainly by the character Curly McLain (played by Gordon MacRae in the film version), the song celebrates not only a wedding, but the impending statehood of Oklahoma and everything about living there. A true classic musical number, Oklahoma! is big and showy, with the large ensemble cast joining in to sing some of the show’s most iconic and energetic lines. These include “Oooooklahoma, where the wind comes sweeping down the plain” and “...You’re doin’ fine, Oklahoma! Oklahoma, O.K.!” Oklahoma might not be a top travel destination for most people, but as far as Hollywood and the RIAA are concerned, you should think twice before calling it a flyover state.
- Biology Daily Curio #3112
The Earth is teeming with life and, apparently, with “not-life” as well. Scientists have discovered a new type of organism that appears to defy the standard definition of “life.” All living things are organisms, but not all organisms are living. Take viruses, for instance. While viruses are capable of reproducing, they can’t do so on their own. They require a host organism to perform the biological functions necessary to reproduce. Viruses also can’t produce energy on their own or grow, unlike even simple living things, like bacteria. Now, there’s the matter of Sukunaarchaeum mirabile. The organism was discovered by accident by a team of Canadian and Japanese researchers who were looking into the DNA of Citharistes regius, a species of plankton. When they noticed a loop of DNA that didn’t belong to the plankton, they took a closer look and found Sukunaarchaeum. In some ways, this new organism resembles a virus. It can’t grow, produce energy, or reproduce on its own, but it has one distinct feature that sets it apart: it can produce its own ribosomes, messenger RNA, and transfer RNA. That latter part makes it more like a bacterium than a virus.
Then there’s the matter of its genetics. Sukunaarchaeum, it seems, is a genetic lightweight with only 238,000 base pairs of DNA. Compare that to a typical virus, which can range from 735,000 to 2.5 million base pairs, and the low number really stands out. Nearly all of Sukunaarchaeum’s genes are made to work toward the singular goal of replicating the organism. In a way, Sukunaarchaeum appears to be somewhere between a virus and a bacterium in terms of how “alive” it is, indicating that life itself exists on a spectrum. In science, nothing is as simple as it first appears.
July 7, 2025
- Humanities Word Curio (2 min)
Word of the Day with Merriam-Webster: July 7, 2025
procrastinate \pruh-KRASS-tuh-nayt\ verb
What It Means: To procrastinate is to be slow or late about doing something that shou...
- Work Business Curio (9 min)
The House of Representatives could vote as soon as today on President Donald Trump’s big tax and spending bill. Trump says the legislation gets rid of taxes ...
- Art Appreciation Art Curio
If you want to impress your Iron Age friends, you need one of these bad boys. Torcs (also spelled torques) are a kind of rigid necklace or neck ring. They were commonly worn by Celts throughout western Europe from around 1200 BCE to 550 CE, but they weren’t all made from solid gold. The photo above shows a round, golden torc necklace. The body of the torc is twisted into an ornamental design and the ends are rolled to create a rounded point. Torcs were a symbol of wealth and social status amongst the Celts, with the status they conveyed depending on the materials used and the complexity of the design. Torcs could be made of any metal familiar to Iron Age jewelers, including silver, bronze, and copper, and they often featured etched details depicting mythical beings. They also served as a way to safeguard and keep track of wealth: some torcs of solid gold could weigh several pounds, and torcs are often found in Celtic burial sites. The dead don’t speak, but they can still torc.
Gold Neck Ring, Celtic, 6th–4th century BCE, Gold, 7.5 x 7.5 x .5 in. (19 x 19.1 x 1.2 cm.), The Metropolitan Museum of Art, New York City, New York
[Image credit & copyright: The Metropolitan Museum of Art, Purchase, 2005 Benefit Fund, Rogers Fund, Audrey Love Charitable Foundation Gift, and Gifts of J. Pierpont Morgan and George Blumenthal and Fletcher Fund, by exchange, 2005. Public Domain.]
- Astronomy Daily Curio #3111
Don’t hold your breath for moon dust. Long thought to be toxic, moon dust may actually be relatively harmless compared to what’s already here on Earth, according to new research. While the dusty surface of the moon looks beautiful and its name sounds like a whimsical ingredient in a fairy tale potion, it was a thorn in the side of lunar explorers during the Apollo missions. NASA astronauts who traversed the moon’s dusty surface reported symptoms like nasal congestion and sneezing, which they began calling “lunar hay fever.” They also reported that moon dust smelled like burnt gunpowder, and while an unpleasant smell isn’t necessarily bad for one’s health, it couldn’t have been comforting. These symptoms were likely caused by the abrasive nature of moon dust particles, which are never smoothed out by wind or water the way they would be on Earth. The particles are also small, so they’re very hard to keep out of spacesuits and away from equipment. Then there’s the matter of the moon’s low gravity, which allows moon dust to float around for longer than it would on Earth, making it more likely to penetrate a spacesuit’s seals and be inhaled into the lungs. There, like asbestos, the dust can cause tiny cuts that can lead to respiratory problems and even cancer…at least, that’s what everyone thought until recently. Researchers at the University of Technology Sydney (UTS) just published a paper claiming that moon dust might not be so dangerous after all. They believe that the dust will likely cause short-term symptoms without leading to long-term damage. Using simulated moon dust and human lung cells, they found that moon dust was less dangerous than many air pollutants found on Earth. For instance, silica (typically found on construction sites) is much more dangerous, as it can cause silicosis by lingering in the lungs, leading to scarring and lesions. Astronauts headed to the moon in the future can breathe a sigh of relief—but it may be safer to wait until they get there.
[Image description: A moon surrounded by orange-ish hazy clouds against a black sky.] Credit & copyright: Cbaile19, Wikimedia Commons. This file is made available under the Creative Commons CC0 1.0 Universal Public Domain Dedication.
July 6, 2025
- Humanities Word Curio (2 min)
Word of the Day with Merriam-Webster: July 6, 2025
antic \AN-tik\ noun
What It Means: Antic refers to an attention-drawing, often wildly playful or funny act or action. ...
- Work Business Curio (12 min)
The government reported today that 147,000 more people were on payrolls in June compared to May — a stronger outcome than initially forecasted. This data com...
- Humanities PP&T Curio
They say that a dog is man’s best friend, but there’s one thing that can get in the way of that friendship like nothing else. For thousands of years, rabies has claimed countless lives, often transmitted to humans via dogs, raccoons, foxes, and other mammals. For most of that time, there was no way to directly prevent the transmission of rabies, until French scientist Louis Pasteur managed to successfully inoculate someone against the disease on this day in 1885.
Rabies has always been a disease without a cure. Even the ancient Sumerians knew about the deadly disease and how it could be transmitted through a bite from an infected animal. It was a common enough problem that the Babylonians had specific regulations on how the owner of a rabid dog was to compensate a victim’s family in the event of a bite. The disease itself is caused by a virus that is shed in the saliva and causes the infected animal to behave in an agitated or aggressive manner. Symptoms across species remain similar, and when humans are infected, they show signs of agitation, hyperactivity, fever, nausea, confusion, and the same excessive salivation seen in other animals. In advanced stages, victims begin hallucinating and having difficulty swallowing. The latter symptom also leads to a fear of water. Rabies is almost always fatal without intervention. Fortunately, post-exposure prophylaxis against rabies now exists, thanks to the efforts of one scientist.
By the time 9-year-old Joseph Meister was bitten 14 times by a rabid dog in Alsace, French chemist Louis Pasteur was already working with rabid dogs. Pasteur had developed a rabies vaccine, which he was administering to dogs and rabbits. Though it showed promise, it had never been tested on human subjects. It had also never been used on a subject who had already been infected. When Joseph’s mother brought the child to Paris to seek treatment from Pasteur, he and his colleagues didn’t want to administer the vaccine due to its untested nature. That might have been the end for the young Joseph but for Dr. Jacques Joseph Grancher’s intervention. Grancher offered to administer the vaccine to the boy himself, and over the course of 10 days, Joseph received 12 doses. Remarkably, Joseph was cured by the end of the month, proving the vaccine’s efficacy as both a preventative and a treatment. While credit for developing the vaccine goes to Pasteur, Grancher was also recognized for his part in ending the era of rabies as an automatic death sentence. In 1888, Grancher was given the rank of Grand Officer of the Legion of Honor, the highest French honor given at the time to civilians or military personnel.
The rabies vaccine and post-exposure prophylaxis have greatly improved since Pasteur’s time, and they’re no longer as grueling to receive as they once were. Still, rabies remains a dangerous disease. Luckily, cases are few and far between nowadays, with only around ten fatalities a year in North America thanks to decades of wildlife vaccination efforts. Most cases are spread by infected raccoons, foxes, bats, or skunks, as most pet dogs are vaccinated against rabies. In the rare instance that someone is infected and unable to receive post-exposure prophylaxis quickly, the disease is still almost always fatal. Once symptoms start showing, it’s already too late. In a way, rabies still hasn’t been put to Pasteur.
[Image description: A raccoon poking its head out from underneath a large wooden beam.] Credit & copyright: Poivrier, Wikimedia Commons. The copyright holder of this work has released it into the public domain. This applies worldwide.
July 5, 2025
- Work Business Curio (8 min)
From the BBC World Service: President Trump’s so-called One Big Beautiful Bill has squeaked through Congress. It boosts military and border spending and ext...
- Sports Sporty Curio
Speed isn’t everything in baseball, but it sure does count for a lot. Shohei Ohtani just threw the fastest pitch of his career, and it’s hard to believe that even this star pitcher’s triple-digit speed doesn’t make the list of the world’s fastest pitches. No one envies those who have to come up to bat against Ohtani, who appears to be coming back to the pitcher’s mound with a vengeance. He recently pitched for two innings against the Kansas City Royals. In addition to allowing one hit and one walk during those innings, he also unleashed a 101.7 mph fastball against Vinnie Pasquantino. It’s the fastest of Ohtani’s career, and it’s certainly impressive, but the honor for the fastest-ever fastball goes to Aroldis Chapman of the Boston Red Sox at 105.8 mph. In fact, Chapman holds eight spots in the top ten, with the “slowest” of them, at number nine, clocking in at 105.1 mph. The other record holders are Ben Joyce (105.5 mph) in third place and Jordan Hicks (105 mph) in tenth. There’s also been an “arms” race of sorts in the MLB, with the average fastball speed going up from 89 mph to over 94 mph since 2000. Kudos to the pitchers for their speed, and congrats to the batters who can hit those fastballs.
July 4, 2025
- Mind + Body Daily Curio
Happy Fourth of July! This year, we’re highlighting a food that’s as American as apple pie…actually, much more so. Chicken and waffles is a U.S.-born, soul food staple, but exactly where, when, and how it developed is a source of heated debate.
Chicken and waffles is exactly what its name implies: a dish of waffles, usually served with butter and maple syrup, alongside fried chicken. The chicken is dredged in seasoned flour before cooking, and the exact spices used in the dredge vary from recipe to recipe. Black pepper, paprika, garlic powder, and onion powder are all common choices. The exact pieces of chicken served, whether breast meat, wings, or thighs, also vary. Sometimes, honey is substituted for syrup.
The early history of chicken and waffles is shrouded in mystery. Though there’s no doubt that it’s an American dish, there are different stories about exactly how it developed. Some say that it came about in Jazz Age Harlem, when partiers and theater-goers stayed out so late that they craved a combination of breakfast and dinner foods. This story fits with chicken and waffles’ modern designation as soul food, since Harlem was largely segregated during the Jazz Age, and soul food comes from the culinary traditions of Black Americans. Still, others say that the dish was actually made famous by founding father Thomas Jefferson, who popularized waffles after he purchased waffle irons (which were fairly expensive at the time) from Amsterdam in the 1780s. Another story holds that the Pennsylvania Dutch created chicken and waffles based on German traditions.
Though we’ll never know for certain, it’s likely that all three tales are simply parts of a larger story. Dutch colonists brought waffles to what is now the U.S. as early as the 1600s, where they made their way into the new culinary traditions of different groups of European settlers. This included the “Pennsylvania Dutch,” who were actually from Germany, where it was common to eat meat with bread or biscuits to sop up juices. They served waffles with different types of meat, including chicken with a creamy sauce. Thomas Jefferson did, indeed, help to popularize waffles, but it was the enslaved people who cooked for him and other colonists who changed the dish into what it is today. They standardized the use of seasoned, sometimes even spicy, fried chicken served with waffles, pancakes, or biscuits. After the Civil War, chicken and waffles fell out of favor with white Americans, but it was still frequently served in Black-owned restaurants, including well-known establishments in Harlem and in Black communities throughout the South. For centuries, the dish was categorized as Southern soul food. Then, in the 1990s, chicken and waffles had a sudden surge in nationwide popularity, possibly due to the rise of food-centric TV and “foodie” culture. Today, it can be found everywhere from Southern soul food restaurants to swanky brunch cafes in northern states. Its origins were humble, but its delicious reach is undeniable.
[Image description: Chicken wings and a waffle on a white plate with an orange slice.] Credit & copyright: Joost.janssens, Wikimedia Commons. This work has been released into the public domain by its author, Joost.janssens at English Wikipedia. This applies worldwide.