Curio Cabinet
August 7, 2025
-
Biology Nerdy Curio
Whether you’re a human being or a cockroach, pregnancy is draining. According to a paper published in the Journal of Experimental Biology, researchers at the University of Cincinnati have discovered that some species of cockroaches need more sleep when they’re pregnant, just like people. The Pacific beetle-mimic cockroach (Diploptera punctata) is something of an oddball among insects. Instead of laying eggs like most roaches, it gives live birth, but that’s not all. During the three-month gestation period, these roaches feed their young from a broodsac using milk protein. This is called viviparity, and it’s somewhat similar to the way mammals use a placenta to nourish their young during gestation. The similarities to mammals don’t end there: just like human mothers, these roaches require plenty of sleep for healthier gestation and offspring. The need for rest is so important that, according to the research, pregnant D. punctata don’t travel as far in search of food, indicating an aversion to risk-taking behavior. The relationship between sleep and pregnancy complications in humans is poorly understood, which is why the roaches are of such interest to researchers. In humans and mammals in general, sleep disturbances can significantly impact embryo development, and if similar issues affect D. punctata, studying the roaches might give some clues as to why. No matter the species, being a mom is hard work.
[Image description: A close-up photo of a brown cockroach.] Credit & copyright: Junkyardsparkle, Wikimedia Commons. This file is made available under the Creative Commons CC0 1.0 Universal Public Domain Dedication.
-
Mind + Body Daily Curio #3130
There’s a fine line between getting a tan and cooking yourself in the summer sun. Sunburns can happen even when precautions like sunscreen are taken, especially if one forgets to reapply. Luckily, there’s a widely-accepted sunburn remedy: aloe. But does aloe gel (or cuttings directly from the plant itself) actually help soothe or heal sunburns? The answer isn’t as straightforward as it seems.
Using aloe for minor burns seems like a sensible thing to do. After all, aloe is used in a variety of cosmetic products and is touted as good for the skin. Aloe is also chock-full of antioxidants and has anti-inflammatory properties, so it stands to reason that it would help. Yet, in controlled studies, aloe products have actually been found to be no better than a placebo when it comes to treating sunburns. The only thing that aloe has been proven to do is provide temporary relief by acting as a cooling agent—and only for a short while. That might be the best anyone can hope for, though. Burns of all kinds are notoriously difficult to heal via medical interventions. When it comes to sunburns, the two best remedies are time and preventing further damage. Aloe does act as a gentle moisturizer at a time when heavier, petroleum-based products should be avoided, and moisture can help protect the skin’s surface.
If a sunburn is severe and aloe just isn’t cutting it, experts recommend the use of nonsteroidal anti-inflammatory drugs (NSAIDs) like ibuprofen to help with the pain and inflammation, while hydrocortisone cream might be called for in more extreme cases. Ultimately, though, preventative measures like strong sunscreen (reapplied every two hours) and protective clothing are everything when it comes to sunburns. There’s no remedy under the sun that can completely fix it once the damage is done.
[Image description: Several aloe plants, Aloe castanea growing outdoors in a botanical garden.] Credit & copyright: Tangopaso, Wikimedia Commons, Jardin Exotique de Monaco. The copyright holder of this work has released it into the public domain. This applies worldwide.
August 6, 2025
-
Biology Nerdy Curio
Even two dozen limbs can’t help you outrun a pandemic. Since 2013, billions of sea stars, also known as starfish, have died due to a mysterious wasting disease. Now, scientists have finally pinpointed the bacteria responsible for the plague, giving hope that conservation and disease-management programs can save these unique ocean creatures. No species was hit harder by the disease than the sunflower sea star, whose population has decreased by a whopping 90 percent since the plague was first noticed.
Sunflower sea stars come in a variety of colors, from reds, yellows, and oranges, to various shades of purple. Unlike sea stars that resemble a traditional five-pointed star, sunflower sea stars have between 16 and 24 limbs, making them look more like sunflowers. They differ from other sea stars on the inside, too. Their skeletons aren’t solid like most sea stars’. Instead, they’re made of disjointed, bone-like discs, or ossicles. This makes sunflower sea stars extremely flexible, which comes in handy when hiding from predators and while hunting. Though they may not look much like predators, sunflower sea stars are just that. Their arms are covered in eye spots that help them discern light from dark and allow them to locate potential prey. The bottoms of their limbs boast up to 15,000 thin, almost hair-like tube feet, allowing them to crawl across the ocean floor at speeds of up to 3.3 feet per minute. For sea stars, that’s pretty fast! It’s certainly speedy enough to hunt down their favorite prey: other invertebrates. Sunflower sea stars are completely carnivorous, dining on sea urchins, clams, and crustaceans.
These many-armed critters have a large range across the Northeast Pacific Ocean, from the coastal waters of Alaska to Mexico. Unfortunately, in recent years their population has dwindled due to a devastating, worldwide plague of sea star wasting disease. The mysterious illness causes sea stars’ bodies to break out in lesions and completely disintegrate. Since 2013, over five billion sea stars have died from the disease. Now, researchers have finally discovered the cause of the illness: a bacterium called Vibrio pectenicida. With this knowledge, steps can finally be taken to save the sea stars. That might involve breeding sea stars that are immune to the bacteria and then releasing them into the ocean, or feeding wild sea stars probiotics to help them fight the bacteria off. Hopefully, this isn’t the last we see of these sunflowers of the sea.
[Image description: A group of reddish-colored sunflower starfish in shallow water.] Credit & copyright: NPS Digital Asset Management system. Asset ID: 2D7F9806-A3B5-ABD1-9B952DA866AA90E2. Constraints Information: Public domain.
-
Nutrition Daily Curio #3129
A camel can take you on all sorts of desert adventures, including the culinary kind. Somalia is currently embracing camel milk on a massive scale, and while they’re taking the lead in modernizing the dairy camel industry, they’re not the only ones who are interested.
Few countries in the world rely on camels as much as Somalia does. Agriculture makes up the lion’s share of its economy, and the beast that bears much of that burden is the camel. However, the nutritional and economic potential of camel milk was mostly overlooked until 2006, when the first commercial camel dairy operation was established in the country. Since then, camel milk has been rising in popularity. One of the main benefits of camel milk is that it’s much lower in lactose than cow milk, making it ideal for those with lactose intolerance. Camel milk also lacks β-lactoglobulin, an allergen present in cow milk that makes it unsuitable for many allergy sufferers. It also contains more vitamin C, iron, and zinc than cow milk, and yogurt made from camel milk still has plenty of probiotics.
Perhaps the most surprising benefit of camel milk is that it might help manage type 1 diabetes. Communities that consume milk from dromedary camels (camels with one hump) apparently have fewer cases of diabetes, and dromedary milk has been shown to lower blood sugar levels in diabetic lab rats. In humans with type 1 diabetes, camel milk has been shown to promote endogenous insulin secretion, though it’s far from a cure. The only downside? Those who are used to the taste of cow’s milk might have a little trouble with its slightly salty taste, which results from higher sodium levels. Sounds like a small hump to get over for so many benefits.
[Image description: A camel walking in a desert with mountains in the background.] Credit & copyright: Bernard Gagnon, Wikimedia Commons. This file is made available under the Creative Commons CC0 1.0 Universal Public Domain Dedication.
August 5, 2025
-
Music Song Curio
Make way for a new kind of rock and roll! That’s the sentiment that seemed to precede Pink Floyd’s 1967 performance on American Bandstand. The show, which first aired on this day in 1957, had long been associated with old-timey rock-and-roll, from Chubby Checker to The Supremes. Pink Floyd represented a new era for the program with their psychedelic sound…or they might have, had they performed a different single. Instead, they chose “Apples and Oranges,” a single that they also performed during their first U.S. tour that same year. Chosen for its nonaggressive, somewhat tasteful sound (though there is still plenty of psychedelic guitar sliding), the song didn’t represent the band as well as other tracks might have. The cheery tune featured lyrics about a girl shopping at the supermarket—a far cry from Floyd’s later, more famous works about societal conflict. It was written by then-frontman Syd Barrett, who was replaced soon after Pink Floyd appeared on American Bandstand. In the music world (and in television), things change fast!
-
Mind + Body Daily Curio #3128
Isn’t it amazing how fast babies grow? Nash Keen, who set the world record for the most premature baby ever born, recently turned one. Nicknamed “Nash Potato” by parents Mollie and Randall Keen, Nash was born in Iowa on July 5, 2024, 19 weeks early, or about 133 days short of the typical full term of 280 days. At birth, he weighed just 10 ounces and measured 9.5 inches long. To put that into perspective, deliveries are considered premature if they occur before 37 weeks, and around one out of every 10 births in the U.S. is premature. They’re more likely to occur if the mother is 35 or older or has chronic health issues like diabetes or heart disease, but many premature births occur for no known medical reason. Although there have been previous cases with similarly short gestational periods (the previous record was 132 days premature), there was no guarantee that Nash would survive even with the best care.
Although premature births are still inherently dangerous, they are much more survivable now than they were before the invention of the first baby incubators in the late 1800s. The incubators were created by French doctor Stéphane Tarnier, who was inspired by egg incubators he had seen at a zoo in Paris. Later, German-American doctor Martin A. Couney popularized the incubators by displaying them to the public at Coney Island. The machines help regulate babies’ body temperature and keep them in a germ-free environment as their organs and immune systems continue to develop. Even so, premature babies require constant care and monitoring in the hospital, and are at greater risk of complications like respiratory distress, apnea of prematurity (paused breathing), anemia, and other issues that arise from not getting the chance to develop longer in the womb. While most premature babies go on to live healthy lives, the risk of chronic health complications goes up the shorter a gestational period is. People who were born prematurely are more likely to suffer from delayed development, depression, anxiety, ADHD, neurological disorders, dental problems, asthma, and hearing loss. While Nash currently requires supplemental oxygen and hearing aids, he seems to be a fairly healthy one-year-old otherwise. Not too bad for this remarkably small bundle of joy.
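The figures above fit together arithmetically: 19 weeks early is 133 days, which against the 280-day typical term works out to a gestation of roughly 21 weeks. A minimal sketch in Python, using only the numbers stated in the article, to check that math:

```python
# Numbers stated in the article.
WEEKS_EARLY = 19        # how early Nash was born
FULL_TERM_DAYS = 280    # the typical full term, in days

days_early = WEEKS_EARLY * 7                  # 19 weeks in days -> 133
gestation_days = FULL_TERM_DAYS - days_early  # days actually spent in the womb
gestation_weeks = gestation_days / 7          # same figure in weeks

print(days_early)       # 133, matching the "about 133 days short" figure
print(gestation_days)   # 147
print(gestation_weeks)  # 21.0, i.e. born at about 21 weeks' gestation
```

Note that 21 weeks of gestation is well under the 37-week threshold for prematurity mentioned above, which is what makes the record so striking.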
[Image description: A white figurine of a baby in a cradle.] Credit & copyright: Figure of a Baby in a Cradle Holding a Kitten, c.1830–70. The Metropolitan Museum of Art, Gift of Dr. Charles W. Green, 1947. Public Domain.
August 4, 2025
-
Humanities Word Curio
Word of the Day: August 4, 2025
\TAP-uh-stree\ noun
What It Means
A tapestry is a heavy textile characterized by complicated pictorial design...
with Merriam-Webster
-
US History Daily Curio #3127
Even in times of harsh oppression, some people risk everything for freedom. A statue of Robert Smalls, a formerly enslaved man who famously sailed to freedom, will be placed in the South Carolina state capitol soon, making him the first Black American to receive the honor. Robert Smalls was born into slavery on April 5, 1839, in Beaufort, South Carolina. He worked various jobs in town as an enslaved laborer and was working on a ship called the Planter when the Civil War broke out. When the ship was contracted as a transport ship for the Confederate army, Smalls began piloting it, giving him skills and experience that soon proved invaluable.
On May 12, 1862, under cover of night, Smalls and his fellow enslaved crewmen commandeered the Planter and escaped the South with the steamer, his family, and several others. After he gave the ship over to the U.S. Navy, Smalls was officially made captain and used the ship he had stolen from the Confederates to navigate around Charleston Harbor, aiding the Union. After the war, Smalls entered politics, earning a seat in the state House of Representatives and then the state Senate. Eventually, he was elected to serve in the U.S. House of Representatives, where he served five terms as a Congressman from 1875 to 1887. Beyond politics, Smalls also became a successful businessman, and even came to own the property of his former enslaver, Henry McKee. Although much of Smalls’s legacy was obscured for decades due to the increasingly discriminatory politics of the South in the early 20th century, his story could never truly be forgotten, and continued to inspire civil rights activists after his lifetime. Even as Jim Crow laws overtook the South, Smalls continued to fight for the rights of Black Americans until he passed away on February 22, 1915. His statue is set to be 12 feet tall, but even that might not quite do justice to such a larger-than-life figure.
[Image description: The front gate of a large, white house. A plaque on the gate reads “Robert Smalls House” with a description beneath.] Credit & copyright: NPS Image Gallery, asset ID: a78b0caf-517d-4369-8919-81e7228dbfaf. Constraints Information: Public domain: Full Granting Rights.
-
Work Business Curio
The guardians of interest rates at America's central bank chose not to cut interest rates, given the uncertain effects of tariffs and a resilient overall eco...
August 3, 2025
-
Humanities Word Curio
Word of the Day: August 3, 2025
\ih-GREE-juss\ adjective
What It Means
Egregious is a formal word used to describe things that are conspicuou...
with Merriam-Webster
-
Work Business Curio
The Trump administration set August 1 as the deadline for countries to strike new trade agreements with the U.S. Some met the deadline, and others did not. T...
-
Art Appreciation PP&T Curio
Sew beautiful and sew practical. Quilts don’t just make for warm blankets—they can tell a story. Used as a way to pass the time, explore geometric patterns, preserve mementos, or illustrate life events, quilting can be a painstaking process. This often-overlooked art form has existed for centuries all over the world, and every culture has their own unique take on the craft—including the U.S.
A quilt, in its simplest form, is three layers of fabric sewn together to create a blanket or other textile product. Since quilters must sew across the surface of the fabrics to connect the three pieces as one, the result is a pattern formed by the stitch lines themselves. Owing to its broad definition, it’s hard to pinpoint where, exactly, quilting got its start. One of the oldest depictions of quilts dates back to the first Egyptian dynasty, which existed around 5,000 years ago. It’s not a sample of the quilt itself that survived, but an ivory carving depicting a pharaoh clad in a quilted mantle. In the U.S., the earliest quilters were English and Dutch settlers. The quilts they made weren’t just for their beds, but were also hung over windows to insulate their homes. Quilts usually had to be made with whatever materials were available, so quilters frequently reused or repurposed old blankets and scraps of fabric. These early American quilts were meant to be purely utilitarian with little to no artistic intent. That changed, though, as commercially produced fabrics became more commonplace and affordable. What was once done out of necessity became an outlet for creative expression, with early American quilters sewing intricate patterns by hand.
This uniquely American style of quilting came not just from European settlers, but from enslaved Black women. These women were often skilled sewists who made bedding for their enslavers but weren’t always provided bedding of their own. So, they used leftover material that would have otherwise been thrown away to make quilts. For most enslaved women, quilting was one of the only ways they could express a sense of identity and community through a tangible medium, and as quilts passed down through the generations, they also served as a record of their families’ histories. Harriet Powers, a woman born into slavery in 1837, is remembered as one of America’s finest folk artists. Her unique quilts utilized applique to tell biblical stories, and she was even able to exhibit some of her work after the Civil War.
Since quilting was almost exclusively done by women, it turned into a kind of matrilineal tradition for both free and enslaved Americans. The knowledge of how to quilt was passed on from mother to daughter and quilts were treasured family heirlooms. Quilting was also a way for women to form bonds within their respective communities. Even after the advent of sewing machines, most quilts were sewn by hand, which was a time-consuming process. As such, women would often come together in “quilting bees,” where they would sew a blanket or other form of quilt together for a member of the community.
In the U.S., quilting became less common from the 20th century onward, as commercially-produced bedding and other textile goods became easier to find, even in rural places. Throughout the 1970s and 1980s, there was a resurgence of interest in quilting, and it’s mostly done today as a hobby. Quilts are also the subject of newfound academic interest, as they're now rightfully seen as important works of folk art. Though they weren’t exactly intended to be, quilts have become the lasting, historical legacies of the women who made them. What they once sewed together to keep their families warm are now historical artifacts that preserve their voices and serve as proof of their struggles. History isn’t always written; sometimes it’s sewn.
Sew beautiful and sew practical. Quilts don’t just make for warm blankets—they can tell a story. Used as a way to pass the time, explore geometric patterns, preserve mementos, or illustrate life events, quilting can be a painstaking process. This often-overlooked art form has existed for centuries all over the world, and every culture has its own unique take on the craft—including the U.S.
A quilt, in its simplest form, is three layers of fabric sewn together to create a blanket or other textile product. Since quilters must sew across the surface of the fabrics to connect the three pieces as one, the result is a pattern formed by the stitch lines themselves. Owing to its broad definition, it’s hard to pinpoint where, exactly, quilting got its start. One of the oldest depictions of quilts dates back to the first Egyptian dynasty, which existed around 5,000 years ago. It’s not a sample of the quilt itself that survived but an ivory carving depicting a pharaoh clad in a quilted mantle. In the U.S., the earliest quilters were English and Dutch settlers. The quilts they made weren’t just for their beds, but were also hung over windows to insulate their homes. Quilts usually had to be made with whatever materials were available, so quilters frequently reused or repurposed old blankets and scraps of fabric. These early American quilts were meant to be purely utilitarian with little to no artistic intent. That changed, though, as commercially produced fabrics became more commonplace and affordable. What was once done out of necessity became an outlet of creative expression, with early American quilters sewing intricate patterns by hand.
This uniquely American style of quilting came not just from European settlers, but from enslaved Black women. These women were often skilled sewists who made bedding for their enslavers but weren’t always provided bedding of their own. So, they used leftover material that would have otherwise been thrown away to make quilts. For most enslaved women, quilting was one of the only ways they could express a sense of identity and community through a tangible medium, and as quilts passed down through the generations, they also served as a record of their families’ histories. Harriet Powers, a woman born into slavery in 1837, is remembered as one of America’s finest folk artists. Her unique quilts utilized appliqué to tell biblical stories, and she was even able to exhibit some of her work after the Civil War.
Since quilting was almost exclusively done by women, it turned into a kind of matrilineal tradition for both free and enslaved Americans. The knowledge of how to quilt was passed on from mother to daughter and quilts were treasured family heirlooms. Quilting was also a way for women to form bonds within their respective communities. Even after the advent of sewing machines, most quilts were sewn by hand, which was a time-consuming process. As such, women would often come together in “quilting bees,” where they would sew a blanket or other form of quilt together for a member of the community.
In the U.S., quilting became less common from the 20th century onward, as commercially produced bedding and other textile goods became easier to find, even in rural places. Throughout the 1970s and 1980s, there was a resurgence of interest in quilting, and it’s mostly done today as a hobby. Quilts are also the subject of newfound academic interest, as they're now rightfully seen as important works of folk art. Though they weren’t exactly intended to be, quilts have become the lasting, historical legacies of the women who made them. What they once sewed together to keep their families warm are now historical artifacts that preserve their voices and serve as proof of their struggles. History isn’t always written; sometimes it’s sewn.
[Image description: A portion of a quilt from the Metropolitan Museum of Art. It features colorful, eight-pointed stars against a cream-colored background. There is an orange border with cream-colored diamonds.] Credit & copyright: Star of Lemoyne Quilt, Rebecca Davis, c. 1846. Gift of Mrs. Andrew Galbraith Carey, 1980. The Metropolitan Museum of Art. Public Domain.
August 2, 2025
-
Humanities Word Curio
Word of the Day with Merriam-Webster: August 2, 2025
\PAL-imp-sest\ noun
What It Means
Palimpsest in its original use refers to writing material (such as a parchm...
-
Work Business Curio
On Friday, job growth figures from earlier months were revised sharply downward: May's gain was cut from 125,000 to just 19,000, and June's total from 147,00...
-
Sports Sporty Curio
Three’s a crowd, even in baseball. On this day in 1960, the Continental League closed its doors without playing a single game. Yet, it managed to impact baseball in a big way. Former baseball player and manager Branch Rickey, like many in the industry at the time, wanted to expand the major leagues to include more teams. The team owners didn’t, however, for fear of losing talent. Nevertheless, Rickey went ahead with his bold plan of starting a new major league. Called the Continental League, it was announced in 1959 with plans to have its inaugural game in 1961. With a new league on the horizon, owners in the National League and the American League came together to make an offer. If the Continental League would just disband, they would take in four of its teams immediately and add the others gradually. In the end, the American League got the new Washington Senators (the originals moved to Minnesota) and the Los Angeles Angels in 1961, while the Houston Colt .45s (now the Astros) and the New York Mets joined the National League the following season. Thus, while the Continental League came and went without a single game, it still accomplished everything its founder set out to do. What a roundabout way to win at baseball.
August 1, 2025
-
Work Business Curio
From the BBC World Service: President Trump’s long-delayed tariff deadline has finally passed and for countries without a deal, the import taxes are steep — ...
-
Humanities Word Curio
Word of the Day with Merriam-Webster: August 1, 2025
\dih-SOH-shee-ayt\ verb
What It Means
To dissociate is to separate oneself from association or union with som...
-
Mind + Body Daily Curio
It’s a sauce, it’s a dip, it’s a spread, and most importantly, it’s delicious. There’s not a whole lot that pesto can’t do, and it’s been doing it for a long, long time. In fact, some form of this Italian staple has been delighting palates since the rise of ancient Rome.
Pesto is a green paste traditionally made by mixing and grinding seven ingredients together with a mortar and pestle: basil leaves, parmesan cheese, pecorino cheese, extra virgin olive oil, garlic, pine nuts, and salt. It has a light, vegetable-y flavor and can be used as a pasta or pizza sauce, a dip for bread, or a spread on sandwiches.
There is little doubt that pesto originated in what is now Italy. The ancient Roman version of pesto didn’t call for basil and didn’t always include nuts, but it had most of modern pesto's other ingredients, plus vinegar. The paste, which was also made with a mortar and pestle, was called moretum, and a detailed description of it appears in the Appendix Vergiliana by Virgil, a collection of poems published between 70 and 19 B.C.E.
In the Italian region of Liguria, in the city of Genoa, moretum developed into a similar sauce called agliata in the Middle Ages. This version called for walnuts, solidifying nuts as a core component of pesto. Agliata became a staple of Genoan cuisine, and over time herbs like parsley or sage were added to variations of it. Surprisingly, basil didn’t surface as pesto’s main ingredient until the mid-19th century. Once it did, though, basil outperformed the other green herbs and stuck around. Genoa has been celebrated as the birthplace of modern pesto ever since. You could say that their pesto is the best-o.
[Image description: A plate of pasta with spaghetti noodles and pesto sauce.] Credit & copyright: Benoît Prieur (1975–), Wikimedia Commons. This file is made available under the Creative Commons CC0 1.0 Universal Public Domain Dedication.