Inflammation as a Cause of Disease

Image Credit: Marco Verch (Flickr)

What is Inflammation? Two Immunologists Explain How the Body Responds to Everything from Stings to Vaccination and Why it Sometimes Goes Wrong

When your body fights off an infection, you develop a fever. If you have arthritis, your joints will hurt. If a bee stings your hand, your hand will swell up and become stiff. These are all manifestations of inflammation occurring in the body.

We are two immunologists who study how the immune system reacts during infections, vaccination and autoimmune diseases where the body starts attacking itself.

This article was republished with permission from The Conversation, a news site dedicated to sharing ideas from academic experts. It represents the research-based findings and thoughts of Prakash Nagarkatti, Professor of Pathology, Microbiology and Immunology, University of South Carolina, and Mitzi Nagarkatti, Professor of Pathology, Microbiology and Immunology, University of South Carolina.

While inflammation is commonly associated with the pain of an injury or the many diseases it can cause, it is an important part of the normal immune response. The problems arise when this normally helpful function overreacts or overstays its welcome.

What is Inflammation?

Generally speaking, the term inflammation refers to all activities of the immune system that occur where the body is trying to fight off potential or real infections, clear toxic molecules or recover from physical injury. There are five classic physical signs of acute inflammation: heat, pain, redness, swelling and loss of function. Low-grade inflammation might not even produce noticeable symptoms, but the underlying cellular process is the same.

Take a bee sting, for example. The immune system is like a military unit with a wide range of tools in its arsenal. After sensing the toxins, bacteria and physical damage from the sting, the immune system deploys various types of immune cells to the site of the sting. These include T cells, B cells, macrophages and neutrophils, among other cells.

The B cells produce antibodies. Those antibodies can kill any bacteria in the wound and neutralize toxins from the sting. Macrophages and neutrophils engulf bacteria and destroy them. T cells don’t produce antibodies, but kill any virus-infected cell to prevent viral spread.

Additionally, these immune cells produce hundreds of types of molecules called cytokines – otherwise known as mediators – that help fight threats and repair harm to the body. But just like in a military attack, inflammation comes with collateral damage.

The mediators that help kill bacteria also kill some healthy cells. Other similar mediating molecules cause blood vessels to leak, leading to accumulation of fluid and influx of more immune cells.

This collateral damage is the reason you develop swelling, redness and pain around a bee sting or after getting a flu shot. Once the immune system clears an infection or foreign invader – whether the toxin in a bee sting or a chemical from the environment – different parts of the inflammatory response take over and help repair the damaged tissue.

After a few days, your body will neutralize the poison from the sting, eliminate any bacteria that got inside and heal any tissue that was harmed.

Asthma is caused by inflammation that leads to swelling and a narrowing of airways in the lungs, as seen in the right cutaway in this image. BruceBlaus/Wikimedia Commons, CC BY-SA

Inflammation as a Cause of Disease

Inflammation is a double-edged sword. It is critical for fighting infections and repairing damaged tissue, but when inflammation occurs for the wrong reasons or becomes chronic, the damage it causes can be harmful.

Allergies, for example, develop when the immune system mistakenly recognizes innocuous substances – like peanuts or pollen – as dangerous. The harm can be minor, like itchy skin, or dangerous if someone’s throat closes up.

Chronic inflammation damages tissues over time and can lead to many noninfectious clinical disorders, including cardiovascular diseases, neurodegenerative disorders, obesity, diabetes and some types of cancers.

The immune system can sometimes mistake one’s own organs and tissues for invaders, leading to inflammation throughout the body or in specific areas. This self-targeted inflammation is what causes the symptoms of autoimmune diseases such as lupus and arthritis.

Another cause of chronic inflammation that researchers like us are currently studying is defects in the mechanisms that curtail inflammation after the body clears an infection.

While inflammation mostly plays out at a cellular level in the body, it is far from a simple mechanism that happens in isolation. Stress, diet and nutrition, as well as genetic and environmental factors, have all been shown to regulate inflammation in some way.

There is still a lot to be learned about what leads to harmful forms of inflammation, but a healthy diet and avoiding stress can go a long way toward helping maintain the delicate balance between a strong immune response and harmful chronic inflammation.

How the Fed’s Balance Sheet Trimming Impacts You

Image: Press conference following November 2022 FOMC meeting – Federal Reserve (Flickr)

Fed Faces Twin Threats of Recession and Financial Crisis as its Inflation Fight Raises Risks of Both

According to a Mississippi State University professor of finance, the Fed’s increases to the overnight rate are only half the reason the economy may be driven into a recession and a financial crisis. The rate hikes get most of the attention, but Professor Blank also explains the Fed’s balance sheet changes and what they could mean for markets, the economy, and the world of finance.

There is wide agreement among economists and market observers that the Federal Reserve’s aggressive interest rate hikes will cause economic growth to grind to a halt, leading to a recession. Less talked about is the risk of a financial crisis as the U.S. central bank simultaneously tries to shrink its massive balance sheet.

As expected, the Fed on Nov. 2, 2022, lifted borrowing costs by 0.75 percentage point – its fourth straight hike of that size, which brings its benchmark rate to as high as 4%.

At the same time as it’s been raising rates, the Fed has been quietly trimming down its balance sheet, which swelled after the COVID-19 pandemic began in 2020. It reached a high of US$9 trillion in April 2022 and has since declined by about $240 billion as the Fed reduces its holdings of Treasury securities and other debt that it bought to avoid an economic meltdown early in the pandemic.

This article was republished with permission from The Conversation, a news site dedicated to sharing ideas from academic experts. It represents the research-based findings and thoughts of D. Brian Blank, Assistant Professor of Finance, Mississippi State University.

As a finance expert, I have been studying financial decisions and markets for over a decade. I’m already seeing signs of distress that could snowball into a financial crisis, compounding the Fed’s woes as it struggles to contain soaring inflation.

Fed Balance Sheet Basics

As part of its mandate, the Federal Reserve maintains a balance sheet, which includes securities, such as bonds, as well as other instruments it uses to pump money into the economy and support financial institutions.

The balance sheet has grown substantially over the last two decades as the Fed began experimenting in 2008 with a policy known as quantitative easing – in essence, printing money – to buy debt to help support financial markets that were in turmoil. The Fed again expanded its balance sheet drastically in 2020 to provide support, or liquidity, to banks and other financial institutions so the financial system didn’t run short on cash. Liquidity refers to the efficiency with which a security can be converted into cash without affecting the price.

But in March 2022, the Fed switched gears. It stopped purchasing new securities and began reducing its holdings of debt in a policy known as quantitative tightening. The current balance is $8.7 trillion, two-thirds of which are Treasury securities issued by the U.S. government.

The result is that there is one less buyer in the $24 trillion Treasury market, one of the largest and most important markets in the world. And that means less liquidity.
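As a rough consistency check, the balance-sheet figures cited above fit together arithmetically. The short sketch below is illustrative only, using the approximate numbers from this article rather than the Fed’s weekly data.

```python
# Rough arithmetic check on the balance-sheet figures cited above.
# All numbers are in trillions of dollars and are approximate.

peak_april_2022 = 9.00   # balance sheet peak, April 2022
runoff_so_far = 0.24     # decline cited since the peak
current_balance = 8.7    # current balance cited in the article

treasury_share = 2 / 3   # "two-thirds of which are Treasury securities"

print(f"Peak minus runoff: {peak_april_2022 - runoff_so_far:.2f} trillion")           # ~8.76, close to 8.7
print(f"Implied Treasury holdings: {current_balance * treasury_share:.1f} trillion")  # ~5.8
```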

Loss of Liquidity

Markets work best when there’s plenty of liquidity. But when it dries up, that’s when financial crises happen, with investors having trouble selling securities or other assets. This can lead to a fire sale of financial assets and plunging prices.

Treasury markets have been unusually volatile this year – resulting in the biggest losses in decades – as prices drop and yields shoot up. This is partly due to the Fed rate hikes, but another factor is the sharp loss of liquidity as the central bank pares its balance sheet. A drop in liquidity increases risks for investors, who then demand higher returns for financial assets. This leads to lower prices.

The loss of liquidity not only adds additional uncertainty into markets but could also destabilize financial markets. For example, the previous quantitative tightening cycle, which ran from 2017 to 2019, led to a crisis in overnight lending markets, which are used by banks and other financial institutions to lend each other money for very short periods.

Given the sheer size of the Treasury market, problems there are likely to leak into virtually every other market in the world. This could start with money market funds, which are held as low-risk investments for individuals. Since these investments are considered risk-free, any possible risk has substantial consequences – as happened in 2008 and 2020.

Other markets are also directly affected since the Fed holds more than just Treasuries. It also holds mortgages, which means its balance sheet reduction could hurt liquidity in that market too. Quantitative tightening also decreases bank reserves in the financial system, which is another way financial stability could be threatened, increasing the risk of a crisis.

The last time the Fed signaled it would pull back this kind of support, it triggered what was known as a “taper tantrum”: debt investors reacted by selling bonds, causing bond yields to rise sharply and forcing the central bank to back off its plans. The long and short of it is that if the Fed continues to reduce its holdings, it could stack a financial crisis on top of a recession, which could lead to unforeseen problems for the U.S. economy – and economies around the globe.

A Two-Front War

For the moment, Fed Chair Jerome Powell has said he believes markets are handling its balance sheet rundown effectively. And on Nov. 2, the Fed said it would continue reducing its balance sheet – to the tune of about $1.1 trillion a year.
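The roughly $1.1 trillion annual pace lines up with the monthly runoff caps the Fed announced for the fall of 2022, about $60 billion in Treasuries and $35 billion in mortgage-backed securities per month. Those cap figures are added here for illustration and are not part of the article above, so treat the following as a back-of-the-envelope check.

```python
# Illustrative check: annualizing the Fed's announced monthly runoff caps
# (assumed here to be ~$60B in Treasuries + ~$35B in mortgage-backed securities).
monthly_treasury_cap = 60e9
monthly_mbs_cap = 35e9

annual_runoff = (monthly_treasury_cap + monthly_mbs_cap) * 12
print(f"Maximum annual runoff: ${annual_runoff / 1e12:.2f} trillion")  # ~$1.14 trillion
```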

Obviously, not everyone agrees, including the U.S. Treasury, which said that the lower liquidity is raising government borrowing costs.

The risks of a major crisis will only grow as the U.S. economy continues to slow as a result of the rate hikes. While the fight against inflation is hard enough, the Fed may soon have a two-front war on its hands.

Blockchain and Web 3 Communities Get More Visibility Into Their Networks

Image Credit: Dejan Krsmanovic (Flickr)

Helping Blockchain Communities Fix Bugs

Zach Winn | MIT News Office

If the crypto enthusiasts are right, the next decade will see billions of people begin using applications built on distributed, user-owned blockchains. The new paradigm has been dubbed Web 3. But Web 3 still has some significant challenges to overcome if it’s going to replace the digital world as we know it.

Blockchain networks, for instance, are going to need an efficient way of detecting and resolving performance problems. Current analytics tools are built for companies to monitor their own websites and apps, so they only need to serve a single operator. In the decentralized world of blockchains, however, the users are the owners, turning the traditional model of maintenance and bug fixes on its head.

The company Metrika, founded by an MIT alumnus, has developed a suite of tools to help the distributed communities of the blockchain world monitor and improve their networks. The company allows users to create alerts, access reports, and view real-time community dashboards that visualize network performance, problems, and trends over time.

“Metrika is a community-based monitoring and collaboration platform,” founder and CEO Nikos Andrikogiannopoulos SM ’06, MBA ’11 says. “We’re making [blockchain network] telemetry a public good for everyone. These applications are holding billions of dollars in assets, so it’s unimaginable that we wouldn’t have service assurance and deep visibility of what is happening in real-time.”

Metrika is currently providing services for popular blockchain protocols including Ethereum, Algorand, Flow, and Solana. The company plans to expand that list as other networks grow in popularity in hopes of enabling the much-hyped shift to Web 3.

“Our vision at Metrika is to become a critical layer of the Web 3 world,” Andrikogiannopoulos says. “Ten years from now, kids will be interacting with assets on their mobile phone. The idea of a bank account will be foreign to them. There will be no corner banks. The whole idea of finance will not go through physical stores and bank accounts — you’ll have assets on every application you use. In that world, where everything is happening on a blockchain, how can Metrika help provide the observability, reliability, and visibility of the blockchain network?”

Bouncing Ideas Off MIT

Andrikogiannopoulos first came to MIT as a graduate student in 2004 and he likes to say he never really left. To this day he lives in Cambridge with his wife, who works at MIT, and returns to campus often.

After earning his second MIT degree, an MBA from the Sloan School of Management, Andrikogiannopoulos began a telecommunications consulting job. During lunch breaks, he’d return to MIT to work with the Venture Mentoring Services (VMS), where entrepreneurs from the MIT community can connect with mentors and receive advice. While kicking around telecommunications startup ideas, a VMS mentor connected him to internet entrepreneur Rubin Gruber, who suggested he explore the blockchain space instead.

It was mid-2018 — what many remember as the “crypto winter” for the lull in blockchain hype and the corresponding crash of crypto prices. But Andrikogiannopoulos began researching the industry and networking with people in the blockchain space, including an MIT alumnus working at the blockchain company Algorand, which was founded by Silvio Micali, the Ford Foundation Professor of Engineering at MIT.

A few months after their initial talk, Andrikogiannopoulos returned to Gruber’s office and told him blockchains were lacking monitoring and operational intelligence.

The problem stems from the decentralized structure of blockchains. Each user operates as a node in the system by creating, receiving, and moving data through their server. When users encounter a problem, they need to figure out if the problem lies within their node or involves the network as a whole.

“They might go on Twitter and Discord and ask other users what they’re experiencing,” Andrikogiannopoulos says. “They’re trying to triangulate the problem, and it takes several hours for them to figure out the issue, coordinate a response, and resolve it.”

To build Metrika, Andrikogiannopoulos set up open-source nodes across the globe that pull data from blockchain nodes and networks, then aggregate those data into easy-to-understand reports and other tools.

“We act as public infrastructure, so users get visibility through dashboards, alerting, and reports, and then we add collaboration tools on top of that,” Andrikogiannopoulos explains.

By 2019, Metrika had begun detecting problems with node performance, staking, network latency, and errors like blocks not being produced at the right rate. Andrikogiannopoulos showed his progress to employees at Algorand, who expressed interest, so he continued building out Metrika’s suite of tools.
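To make that kind of check concrete, the sketch below shows, in simplified form, how a monitor might flag blocks arriving more slowly than a chain’s target rate. It is a hypothetical illustration only; the function, thresholds and data are invented here and do not represent Metrika’s actual code or API.

```python
from statistics import mean

def check_block_production(block_timestamps, expected_interval_s, tolerance=1.5):
    """Return an alert string if blocks are arriving too slowly.

    block_timestamps: Unix timestamps of recently observed blocks, oldest first.
    expected_interval_s: the chain's nominal seconds-per-block.
    tolerance: how many times slower than nominal is allowed before alerting.
    """
    if len(block_timestamps) < 2:
        return "insufficient data"

    intervals = [b - a for a, b in zip(block_timestamps, block_timestamps[1:])]
    avg_interval = mean(intervals)

    if avg_interval > expected_interval_s * tolerance:
        return f"ALERT: avg block interval {avg_interval:.1f}s exceeds {expected_interval_s}s target"
    return f"OK: avg block interval {avg_interval:.1f}s"

# Hypothetical usage: a chain that targets one block every 4 seconds.
print(check_block_production([0, 4, 9, 20, 35], expected_interval_s=4))
```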

“You can see the idea of Metrika bounced across the entire MIT ecosystem,” Andrikogiannopoulos says. “It’s crucial when you start companies that you have these kinds of insight and resource-rich environments like MIT, where you can iterate on your ideas and find team members to join you.”

Enabling Web 3

Blockchains are no longer a niche technology. Around the world, companies in finance and logistics, as well as gamers and other creatives, are adopting the technology.

“The blockchain world up to today has been a large experiment,” Andrikogiannopoulos says. “A lot of this infrastructure just hasn’t been built. But Bitcoin proved this can work outside of the traditional finance world, and Ethereum is bringing it to another level with applications, smart contracts, and by creating essentially a decentralized, smart computer. We think about enabling that world we see coming.”

As Metrika continues building out solutions to monitor blockchains, it also wants to offer services for the many applications being built on top of that infrastructure.

“In the future, if a blockchain transaction doesn’t go through and you’re Goldman Sachs or JP Morgan, you need to know why that transaction didn’t go through and what happened,” Andrikogiannopoulos says. “Or if you’re using an application to play a game or buy assets and the transactions are lagging, you need to understand why the user experience is being impacted. In Web 3 these things are very important because of the scale and the flow of value we’re talking about.”

For Nikos, improving blockchain performance is not just about optimizing networks. It’s also about helping to usher in the world of open finance and open applications that Web 3 promises.

“We’ve reached 17 hours of outage on blockchain networks in some cases, but what’s even more important to me is not the outages themselves, but the infrastructure needed to avoid them as the industry continues maturing,” Nikos says. “These problems can compromise trust as we’re onboarding users into the Web 3 world. Metrika’s mission is to enable a compelling Web 3 ecosystem.”

$1.8 Billion Cancer “Moonshot” includes MCED Development

Image Credit: Karolina Grawbowska (Pexels)

A Blood Test that Screens for Multiple Cancers at Once Promises to Boost Early Detection

Detecting cancer early before it spreads throughout the body can be lifesaving. This is why doctors recommend regular screening for several common cancer types, using a variety of methods. Colonoscopies, for example, screen for colon cancer, while mammograms screen for breast cancer.

While important, getting all these tests done can be logistically challenging, expensive and sometimes uncomfortable for patients. But what if a single blood test could screen for most common cancer types all at once?

This is the promise of multicancer early detection tests, or MCEDs. This year, President Joe Biden identified developing MCED tests as a priority for the Cancer Moonshot, a US$1.8 billion federal effort to reduce the cancer death rate and improve the quality of life of cancer survivors and those living with cancer.

This article was republished with permission from The Conversation, a news site dedicated to sharing ideas from academic experts. It represents the research-based findings and thoughts of Colin Pritchard, Professor of Laboratory Medicine and Pathology, University of Washington.

As a laboratory medicine physician and researcher who develops molecular tests for cancer, I believe MCED tests are likely to transform cancer screening in the near future, particularly if they receive strong federal support to enable rapid innovation.

How MCED Tests Work

All cells in the body, including tumor cells, shed DNA into the bloodstream when they die. MCED tests look for the trace amounts of tumor DNA in the bloodstream. This circulating “cell-free” DNA contains information about what type of tissue it came from and whether it is normal or cancerous.

Testing to look for circulating tumor DNA in the blood is not new. These liquid biopsies – a fancy way of saying blood tests – are already widely used for patients with advanced-stage cancer. Doctors use these blood tests to look for mutations in the tumor DNA that help guide treatment. Because patients with late-stage cancer tend to have a large amount of tumor DNA circulating in the blood, it’s relatively easy to detect the presence of these genetic changes.

MCED tests are different from existing liquid biopsies because they are trying to detect early-stage cancer, when there aren’t that many tumor cells yet. Detecting these cancer cells can be challenging early on because noncancer cells also shed DNA into the bloodstream. Since most of the circulating DNA in the bloodstream comes from noncancer cells, detecting the presence of a few molecules of cancer DNA is like finding a needle in a haystack.

Making things even more difficult, blood cells shed abnormal DNA naturally with aging, and these strands can be confused for circulating cancer DNA. This phenomenon, known as clonal hematopoiesis, confounded early attempts at developing MCED tests, with too many false positive results.

Fortunately, newer tests are able to avoid blood cell interference by focusing on a type of “molecular barcode” embedded in the cancer DNA that identifies the tissue it came from. These barcodes are a result of DNA methylation, naturally existing modifications to the surface of DNA that vary for each type of tissue in the body. For example, lung tissue has a different DNA methylation pattern than breast tissue. Furthermore, cancer cells have abnormal DNA methylation patterns that correlate with cancer type. By cataloging different DNA methylation patterns, MCED tests can focus on the sections of DNA that distinguish between cancerous and normal tissue and pinpoint the cancer’s origin site.

DNA contains molecular patterns that indicate where in the body it came from. (CNX OpenStax/Wikimedia Commons)
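As a highly simplified illustration of the “molecular barcode” idea, the toy sketch below compares a sample’s methylation profile against reference profiles for a few tissue types and reports the closest match. The tissue profiles and genomic regions are made up for illustration; real assays analyze far more methylation sites with statistical classifiers.

```python
import math

# Toy reference methylation profiles: fraction methylated at three illustrative
# genomic regions for each tissue type (all values are made up).
REFERENCE_PROFILES = {
    "lung":   [0.80, 0.10, 0.40],
    "breast": [0.20, 0.70, 0.50],
    "colon":  [0.60, 0.30, 0.90],
}

def closest_tissue(sample_profile):
    """Return the reference tissue whose methylation pattern is nearest the sample."""
    return min(
        REFERENCE_PROFILES,
        key=lambda tissue: math.dist(sample_profile, REFERENCE_PROFILES[tissue]),
    )

# A hypothetical circulating-DNA profile that most resembles lung tissue.
print(closest_tissue([0.75, 0.15, 0.35]))  # -> "lung"
```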

Testing Options

There are currently several MCED tests in development and in clinical trials. No MCED test is currently FDA-approved or recommended by medical societies.

In 2021, the biotech company GRAIL, LLC launched the first commercially available MCED test in the U.S. Its Galleri test claims to detect over 50 different types of cancers. At least two other U.S.-based companies, Exact Sciences and Freenome, and one Chinese company, Singlera Genomics, have tests in development. Some of these tests use different cancer detection methods in addition to circulating tumor DNA, such as looking for cancer-associated proteins in blood.

MCED tests are not yet typically covered by insurance. GRAIL’s Galleri test is currently priced at $949, and the company offers a payment plan for people who have to pay out of pocket. Legislators have introduced a bill in Congress to provide Medicare coverage for MCED tests that obtain FDA approval. It is unusual for Congress to consider legislation devoted to a single lab test, and this highlights both the scale of the medical market for MCED and concerns about disparities in access without coverage for these expensive tests.

How Should MCED Tests be Used?

Figuring out how MCED tests should be implemented in the clinic will take many years. Researchers and clinicians are just beginning to address questions on who should be tested, at what age, and how past medical and family history should be taken into account. Setting guidelines for how doctors will further evaluate positive MCED results is just as important.

There is also concern that MCED tests may result in overdiagnoses of low-risk, asymptomatic cancers better left undetected. This happened with prostate cancer screening. Previously, guidelines recommended that all men ages 55 to 69 regularly get blood tests to determine their levels of PSA, a protein produced by cancerous and noncancerous prostate tissue. But now the recommendation is more nuanced, with screening suggested on an individual basis that takes into account personal preferences.

Another concern is that further testing to confirm positive MCED results will be costly and a burden to the medical system, particularly if a full-body scan is required. The out-of-pocket cost for an MRI, for example, can run up to thousands of dollars. And patients who get a positive MCED result but are unable to confirm the presence of cancer after extensive imaging and other follow-up tests may develop lifelong anxiety about a potentially missed diagnosis and continue to take expensive tests in a fruitless search for a tumor.

Despite these concerns, early clinical studies show promise. A 2020 study of over 10,000 previously undiagnosed women found 26 of 134 women with a positive MCED test were confirmed to have cancer. A 2021 study sponsored by GRAIL found that half of the over 2,800 patients with a known cancer diagnosis had a positive MCED test and only 0.5% of people confirmed to not have cancer had a false positive test. The test performed best for patients with more advanced cancers but did detect about 17% of the patients who had very-early-stage disease.
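Using only the figures reported above, the implied test characteristics can be worked out in a few lines. This is a back-of-the-envelope reading of the cited studies, not the precise values reported in the papers themselves.

```python
# Back-of-the-envelope figures implied by the studies cited above.

# 2020 study: 134 positive MCED results, 26 confirmed cancers.
positive_predictive_value = 26 / 134
print(f"Positive predictive value: {positive_predictive_value:.1%}")  # ~19.4%

# 2021 GRAIL-sponsored study: roughly half of known-cancer patients tested positive,
# and 0.5% of cancer-free participants had a false positive.
sensitivity_overall = 0.5
specificity = 1 - 0.005
sensitivity_early_stage = 0.17
print(f"Overall sensitivity: {sensitivity_overall:.0%}, specificity: {specificity:.1%}")
print(f"Sensitivity for very-early-stage disease: {sensitivity_early_stage:.0%}")
```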

MCED tests may soon revolutionize the way clinicians approach cancer screening. The question is whether the healthcare system is ready for them.

Stem Cell Research is Helping to Understand Reproductive Risks in Space

Image Credit: Karl Schultz

Pregnancy in Space: Studying Stem Cells in Zero Gravity May Determine Whether it’s Safe

Space is a hostile, extreme environment. It’s only a matter of time before ordinary people are exposed to this environment, either by engaging in space tourism or by joining self-sustaining colonies far away from Earth.

To this end, there needs to be a much better understanding of how the environmental dangers of space will affect the biology of our cells, tissues, organs, and cognition. Crucially for future space colonies, we need to know whether we can easily reproduce in environments other than those found on Earth.

The effects of radiation on our cells, producing DNA damage, are well documented. What’s less clear is how lower levels of gravity, what scientists call microgravity, will affect the mechanisms and rhythms taking place within our cells.

Scientists are only just beginning to investigate how activity in our cells might be affected by exposure to microgravity. Crucially, experiments on embryonic stem cells, and models of how embryos develop in their first few weeks in space, will help us determine whether it’s possible for humans to produce offspring in the extraplanetary colonies of the future.

Cosmic Conception

The ability to reproduce in space has been assessed in a few animals, including insects, amphibians, fish, reptiles, birds, and rodents. These studies have found that it’s certainly possible for organisms such as fish, frogs and geckos to produce fertilised eggs during spaceflight that can live and reproduce on Earth.

But the picture is more complicated in mammals. A study of mice, for instance, found that their oestrous cycle, part of the reproductive cycle, was disrupted by exposure to microgravity. Another study found that exposure to microgravity caused negative neurological alterations in rats. Hypothetically, these effects could also be transmitted to subsequent generations.

This likely happens because our cells did not evolve to work in microgravity. They evolved over millions of years on Earth, in its unique gravitational field. Earth’s gravity is part of what anchors and exerts physical force on our tissues, our cells, and our intracellular contents, helping to control specific movements within cells. The study of this is called mechanobiology.

The division of cells and the movement of genes and chromosomes within them, which is crucial to the development of a foetus, also works with and against the force of gravity as we know it on Earth. It follows that systems evolved to work perfectly in Earth’s gravity may be affected when the force of gravity changes.

Fetal Position

When an embryo first starts to divide, in a process called cleavage, the rate of division can be faster at one end of the embryo than the other. Gravity plays a role here, determining the position of the very first building blocks in a human life.

Gravity also helps to establish the correct body plan of a fetus, ensuring the right cells develop in the right places in the right numbers and in the right spatial orientation.

Researchers have investigated whether embryonic stem cells, which are “pluripotent” and can develop into all cells of the body, are affected by microgravity. At present, there is some evidence that when rodent embryonic stem cells are subjected to microgravity, their ability to become the desired cell types may be impacted.

It is also possible to produce pluripotent human stem cells from normal mature cells of our bodies, which are called induced pluripotent stem cells. These have also been studied under microgravity, with experiments on Earth finding that induced stem cells proliferate faster in simulated microgravity. Two batches of these stem cells are currently on the International Space Station to see whether these results can be replicated in space.

If adult stem cells do proliferate faster in space, it could open the door for commercial stem cell manufacturers to produce these cells in orbit, seeing as it’s difficult to culture enough stem cells on Earth to treat degenerative diseases with stem cell therapies.

Gravitational Field

Besides normal cellular processes, it’s also unclear how fertilization, hormone production, lactation, and even birth itself will be affected by exposure to microgravity.

It seems that short-term exposure to microgravity, of perhaps half an hour, will probably not have too much of an effect on our cells. But longer exposures of days or weeks are likely to have an effect. This is not taking into account the effect of radiation on our cells and DNA, but we already know how to protect against radiation.

Scientists are looking at two ways to protect against the adverse effects of microgravity on our biology: intervention at the cellular level, using drugs or nanotechnology, and intervention on the environmental level, by simulating Earth’s gravity in spacecraft or off-world colonies. Both fields of study are in their early stages.

Still, studying stem cells in space provides a valuable window into how pregnancy could work, or not work, when we’re outside Earth’s gravitational field. For now, those fortunate enough to go to space might do well to avoid attempting to conceive before, during or directly after a space flight.

Less Expensive Batteries Don’t Always Come from Cheaper Materials

Image Credit: 24M Technology (MIT News)

Zach Winn | MIT News Office

When it comes to battery innovations, much attention gets paid to potential new chemistries and materials. Often overlooked is the importance of production processes for bringing down costs.

Now the MIT spinout 24M Technologies has simplified lithium-ion battery production with a new design that requires fewer materials and fewer steps to manufacture each cell. The company says the design, which it calls “SemiSolid” for its use of gooey electrodes, reduces production costs by up to 40 percent. The approach also improves the batteries’ energy density, safety, and recyclability.

Judging by industry interest, 24M is onto something. Since coming out of stealth mode in 2015, 24M has licensed its technology to multinational companies including Volkswagen, Fujifilm, Lucas TVS, Axxiva, and Freyr. Those last three companies are planning to build gigafactories (factories with gigawatt-scale annual production capacity) based on 24M’s technology in India, China, Norway, and the United States.

“The SemiSolid platform has been proven at the scale of hundreds of megawatts being produced for residential energy-storage systems. Now we want to prove it at the gigawatt scale,” says 24M CEO Naoki Ota, whose team includes 24M co-founder, chief scientist, and MIT Professor Yet-Ming Chiang.

Establishing large-scale production lines is only the first phase of 24M’s plan. Another key draw of its battery design is that it can work with different combinations of lithium-ion chemistries. That means 24M’s partners can incorporate better-performing materials down the line without substantially changing manufacturing processes.

The kind of quick, large-scale production of next-generation batteries that 24M hopes to enable could have a dramatic impact on battery adoption across society — from the cost and performance of electric cars to the ability of renewable energy to replace fossil fuels.

“This is a platform technology,” Ota says. “We’re not just a low-cost and high-reliability operator. That’s what we are today, but we can also be competitive with next-generation chemistry. We can use any chemistry in the market without customers changing their supply chains. Other startups are trying to address that issue tomorrow, not today. Our tech can address the issue today and tomorrow.”

A Simplified Design

Chiang, who is MIT’s Kyocera Professor of Materials Science and Engineering, got his first glimpse into large-scale battery production after co-founding another battery company, A123 Systems, in 2001. As that company was preparing to go public in the late 2000s, Chiang began wondering if he could design a battery that would be easier to manufacture.

“I got this window into what battery manufacturing looked like, and what struck me was that even though we pulled it off, it was an incredibly complicated manufacturing process,” Chiang says. “It derived from magnetic tape manufacturing that was adapted to batteries in the late 1980s.”

In his lab at MIT, where he’s been a professor since 1985, Chiang started from scratch with a new kind of device he called a “semi-solid flow battery” that pumps liquids carrying particle-based electrodes to and from tanks to store a charge.

In 2010, Chiang partnered with W. Craig Carter, who is MIT’s POSCO Professor of Materials Science and Engineering, and the two professors supervised a student, Mihai Duduta ’11, who explored flow batteries for his undergraduate thesis. Within a month, Duduta had developed a prototype in Chiang’s lab, and 24M was born. (Duduta was the company’s first hire.)

But even as 24M worked with MIT’s Technology Licensing Office (TLO) to commercialize research done in Chiang’s lab, people in the company including Duduta began rethinking the flow battery concept. An internal cost analysis by Carter, who consulted for 24M for several years, ultimately led the researchers to change direction.

That left the company with loads of the gooey slurry that made up the electrodes in their flow batteries. A few weeks after Carter’s cost analysis, Duduta, then a senior research scientist at 24M, decided to start using the slurry to assemble batteries by hand, mixing the gooey electrodes directly into the electrolyte. The idea caught on.

The main components of batteries are the positively and negatively charged electrodes and the electrolyte material that allows ions to flow between them. Traditional lithium-ion batteries use solid electrodes separated from the electrolyte by layers of inert plastics and metals, which hold the electrodes in place.

Stripping away the inert materials of traditional batteries and embracing the gooey electrode mix gives 24M’s design a number of advantages.

For one, it eliminates the energy-intensive process of drying and solidifying the electrodes in traditional lithium-ion production. The company says it also reduces the need for more than 80 percent of the inactive materials in traditional batteries, including expensive ones like copper and aluminum. The design also requires no binder and features extra thick electrodes, improving the energy density of the batteries.
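A toy calculation helps show why stripping out inactive layers and thickening the electrodes raises energy density: the share of the cell occupied by active material grows. The layer thicknesses below are invented for illustration and are not 24M’s actual cell dimensions.

```python
def active_fraction(electrode_um, inactive_um):
    """Fraction of a repeating cell layer occupied by active electrode material."""
    return electrode_um / (electrode_um + inactive_um)

# Hypothetical numbers: a conventional cell with ~70 um electrodes and ~30 um of
# current collector/separator per repeat, versus a cell with doubled electrode
# thickness and the same inactive layers.
conventional = active_fraction(70, 30)
thick_electrode = active_fraction(140, 30)

print(f"Conventional active fraction:    {conventional:.0%}")        # 70%
print(f"Thick-electrode active fraction: {thick_electrode:.0%}")     # ~82%
print(f"Relative gain: {thick_electrode / conventional - 1:.0%}")    # ~18%
```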

“When you start a company, the smart thing to do is to revisit all of your assumptions and ask what is the best way to accomplish your objectives, which in our case was simply-manufactured, low-cost batteries,” Chiang says. “We decided our real value was in making a lithium-ion suspension that was electrochemically active from the beginning, with electrolyte in it, and you just use the electrolyte as the processing solvent.”

In 2017, 24M participated in the MIT Industrial Liaison Program’s STEX25 Startup Accelerator, in which Chiang and collaborators made critical industry connections that would help it secure early partnerships. 24M has also collaborated with MIT researchers on projects funded by the Department of Energy.

Enabling the Battery Revolution

Most of 24M’s partners are eyeing the rapidly growing electric vehicle (EV) market for their batteries, and the founders believe their technology will accelerate EV adoption. (Battery costs make up 30 to 40 percent of the price of EVs, according to the Institute for Energy Research).
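Combining the cost figures mentioned in this article, the up-to-40-percent production-cost reduction claimed earlier and the 30-to-40-percent battery share of an EV’s price, gives a rough sense of the potential impact on sticker prices, under the simplifying assumption that all savings pass through to the buyer.

```python
# Rough illustration: how a battery production-cost cut could affect EV prices,
# assuming the savings pass through fully to the vehicle price.
battery_share_of_price = (0.30, 0.40)   # battery as a fraction of EV price
production_cost_cut = 0.40              # claimed cost reduction, upper bound

for share in battery_share_of_price:
    vehicle_price_reduction = share * production_cost_cut
    print(f"Battery share {share:.0%} -> vehicle price could fall ~{vehicle_price_reduction:.0%}")
# Prints roughly 12% and 16%.
```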

“Lithium-ion batteries have made huge improvements over the years, but even Elon Musk says we need some breakthrough technology,” Ota says, referring to the CEO of EV firm Tesla. “To make EVs more common, we need a production cost breakthrough; we can’t just rely on cost reduction through scaling because we already make a lot of batteries today.”

24M is also working to prove out new battery chemistries that its partners could quickly incorporate into their gigafactories. In January of this year, 24M received a grant from the Department of Energy’s ARPA-E program to develop and scale a high-energy-density battery that uses a lithium metal anode and semi-solid cathode for use in electric aviation.

That project is one of many around the world designed to validate new lithium-ion battery chemistries that could enable a long-sought battery revolution. As 24M continues to foster the creation of large-scale, global production lines, the team believes it is well-positioned to turn lab innovations into ubiquitous, world-changing products.

“This technology is a platform, and our vision is to be like Google’s Android [operating system], where other people can build things on our platform,” Ota says. “We want to do that but with hardware. That’s why we’re licensing the technology. Our partners can use the same production lines to get the benefits of new chemistries and approaches. This platform gives everyone more options.”

Reprinted with permission of MIT News (http://news.mit.edu/)

Your Genome is Partially Built by Ancient Viruses

Image: Plum Island (USDA – Public Domain)

Humans are 8% Virus – How the Ancient Viral DNA in Your Genome Plays a Role in Human Disease and Development

HERVs, or human endogenous retroviruses, make up around 8% of the human genome, left behind as a result of infections that humanity’s primate ancestors suffered millions of years ago. They became part of the human genome due to how they replicate.

Like modern HIV, these ancient retroviruses had to insert their genetic material into their host’s genome to replicate. Usually this kind of viral genetic material isn’t passed down from generation to generation. But some ancient retroviruses gained the ability to infect germ cells, such as egg or sperm, that do pass their DNA down to future generations. By targeting germ cells, these retroviruses became incorporated into human ancestral genomes over the course of millions of years and may have implications for how researchers screen and test for diseases today.

Active Viral Genes in the Human Genome

Viruses insert their genomes into their hosts in the form of a provirus. There are around 30 different kinds of human endogenous retroviruses in people today, amounting to over 60,000 proviruses in the human genome. They demonstrate the long history of the many pandemics humanity has been subjected to over the course of evolution. Scientists think these viruses once widely infected the population, since they have become fixed in not only the human genome but also in chimpanzee, gorilla and other primate genomes.

This article was republished with permission from The Conversation, a news site dedicated to sharing ideas from academic experts. It represents the research-based findings and thoughts of Aidan Burn, PhD Candidate in Genetics, Tufts University.

Research from our lab and others has demonstrated that HERV genes are active in diseased tissue, such as tumors, as well as during human embryonic development. But how active HERV genes are in healthy tissue was still largely unknown.

To answer this question, our lab decided to focus on one group of HERVs known as HML-2. This group is the most recently active of the HERVs, having gone extinct less than 5 million years ago. Even now, some of its proviruses within the human genome still retain the ability to make viral proteins.

We examined the genetic material in a database containing over 14,000 donated tissue samples from all across the body. We looked for sequences that matched each HML-2 provirus in the genome and found 37 different HML-2 proviruses that were still active. All 54 tissue types we analyzed had some evidence of activity of one or more of these proviruses. Furthermore, each tissue type also contained genetic material from at least one provirus that could still produce viral proteins.
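The sketch below is a toy version of the kind of tally described above: given expression measurements for each provirus across tissue types, it counts which proviruses show activity anywhere and which tissues show any activity. The provirus names, values and threshold are invented for illustration and do not reflect the authors’ actual pipeline or data.

```python
# Toy illustration (not the authors' actual analysis) of tallying provirus activity.
# Made-up expression values (e.g., normalized read counts) per provirus per tissue.
expression = {
    "HML-2_example_A": {"brain": 0.0, "testis": 5.2, "skin": 0.3},
    "HML-2_example_B": {"brain": 1.1, "testis": 0.0, "skin": 0.0},
    "HML-2_example_C": {"brain": 0.0, "testis": 0.0, "skin": 0.0},
}
ACTIVITY_THRESHOLD = 0.5  # arbitrary cutoff for calling a provirus "active"

active_proviruses = {p for p, tissues in expression.items()
                     if any(v > ACTIVITY_THRESHOLD for v in tissues.values())}
tissues_with_activity = {t for tissues in expression.values()
                         for t, v in tissues.items() if v > ACTIVITY_THRESHOLD}

print("Active proviruses:", sorted(active_proviruses))          # examples A and B
print("Tissues with activity:", sorted(tissues_with_activity))  # brain, testis
```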

The Role of HERVs in Human Health and Disease

The fact that thousands of pieces of ancient viruses still exist in the human genome and can even create protein has drawn a considerable amount of attention from researchers, particularly since related viruses still active today can cause breast cancer and AIDS-like disease in animals.

Whether the genetic remnants of human endogenous retroviruses can cause disease in people is still under study. Researchers have spotted virus-like particles from HML-2 in cancer cells, and the presence of HERV genetic material in diseased tissue has been associated with conditions such as Lou Gehrig’s disease, or amyotrophic lateral sclerosis, as well as multiple sclerosis and even schizophrenia.

Our study adds a new angle to this data by showing that HERV genes are present even in healthy tissue. This means that the presence of HERV RNA may not be enough to connect the virus to a disease.

Importantly, it also means that HERV genes or proteins may not be good targets for drugs. HERVs have been explored as a target for a number of potential drugs, including antiretroviral medication, antibodies for breast cancer and T-cell therapies for melanoma. Treatments using HERV genes as a cancer biomarker will also need to take into account their activity in healthy tissue.

On the other hand, our research also suggests that HERVs could even be beneficial to people. The most famous HERV embedded in human and animal genomes, syncytin, is a gene derived from an ancient retrovirus that plays an important role in the formation of the placenta. Pregnancy in all mammals is dependent on the virus-derived protein coded in this gene.

Similarly, mice, cats and sheep also found a way to use endogenous retroviruses to protect themselves against the original ancient virus that created them. While these embedded viral genes are unable to use their host’s machinery to create a full virus, enough of their damaged pieces circulate in the body to interfere with the replication cycle of their ancestral virus if the host encounters it. Scientists theorize that one HERV may have played this protective role in people millions of years ago. Our study highlights a few more HERVs that could have been claimed or co-opted by the human body much more recently for this same purpose.

Unknowns Remain

Our research reveals a level of HERV activity in the human body that was previously unknown, raising as many questions as it answered.

There is still much to learn about the ancient viruses that linger in the human genome, including whether their presence is beneficial and what mechanism drives their activity. Seeing if any of these genes are actually made into proteins will also be important.

Answering these questions could reveal previously unknown functions for these ancient viral genes and help researchers better understand how the human body functions and evolves alongside these vestiges of ancient pandemics.

New Home Size as a Leading Indicator for Recession

Image Credit: Tannert11 (Flickr)

Housing Is Getting Less Affordable. Governments Are Making It Worse

The average square footage in new single-family houses has been declining since 2015. House sizes tend to fall only during recessionary periods. It happened from 2008 to 2009, from 2001 to 2002, and from 1990 to 1991.

But even with strong economic-growth numbers well into 2019, it looks like demand for houses of historically large size may have finally peaked even before the 2020 recession and our current economic malaise.  (Square footage in new multifamily construction has also increased.)

According to Census Bureau data, the average size of new houses in 2021 was 2,480 square feet. That’s down 7 percent from the 2015 peak of 2,687.

2015’s average, by the way, was an all-time high and represented decades of near-relentless growth in house sizes in the United States since the Second World War. Indeed, in the 42 years from 1973 to 2015, the average size of new houses increased by 62 percent from 1,660 to 2,687 square feet. At the same time, the quality of housing also increased substantially in everything from insulation, to roofing materials, to windows, and to the size and availability of garages.

Meanwhile, the size of American households during this period decreased 16 percent from 3.01 to 2.51 people.

Yet, even with that 7 percent decline in house size since 2015, the average new home in America as of 2021 was still well over 50 percent larger than the average new home of the 1960s. Home size isn’t exactly falling off a cliff. US homes, on a square-foot-per-person basis, remain quite large by historical standards. Since 1973, square footage per person in new houses has nearly doubled, rising from 503 square feet per person in 1973 to 988 square feet per person in 2021. By this measure, new house size actually increased from 2020 to 2021.
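The percentages quoted in the last few paragraphs can be checked directly against the Census figures given there; the quick arithmetic below uses only numbers already cited in this article.

```python
# Quick arithmetic check using the figures cited above.
size_2021, size_2015, size_1973 = 2480, 2687, 1660
household_1973, household_recent = 3.01, 2.51
sqft_per_person_1973, sqft_per_person_2021 = 503, 988

print(f"Decline since 2015 peak: {1 - size_2021 / size_2015:.1%}")              # ~7.7%
print(f"Growth 1973-2015:        {size_2015 / size_1973 - 1:.1%}")              # ~61.9%
print(f"Household size decline:  {1 - household_recent / household_1973:.1%}")  # ~16.6%
print(f"Per-person growth:       {sqft_per_person_2021 / sqft_per_person_1973:.2f}x")  # ~1.96x
```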

This continued drive upward in new home size can be attributed in part to the persistence of easy money over the past decade. Even as homes continued to stay big—and thus stay comparatively expensive—it was not difficult to find buyers for them. Mortgage rates that kept falling to historic lows, below even 3 percent in many cases, meant buyers could simply borrow more money to buy big houses.

But we may have finally hit the wall on home size. In recent months we’re finally starting to see evidence of falling home sales and falling home prices. It’s only now, with mortgage rates surging, inflation soaring, and real wages falling—and thus home price affordability falling—that there are good reasons for builders to think “wow, maybe we need to build some smaller, less costly homes.” There are many reasons to think that they won’t, and that for-purchase homes will simply become less affordable. But it’s not the fault of the builders.

This wouldn’t be a problem in a mostly-free market in which builders could easily adjust their products to meet the market where it’s at. In a flexible and generally free market, builders would flock to build homes at a price level at which a large segment of the population could afford to buy those houses.  But that’s not the sort of economy we live in. Rather, real estate and housing development are highly regulated industries at both the federal level and at the local level. Thanks to this, it is becoming more and more difficult for builders to build smaller houses at a time when millions of potential first-time home buyers would gladly snatch them up.

How Government Policy Led to a Codification of Larger, More Expensive Houses

In recent decades, local governments have continued to ratchet up mandates as to how many units can be built per acre, and what size those new houses can be. As The Washington Post reported in 2019, various government regulations and fees, such as “impact fees,” which are the same regardless of the size of the unit, “incentivize developers to build big.” The Post continues, “if zoning allows no more than two units per acre, the incentive will be to build the biggest, most expensive units possible.”
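A simple numerical example shows why a flat, size-independent fee tilts builders toward bigger houses: the same fee eats a far larger share of a small home’s price. The fee and home prices below are hypothetical and chosen only for illustration.

```python
# Hypothetical illustration of why flat impact fees favor larger homes:
# the same fee is a much bigger share of a small home's price (prices are made up).
impact_fee = 30_000

for label, price in [("small starter home", 250_000), ("large luxury home", 800_000)]:
    print(f"{label}: fee is {impact_fee / price:.0%} of the sale price")
# Prints roughly 12% versus 4%.
```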

Moreover, community groups opposed to anything that sounds like “density” or “upzoning” will use the power of local governments to crush developer attempts to build more affordable housing. However, as The Post notes, at least one developer has found “where his firm has been able to encourage cities to allow smaller buildings the demand has been strong. For those building small, demand doesn’t seem to be an issue.”

Similarly, in an article last month at The New York Times, Emily Badger notes the central role of government regulations in keeping houses big and ultimately increasingly unaffordable. She writes how in recent decades,

“Land grew more expensive. But communities didn’t respond by allowing housing on smaller pieces of it. They broadly did the opposite, ratcheting up rules that ensured builders couldn’t construct smaller, more affordable homes. They required pricier materials and minimum home sizes. They wanted architectural flourishes, not flat facades. …”

It is true that in many places empty land has increased in price, but in areas where the regulatory burden is relatively low—such as Houston—builders have nonetheless responded with more building of housing such as townhouses.

In many places, however, regulations continue to push up the prices of homes.

Badger notes that in Portland, Oregon, for example, “Permits add $40,000-$50,000. Removing a fir tree 36 inches in diameter costs another $16,000 in fees.” A lack of small “starter homes” is not due to an unwillingness on the part of builders. Governments have simply made smaller homes unprofitable.

“You’ve basically regulated me out of anything remotely on the affordable side,” said Justin Wood, the owner of Fish Construction NW.

In Savannah, Ga., Jerry Konter began building three-bed, two-bath, 1,350-square-foot homes in 1977 for $36,500. But he moved upmarket as costs and design mandates pushed him there.

“It’s not that I don’t want to build entry-level homes,” said Mr. Konter, the chairman of the National Association of Home Builders. “It’s that I can’t produce one that I can make a profit on and sell to that potential purchaser.”

Those familiar with how local governments zone land and set building standards will not be surprised by this. Local governments, pressured by local homeowners, will intervene to keep lot sizes large, and to pass ordinances that keep out housing that might be seen by voters as “too dense” or “too cheap-looking.”

Yet, as much as existing homeowners and city planners would love to see nothing but upper middle-class housing with three-car garages along every street, the fact is that not everyone can afford this sort of housing. But that doesn’t mean people in the middle can only afford a shack in a shanty town either — so long as governments will allow more basic housing to be built.

But there are few signs of many local governments relenting on their exclusionary housing policies, and the result has been an ossified housing policy designed to reinforce existing housing patterns while denying new types of housing that are perhaps more suitable to smaller households and a more stagnant economic environment.

Eventually, though, something has to give. Either governments persist indefinitely with restrictions on “undesirable” housing — which means housing costs skyrocket — or local governments finally start to allow builders to build housing more appropriate to the needs of the middle class.

If current trends continue, we may finally see real pressure to get local governments to allow more building of more affordable single-family homes, or duplexes, or townhouses. If interest rates continue to march upward, this need will become only more urgent. Moreover, as homebuilding materials continue to become more expensive thanks to 40-year highs in inflation—thanks to the Federal Reserve—there will be even more need to find ways to cut regulatory costs in other areas.

For now, the results have been spotty. But where developers are allowed to actually build for a middle-class clientele, it looks like there’s plenty of demand.

About the Author

Ryan McMaken (@ryanmcmaken) is a senior editor at the Mises Institute. Ryan has a bachelor’s degree in economics and a master’s degree in public policy and international relations from the University of Colorado. He is the author of Breaking Away: The Case for Secession, Radical Decentralization, and Smaller Polities (forthcoming) and Commie Cowboys: The Bourgeoisie and the Nation-State in the Western Genre. He was a housing economist for the State of Colorado. 

Ready or Not, Here Come CBDCs

Image Source: usfunds.com

Central Bank Digital Currencies May Be Inevitable, And That’s a Problem

Readers of a certain age will remember Carnac the Magnificent, Johnny Carson’s recurring alter ego. As Carnac, the late-night host would list off three seemingly unrelated words, all of which answered a question that was sealed in an envelope that he held to his forehead.

Today we’re going to play the same game, with the answers being PayPal, Kanye (or Ye, as he’s now known) and central bank digital currencies (CBDCs). And the question: What are the consequences of financial hyper-centralization?

Some of you will make the connections immediately. For everyone else, let me explain.

PayPal, the financial technology (fintech) firm cofounded over 20 years ago by Peter Thiel, Elon Musk and others, was roundly criticized last week after an update to its terms of service showed that the company would fine users $2,500 for, among other things, spreading “misinformation.” A PayPal spokesperson was quick to walk back the update, even claiming that the language “was never intended to be inserted in our policy,” but the damage was done. #DeletePayPal started trending on Twitter, and the company’s stock tanked nearly 12%.

As for Ye, he and his apparel brand Yeezy were reportedly dropped last week by JPMorgan Chase. In a letter widely shared on social media, JPMorgan says Ye has until November 21 to move his business finances elsewhere.

No reason was given by the bank to cut ties with the billionaire rapper, but it’s easy to surmise that Ye was targeted for his political beliefs and outspokenness. I don’t agree with everything he says, nor should you. He’s a controversial figure, and his comments are often erratic and designed to get a rise out of his critics. I’m not sure, though, that this should have anything to do with his access to banking services.

The two cases of PayPal and Ye represent what I believe are legitimate and mounting concerns surrounding centralized finance. Admittedly, Ye is an extreme example. He’s a multiplatinum recording artist with tens of millions of social media followers. But there’s a real fear among everyday people that they too can be fined or have their accounts frozen or canceled at any time for expressing nonconformist views.

This article was republished with permission from Frank Talk, a CEO Blog by Frank Holmes of U.S. Global Investors (GROW). Find more of Frank’s articles here – Originally published October 19, 2022

CBDCs Are Inevitable

That brings me to CBDCs. I was in Europe last week where I attended the Bitcoin Amsterdam conference, and I was honored to participate on a lively panel that was aptly titled “The Specter of CBDCs.”

As I told the audience, I believe CBDCs are inevitable, ready or not. There are too many perceived benefits. These currencies offer broad public access and instant settlements, streamline cross-border payments, preserve the dominance of a nation’s currency and reduce the operational costs of maintaining physical cash. Here in the U.S., millions upon millions of dollars’ worth of bills and coins are lost or accidentally thrown away every year. CBDCs would solve this problem. 

An estimated 90% of the world’s central banks currently have CBDC plans somewhere in the pipeline. As I write this, only two countries have officially launched their own digital currencies—the Bahamas with its Sand Dollar, and Nigeria with its eNaira—but expect many more to follow in the coming years. China, the world’s second largest economy, has been piloting its own CBDC for a couple of years now, and India, the seventh largest, released a report last week laying out the “planned features of the digital Rupee.” A pilot program of the currency is expected to begin “soon.” And speaking at an annual International Monetary Fund (IMF) meeting, Treasury Secretary Janet Yellen said that the U.S. should be “in a position where we could issue” a CBDC.

CBDCs Improve Bitcoin’s Use Case

Due to the centralized nature of CBDCs, however, there are a number of concerns that give many people pause. Unlike Bitcoin, which is decentralized and pseudonymous, CBDCs raise questions about privacy, government interference and manipulation.

In the White House’s own review of digital currencies, issued last month, policymakers write that a potential U.S. coin system should “promote compliance with” anti-money laundering (AML) and counter-terrorist financing (CFT) laws. Such a system should also “prevent the use of CBDC in ways that violate civil or human rights.” Further, it should be sustainable; that is, it should “minimize energy use, resources use, greenhouse gas emissions, other pollution and environmental impacts on local communities.”

Nothing about this sounds inherently nefarious, but then, some of us may have said the same thing about PayPal’s “misinformation” policy (whether intended or not) and JPMorgan’s decision to end its relationship with a polarizing celebrity.

I believe this only improves Bitcoin’s use case, especially if we’re headed for a digital future.

Worst 60/40 Portfolio Returns In 100 Years

With only a little over 50 trading days left in 2022, it looks more and more likely that this will be among the very worst years in history for investing. Since World War II, there have been only three instances, in 1974, 2002 and 2008, when the S&P 500 ended the year down more than 20%. If 2022 ended today, it would mark only the fourth time.  

Here’s another way to visualize it. The scatter plot below shows annual returns for the S&P 500 (horizontal axis) and U.S. bonds (vertical axis). As you can see, 2022 falls in the most undesirable quadrant along with the years 1931, 1941 and 1969. Not only have stocks been knocked down, but so have bond prices as the Fed continues to hike rates at an historically fast pace.   

What this means is that the traditional “60/40” portfolio—composed of 60% stocks and 40% bonds—now faces its worst year in 100 years, according to Bank of America.
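For readers who want to see the mechanics, the blended return of such a portfolio is simply a weighted average of the two sleeves. The short Python sketch below illustrates the arithmetic with hypothetical return figures chosen purely for illustration; they are not the actual 2022 index results Bank of America is referring to.

def blended_return(stock_return, bond_return, stock_weight=0.60, bond_weight=0.40):
    """Weighted-average return of a two-asset portfolio."""
    return stock_weight * stock_return + bond_weight * bond_return

# Hypothetical example: if stocks fell 20% and bonds fell 15% in the same year,
# a 60/40 portfolio would be down 18%.
print(f"{blended_return(-0.20, -0.15):.1%}")  # prints -18.0%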

My takeaway is that diversification matters more now than perhaps in any other time in recent memory. Real assets like gold and silver look very attractive right now. Real estate is an option. And Bitcoin continues to trade at a discount. Diversification doesn’t ensure a positive return, but it could potentially spell the difference between losing a little and losing a lot.

You can watch the panel discussion at Bitcoin Amsterdam featuring Frank Holmes by clicking here!

All opinions expressed and data provided are subject to change without notice. Some of these opinions may not be appropriate to every investor. By clicking the link(s) above, you will be directed to a third-party website(s). U.S. Global Investors does not endorse all information supplied by this/these website(s) and is not responsible for its/their content.

The S&P 500 Stock Index is a widely recognized capitalization-weighted index of 500 common stock prices in U.S. companies. Diversification does not protect an investor from market risks and does not assure a profit.

Holdings may change daily. Holdings are reported as of the most recent quarter-end. None of the securities mentioned in the article were held by any accounts managed by U.S. Global Investors as of 9/30/2022.

Regenerative Medicine Takes Aim at Liver Disease

Image Credit: Tareq Salahuddin (Flickr)

Helping the Liver Regenerate Itself Could Give Patients with End-Stage Liver Disease a Treatment Option Besides Waiting for a Transplant

The liver is known for its ability to regenerate. It can completely regrow itself even after two-thirds of its mass has been surgically removed. But damage from medications, alcohol abuse or obesity can eventually cause the liver to fail. Currently, the only effective treatment for end-stage liver disease is transplantation.

However, there is a dearth of organs available for transplantation. Patients may have to wait from 30 days to over five years to receive a liver for transplant in the U.S. Of the over 11,600 patients on the waiting list to receive a liver transplant in 2021, only a little over 9,200 received one.

But what if, instead of liver transplantation, there were a drug that could help the liver regenerate itself?

This article was republished with permission from The Conversation, a news site dedicated to sharing ideas from academic experts. It represents the research-based findings and thoughts of Satdarshan (Paul) Singh Monga, MD, FAASLD, Professor of Pathology and Medicine, University of Pittsburgh Health Sciences.

I am the founding director of the Pittsburgh Liver Research Center and run a lab studying liver regeneration and cancer. In our recently published research, my team and I found that activating a particular protein with a new medication can help accelerate regeneration and repair after severe liver injury or partial surgical removal in mice.

Key Players in Liver Regeneration

The liver performs over 500 key functions in your body, including producing proteins that carry fat through the body, converting excess glucose into glycogen for storage and breaking down toxins like ammonia, among others.

Liver cells, or hepatocytes, take on these many tasks by a divide-and-conquer strategy, also called zonation. This separates the liver into three zones with different tasks, and cells are directed to perform specialized functions by turning on specific genes active in each zone. However, exactly what controls the expression of these genes has been poorly understood.

Over the past two decades, my team and other labs have identified one group of 19 proteins called Wnts that play an important role in controlling liver function and regeneration. While researchers know that Wnt proteins help activate the repair process in damaged liver cells, exactly which of them control zonation and regeneration, and where in the liver they come from, has remained a mystery.

To identify these proteins and where they came from, my team and I used a new technology called molecular cartography to identify how strongly and where 100 liver function genes are active. We found that only two of 19 Wnt genes, Wnt2 and Wnt9b, were functionally present in the liver. We also found that Wnt2 and Wnt9b were located in the endothelial cells lining the blood vessels in zone 3 of the liver, an area that plays a role in a number of metabolic functions.

To our surprise, eliminating these two Wnt genes resulted in all liver cells expressing only genes typically limited to zone 1, significantly limiting the liver’s overall function. This finding suggests that liver cells experience an ongoing push and pull in gene activation that can modify their functions, and Wnt is the master regulator of this process.

Eliminating the two Wnt genes from endothelial cells also completely stopped liver cell division, and thus regeneration, after partial surgical removal of the liver.

Liver Regeneration After Tylenol Overdose

We then decided to test whether a new drug could help recover liver zonation and regeneration. This drug, an antibody called FL6.13, shares similar functions with Wnt proteins, including activating liver regeneration.

Over the course of two days, we gave this drug to mice that were genetically engineered to lack Wnt2 and Wnt9b in their liver endothelial cells. We found that the drug was able to nearly completely recover liver cell division and repair functions.

Lastly, we wanted to test how well this drug worked to repair the liver after Tylenol overdose. Tylenol, or acetaminophen, is an over-the-counter medication commonly used to treat fever and pain. However, an overdose of Tylenol can cause severe liver damage. Without immediate medical attention, it can lead to liver failure and death. Tylenol poisoning is one of the most common causes of severe liver injury requiring liver transplantation in the U.S. Despite this, there is currently only one medication available to treat it, and it is only able to prevent liver damage if taken shortly after overdose.

We tested our new drug on mice with liver damage from toxic doses of Tylenol. We found that one dose was able to decrease liver injury biomarkers – proteins the liver releases when injured – in the blood and reduce liver tissue death. These findings indicate that liver cell repair and tissue regeneration are occurring.

Reducing the Need for Transplantation

One way to address liver transplantation shortages is to improve treatments for liver diseases. While current medications can effectively cure hepatitis C, a viral infection that causes liver inflammation, other liver diseases haven’t seen the same progress. Because very few effective treatments are available for illnesses like nonalcoholic fatty liver disease and alcoholic liver disease, many patients worsen and end up needing a liver transplant.

My team and I believe that improving the liver’s ability to repair itself could help circumvent the need for transplantation. Further study of drugs that promote liver regeneration may help curb the burden of liver disease worldwide.

Medical Device Improves Muscle Rehab Accuracy by 15%

Image Credit: MIT CSAIL

MIT System “Sees” the Inner Structure of the Body During Physical Rehab

Rachel Gordon | MIT CSAIL

A growing number of people are living with conditions that could benefit from physical rehabilitation — but there aren’t enough physical therapists (PTs) to go around. The growing need for PTs is racing alongside population growth and aging, and higher rates of severe ailments are adding to the problem.

An upsurge in sensor-based techniques, such as on-body motion sensors, has provided some autonomy and precision for patients who could benefit from robotic systems to supplement human therapists. Still, the minimalist watches and rings currently available rely largely on motion data, and they lack the more holistic picture a physical therapist pieces together: muscle engagement and tension in addition to movement.

This muscle-motion language barrier recently prompted the creation of an unsupervised physical rehabilitation system, MuscleRehab, by researchers from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) and Massachusetts General Hospital. There are three ingredients: motion tracking that captures motion activity, an imaging technique called electrical impedance tomography (EIT) that measures what the muscles are up to, and a virtual reality (VR) headset and tracking suit that lets a patient watch themselves perform alongside a physical therapist.

Patients put on the sleek, ninja-esque, all-black tracking suit and then perform various exercises such as lunges, knee bends, deadlifts, leg raises, knee extensions, squats, fire hydrants and bridges, which engage the quadriceps, sartorius, hamstrings and abductors. The VR system captures 3D movement data.

In the virtual environment, patients are given two conditions. In both cases, their avatar performs alongside a physical therapist. In the first condition, only the motion tracking data is overlaid onto the patient’s avatar. In the second, the patient also wears the EIT sensing straps, so the avatar shows both motion and muscle-engagement data.

With these two conditions, the team compared exercise accuracy and handed the results to a professional therapist, who explained which muscle groups were supposed to be engaged during each of the exercises. When patients could see both muscle engagement and motion data during these unsupervised exercises, rather than motion alone, overall exercise accuracy improved by 15 percent.

The team then cross-compared how much time the correct muscle group was engaged during the exercises under each condition. In the condition where muscle engagement data was shown in real time, that visualization itself served as the feedback. With the engagement data monitored and recorded, the PTs reported a much better understanding of the quality of a patient’s exercise and said it helped them evaluate and adjust the current regimen based on those statistics.

“We wanted our sensing scenario to not be limited to a clinical setting, to better enable data-driven unsupervised rehabilitation for athletes in injury recovery, patients currently in physical therapy, or those with physical limiting ailments, to ultimately see if we can assist with not only recovery, but perhaps prevention,” says Junyi Zhu, MIT PhD student in electrical engineering and computer science, CSAIL affiliate, and lead author on a new paper about MuscleRehab. “By actively measuring deep muscle engagement, we can observe if the data is abnormal compared to a patient’s baseline, to provide insight into the potential muscle trajectory.”

Current sensing technologies focus mostly on tracking behaviors and heart rates, but Zhu was interested in finding a better way than electromyography (EMG) to sense the engagement (blood flow, stretching, contracting) of different layers of the muscles. EMG only captures muscle activity right beneath the skin, unless it’s done invasively.

Zhu has been digging into the realm of personal health-sensing devices for some time. He was inspired to use EIT, which measures the electrical conductivity of muscles, after a 2021 project in which he used the noninvasive imaging technique to create a toolkit for designing and fabricating health and motion sensing devices. To his knowledge, using EIT, which typically serves to monitor lung function, detect chest tumors and diagnose pulmonary embolism, to sense muscle engagement for rehabilitation hadn’t been done before.

With MuscleRehab, the EIT sensing board serves as the “brains” behind the system. It’s accompanied by two straps filled with electrodes that slip onto a user’s upper thigh to capture 3D volumetric data. The motion-capture process uses 39 markers and a number of cameras recording at very high frame rates. The EIT data highlights actively triggered muscles on the display, and a given muscle appears darker the more it is engaged.

Currently, MuscleRehab focuses on the upper thigh and the major muscle groups inside, but down the line they’d like to expand to the glutes. The team is also exploring potential avenues in using EIT in radiotherapy in collaboration with Piotr Zygmanski, medical physicist at the Brigham and Women’s Hospital and Dana-Farber Cancer Institute and Associate Professor of Radiation at Harvard Medical School.

“We are exploring utilization of electrical fields and currents for detection of radiation as well as for imaging of the dielectric properties of patient anatomy during radiotherapy treatment, or as a result of the treatment,” says Zygmanski. “Radiation induces currents inside tissues and cells and other media — for instance, detectors — in addition to making direct damage at the molecular level (DNA damage). We have found the EIT instrumentation developed by the MIT team to be particularly suitable for exploring such novel applications of EIT in radiotherapy. We are hoping that with the customization of the electronic parameters of the EIT system we can achieve these goals.”

MuscleRehab Video

“This work advances EIT, a sensing approach conventionally used in clinical settings, with an ingenious and unique combination with virtual reality,” says Yang Zhang, assistant professor in electrical and computer engineering at the UCLA Samueli School of Engineering, who was not involved in the paper. “The enabled application that facilitates rehabilitation potentially has a wide impact across society to help patients conduct physical rehabilitation safely and effectively at home. Such tools to eliminate the need for clinical resources and personnel have long been needed for the lack of workforce in healthcare.”

Reprinted with permission of MIT News (http://news.mit.edu/)

Quantum Technology’s Use in Encryption and Medical Equipment

Image Credit: Carlos Jones (U.S. Dept. of Energy)

Nobel-Winning Quantum Weirdness Undergirds an Emerging High-Tech Industry, Promising Better Ways of Encrypting Communications and Imaging Your Body

Two quantum particles, like pairs of atoms or photons, can become entangled. That means a property of one particle is linked to a property of the other, and measuring one particle instantly tells you the corresponding result for the other, regardless of how far apart they are. This correlation is a key resource in quantum information technologies.

For the most part, quantum entanglement is still a subject of physics research, but it’s also a component of commercially available technologies, and it plays a starring role in the emerging quantum information processing industry.

Pioneers

The 2022 Nobel Prize in Physics recognized the profound legacy of the experimental work on quantum entanglement by Alain Aspect of France, John F. Clauser of the U.S. and Anton Zeilinger of Austria, work that has touched me personally since the start of my graduate school career as a physicist. Anton Zeilinger was a mentor of my Ph.D. mentor, Paul Kwiat, a lineage that heavily influenced my dissertation on experimentally understanding decoherence in photonic entanglement.

Decoherence occurs when the environment interacts with a quantum object – in this case a photon – to knock it out of the quantum state of superposition. In superposition, a quantum object is isolated from the environment and exists in a strange blend of two opposite states at the same time, like a coin toss landing as both heads and tails. Superposition is necessary for two or more quantum objects to become entangled.
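As a concrete illustration, written in standard textbook notation rather than as a description of any particular experiment by the laureates, an equal superposition of a single qubit and a maximally entangled pair of qubits can be written as:

|\psi\rangle = \frac{1}{\sqrt{2}}\bigl(|0\rangle + |1\rangle\bigr), \qquad |\Phi^{+}\rangle = \frac{1}{\sqrt{2}}\bigl(|00\rangle + |11\rangle\bigr)

In the entangled state |\Phi^{+}\rangle, each qubit gives a random 0 or 1 when measured, yet the two outcomes always agree, which is the correlation described above.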

Entanglement Goes the Distance

Quantum entanglement is a critical element of quantum information processing, and photonic entanglement of the type pioneered by the Nobel laureates is crucial for transmitting quantum information. Quantum entanglement can be used to build large-scale quantum communications networks.

On a path toward long-distance quantum networks, Jian-Wei Pan, one of Zeilinger’s former students, and colleagues demonstrated entanglement distribution to two locations on Earth separated by about 747 miles (1,203 km) via satellite transmission. However, direct transmission rates of quantum information are limited due to loss, meaning too many photons get absorbed by matter in transit so not enough reach the destination.

Entanglement is critical for overcoming this roadblock, through the nascent technology of quantum repeaters. An important milestone for early quantum repeaters, called entanglement swapping, was demonstrated by Zeilinger and colleagues in 1998. Entanglement swapping links one photon from each of two entangled pairs, thereby entangling the two remaining, initially independent photons, which can be far apart from each other.
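The trick can be seen in a standard textbook identity, shown here purely for illustration rather than as a description of the 1998 experiment’s specific apparatus. Two independent entangled pairs, photons 1-2 and 3-4, can be rewritten in terms of the Bell states of photons 2-3 and 1-4:

|\Phi^{+}\rangle_{12}\otimes|\Phi^{+}\rangle_{34} = \frac{1}{2}\Bigl(|\Phi^{+}\rangle_{14}|\Phi^{+}\rangle_{23} + |\Phi^{-}\rangle_{14}|\Phi^{-}\rangle_{23} + |\Psi^{+}\rangle_{14}|\Psi^{+}\rangle_{23} + |\Psi^{-}\rangle_{14}|\Psi^{-}\rangle_{23}\Bigr)

A Bell-state measurement on photons 2 and 3 therefore leaves photons 1 and 4, which have never interacted, in the corresponding entangled state.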

Quantum Protection

Perhaps the most well known quantum communications application is Quantum Key Distribution (QKD), which allows someone to securely distribute encryption keys. If those keys are stored properly, they will be secure, even from future powerful, code-breaking quantum computers.

While the first proposal for QKD did not explicitly require entanglement, an entanglement-based version was subsequently proposed. Shortly after this proposal came the first demonstration of the technique, through the air over a short distance on a table-top. The first demonstrations of entanglement-based QKD, by research groups led by Zeilinger, Kwiat and Nicolas Gisin, were published in the same issue of Physical Review Letters in May 2000.
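To give a rough feel for how a shared key emerges from entangled-pair measurements, here is a toy, purely classical Python sketch of the sifting step used in entanglement-based schemes. It is a simplification for intuition only, with idealized correlations and no model of loss, noise or the Bell-test check for eavesdropping, and it is not the protocol from any of the demonstrations mentioned here.

import random

def sifted_key(num_pairs=20, seed=1):
    """Toy model: two parties measure halves of ideal entangled pairs in random bases."""
    random.seed(seed)
    alice, bob = [], []
    for _ in range(num_pairs):
        alice_basis = random.randint(0, 1)  # 0 = rectilinear, 1 = diagonal
        bob_basis = random.randint(0, 1)
        if alice_basis == bob_basis:
            # Same basis on a perfect entangled pair: outcomes are random but identical.
            bit = random.randint(0, 1)
            alice.append(bit)
            bob.append(bit)
        # Mismatched-basis rounds are publicly discarded ("sifting").
    return alice, bob

a, b = sifted_key()
print("Alice:", a)
print("Bob:  ", b)
print("Shared key matches:", a == b)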

These entanglement-based distributed keys can be used to dramatically improve the security of communications. A first important demonstration along these lines was from the Zeilinger group, which conducted a bank wire transfer in Vienna, Austria, in 2004. In this case, the two halves of the QKD system were located at the headquarters of a large bank and the Vienna City Hall. The optical fibers that carried the photons were installed in the Vienna sewer system and spanned nine-tenths of a mile (1.45 km).

This article was republished with permission from The Conversation, a news site dedicated to sharing ideas from academic experts. It represents the research-based findings and thoughts of Nicholas Peters, Joint Faculty, University of Tennessee.

Entanglement for Sale

Today, there are a handful of companies that have commercialized quantum key distribution technology, including my group’s collaborator Qubitekk, which focuses on an entanglement-based approach to QKD. With a more recent commercial Qubitekk system, my colleagues and I demonstrated secure smart grid communications in Chattanooga, Tennessee.

Quantum communications, computing and sensing technologies are of great interest to the military and intelligence communities. Quantum entanglement also promises to boost medical imaging through optical sensing and high-resolution radio frequency detection, which could also improve GPS positioning. There’s even a company gearing up to offer entanglement-as-a-service by providing customers with network access to entangled qubits for secure communications.

There are many other quantum applications, some already proposed and some yet to be invented, that will be enabled by future entangled quantum networks. Quantum computers will perhaps have the most direct impact on society by enabling direct simulation of problems that do not scale well on conventional digital computers. In general, quantum computers produce complex entangled states while they operate. These computers could have huge impacts on society, ranging from reducing energy consumption to developing personally tailored medicine.

Finally, entangled quantum sensor networks promise the capability to measure theorized phenomena, such as dark matter, that cannot be seen with today’s conventional technology. The strangeness of quantum mechanics, elucidated through decades of fundamental experimental and theoretical work, has given rise to a new burgeoning global quantum industry.

Will Oil Spike as EU Deadline Approaches?

Image Credit: Tim Reckmann (Flickr)

Russia’s Energy War: Putin’s Unpredictable Actions and Looming Sanctions Could Further Disrupt Oil and Gas Markets

Russia’s effort to conscript 300,000 reservists to counter Ukraine’s military advances in Kharkiv has drawn a lot of attention from military and political analysts. But there’s also a potential energy angle. Energy conflicts between Russia and Europe are escalating and likely could worsen as winter approaches.

One might assume that energy workers, who provide fuel and export revenue that Russia desperately needs, are too valuable to the war effort to be conscripted. So far, banking and information technology workers have received an official nod to stay in their jobs.

The situation for oil and gas workers is murkier, including swirling bits of Russian media disinformation about whether the sector will or won’t be targeted for mobilization. Either way, I expect Russia’s oil and gas operations to be destabilized by the next phase of the war.

The explosions in September 2022 that damaged the Nord Stream 1 and 2 gas pipelines from Russia to Europe, and that may have been sabotage, are just the latest developments in this complex and unstable arena. As an analyst of global energy policy, I expect that more energy cutoffs could be in the cards – either directly ordered by the Kremlin to escalate economic pressure on European governments or as a result of new sabotage, or even because shortages of specialized equipment and trained Russian manpower lead to accidents or stoppages.

Dwindling Natural Gas Flows

Russia has significantly reduced natural gas shipments to Europe in an effort to pressure European nations who are siding with Ukraine. In May 2022, the state-owned energy company Gazprom closed a key pipeline that runs through Belarus and Poland.

In June, the company reduced shipments to Germany via the Nord Stream 1 pipeline, which has a capacity of 170 million cubic meters per day, to only 40 million cubic meters per day. A few months later, Gazprom announced that Nord Stream 1 needed repairs and shut it down completely. Now U.S. and European leaders charge that Russia deliberately damaged the pipeline to further disrupt European energy supplies. The timing of the pipeline explosion coincided with the startup of a major new natural gas pipeline from Norway to Poland.

Russia has very limited alternative export infrastructure that can move Siberian natural gas to other customers, like China, so most of the gas it would normally be selling to Europe cannot be shifted to other markets. Natural gas wells in Siberia may need to be taken out of production, or shut in, in energy-speak, which could free up workers for conscription.

Restricting Russian Oil Profits

Russia’s call-up of reservists also includes workers from companies specifically focused on oil. This has led some seasoned analysts to question whether supply disruptions might spread to oil, either by accident or on purpose.

One potential trigger is the Dec. 5, 2022, deadline for the start of phase six of European Union energy sanctions against Russia. Confusion about the package of restrictions and how they will relate to a cap on what buyers will pay for Russian crude oil has muted market volatility so far. But when the measures go into effect, they could initiate a new spike in oil prices.

Under this sanctions package, Europe will completely stop buying seaborne Russian crude oil. This step isn’t as damaging as it sounds, since many buyers in Europe have already shifted to alternative oil sources.

Before Russia invaded Ukraine, it exported roughly 1.4 million barrels per day of crude oil to Europe by sea, divided between Black Sea and Baltic routes. In recent months, European purchases have fallen below 1 million barrels per day. But Russia has actually been able to increase total flows from Black Sea and Baltic ports by redirecting crude oil exports to China, India and Turkey.

Russia has limited access to tankers, insurance and other services associated with moving oil by ship. Until recently, it acquired such services mainly from Europe. The change means that customers like China, India and Turkey have to transfer some of their purchases of Russian oil at sea from Russian-owned or chartered ships to ships sailing under other nations’ flags, whose services might not be covered by the European bans. This process is common and not always illegal, but often is used to evade sanctions by obscuring where shipments from Russia are ending up.

To compensate for this costly process, Russia is discounting its exports by US$40 per barrel. Observers generally assume that whatever Russian crude oil European buyers relinquish this winter will gradually find alternative outlets.

Where is Russian Oil Going?

The U.S. and its European allies aim to discourage this increased outflow of Russian crude by further limiting Moscow’s access to maritime services, such as tanker chartering, insurance and pilots licensed and trained to handle oil tankers, for any crude oil exports to third parties outside of the G-7 who pay rates above the U.S.-EU price cap. In my view, it will be relatively easy to game this policy and obscure how much Russia’s customers are paying.

On Sept. 9, 2022, the U.S. Treasury Department’s Office of Foreign Assets Control issued new guidance for the Dec. 5 sanctions regime. The policy aims to limit the revenue Russia can earn from its oil while keeping it flowing. It requires that unless buyers of Russian oil can certify that oil cargoes were bought for reduced prices, they will be barred from obtaining European maritime services.

However, this new strategy seems to be failing even before it begins. Denmark is still making Danish pilots available to move tankers through its precarious straits, which are a vital conduit for shipments of Russian crude and refined products. Russia has also found oil tankers that aren’t subject to European oversight to move over a third of the volume that it needs transported, and it will likely obtain more.

Traders have been getting around these sorts of oil sanctions for decades. Tricks of the trade include blending banned oil into other kinds of oil, turning off ship transponders to avoid detection of ship-to-ship transfers, falsifying documentation and delivering oil into and then later out of major storage hubs in remote parts of the globe. This explains why markets have been sanguine about the looming European sanctions deadline.

One Fuel at a Time

But Russian President Vladimir Putin may have other ideas. Putin has already threatened a larger oil cutoff if the G-7 tries to impose its price cap, warning that Europe will be “as frozen as a wolf’s tail,” referencing a Russian fairy tale.

U.S. officials are counting on the idea that Russia won’t want to damage its oil fields by turning off the taps, which in some cases might create long-term field pressurization problems. In my view, this is poor logic for multiple reasons, including Putin’s proclivity to sacrifice Russia’s economic future for geopolitical goals.

Russia managed to easily throttle back oil production when the COVID-19 pandemic destroyed world oil demand temporarily in 2020, and cutoffs of Russian natural gas exports to Europe have already greatly compromised Gazprom’s commercial future. Such actions show that commercial considerations are not a high priority in the Kremlin’s calculus.

How much oil would come off the market if Putin escalates his energy war? It’s an open question. Global oil demand has fallen sharply in recent months amid high prices and recessionary pressures. The potential loss of 1 million barrels per day of Russian crude oil shipments to Europe is unlikely to jack the price of oil back up the way it did initially in February 2022, when demand was still robust.

Speculators are betting that Putin will want to keep oil flowing to everyone else. China’s Russian crude imports surged as high as 2 million barrels per day following the Ukraine invasion, and India and Turkey are buying significant quantities.

Refined products like diesel fuel are due for further EU sanctions in February 2023. Russia supplies close to 40% of Europe’s diesel fuel at present, so that remains a significant economic lever.

The EU appears to know it must kick dependence on Russian energy completely, but its protracted, one-product-at-a-time approach keeps Putin potentially in the driver’s seat. In the U.S., local diesel fuel prices are highly influenced by competition for seaborne cargoes from European buyers. So U.S. East Coast importers could also be in for a bumpy winter.

This article was republished with permission from The Conversation, a news site dedicated to sharing ideas from academic experts. It represents the research-based findings and thoughts of Amy Myers Jaffe, Research professor, Fletcher School of Law and Diplomacy, Tufts University.