SpaceJibe

May 1, 2016

Newly Discovered Star Has an Almost Pure Oxygen Atmosphere

Filed under: Cool, Cosmology, Weird — bferrari @ 5:25 pm
An artist’s depiction of white dwarf stars Sirius A and B.

A newly discovered star is unlike any ever found. With an outermost layer of 99.9 percent pure oxygen, its atmosphere is the most oxygen-rich in the known universe. Heck, it makes Earth’s meager 21 percent look downright suffocating.

The stellar oddity is a radically new type of white dwarf star, and was discovered by a team of Brazilian astronomers led by Kepler de Souza Oliveira at the Federal University of Rio Grande do Sul. The star is unique in the known pool of 32,000 white dwarf stars, and is the only known star of any kind with an almost pure oxygen atmosphere. The new white dwarf has a mouthful of a name—SDSSJ124043.01+671034.68—but has been nicknamed ‘Dox’ (pronounced Dee-Awks) by Kepler’s team. The discovery was reported today in a paper in the journal Science.

“This white dwarf was incredibly unexpected,” says Kepler, “And because we had no idea anything like it could even exist, that made it all the more difficult to find.”

Missing Gas

Here’s a quick refresher: White dwarfs like Dox are the antiques of the cosmos. They’re the hyper-dense husks left behind when stars burn through their hydrogen and helium fuel and sputter out. All but the largest 3 percent of stars end up as white dwarfs. Although Dox is only slightly bigger than our home planet, it’s 60 percent the mass of our sun.
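For a sense of what that mass-to-size ratio means, here is a back-of-envelope density estimate. The solar-mass and Earth-radius values are standard constants, not figures from the article, so treat this as a rough sketch:

```python
import math

M_SUN = 1.989e30   # solar mass, kg (standard value, not from the article)
R_EARTH = 6.371e6  # Earth's mean radius, m (standard value)

mass = 0.60 * M_SUN                        # "60 percent the mass of our sun"
volume = (4 / 3) * math.pi * R_EARTH ** 3  # treat Dox as roughly Earth-sized
density = mass / volume

print(f"{density:.2e} kg/m^3")  # ~1.1e9 kg/m^3
```

That is about a million times the density of water, which is exactly the kind of figure that earns white dwarfs the "hyper-dense" label.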

Boris Gänsicke, an astronomer at the University of Warwick, in the UK, who was not involved in Dox’s discovery, confirms that the “exotic white dwarf… has an almost pure oxygen atmosphere, diluted only by traces of neon, magnesium, and silicon,” he writes in an essay accompanying the Science paper. “This chemical composition is unique among known [white dwarfs] and must arise from an extremely rare process.”

So, what makes Dox’s oxygen-rich atmosphere so unexpected? Kepler explains that Dox presents more than a couple of mysteries. For one, almost all other white dwarfs in the sky have an atmosphere thick with light elements like hydrogen and helium. These light elements are the final dregs of the star’s fusion fuel that survived the star’s earlier life-cycle. Simply because of their weight, these light elements naturally float to the top of white dwarfs.

“What happened to all these light elements?” asks Kepler. “How did they all get stripped away?”

Kepler also explains that although traces of heavier elements like carbon and oxygen can be detected in about one out of every five white dwarfs, it’s never quite like this. A white dwarf’s atmosphere is never purely one element; the heavy elements are always diluted in a pool of lighter ones. Perhaps most perplexing, when oxygen atoms are found, they’re spied in far heavier white dwarfs. Smaller white dwarfs evolve from smaller stars, which don’t fuse atoms into oxygen as they collapse. By all calculations, Dox would have had to be roughly double its mass to have forged oxygen atoms in its earlier life. “You have to wonder where this oxygen even came from,” says Kepler.

In short, by simply being so weird, Dox completely defies our general, scientific understanding of how stars evolve and eventually form into white dwarfs. But Kepler suggests that maybe this shouldn’t be all that surprising. That’s because, he argues, scientists have often ignored the wacky results that can come about when stars grow and evolve while locked in a binary dance with other stars—rather than alone.

“I think the main problem is that we [astronomers] have dedicated the last 50 years to calculate the evolution of stars that are not interacting with each other, when at least 30 percent of stars interact with a binary companion,” he says.

Kepler believes Dox looks so strange because of an unlikely binary origin-story. His rough theory goes like this:

At some point Dox may have been a larger white dwarf, locked in a twirling ballet with another star much like our own Sun. These two stars were about the same distance apart as the Sun and Venus are. As Dox’s dance partner started to sputter out of hydrogen fuel, it swelled into what’s called a red giant. It expanded rapidly—becoming so big that it actually engulfed the white dwarf in its outermost layer of gas. Kepler believes Dox would have started siphoning off the red giant’s gas onto itself. At some point during that siphoning process, “when it reached a few million degrees, it exploded. That explosion threw all types of matter out. That’s when [Dox] might have lost all its hydrogen and helium. This type of situation is known to have happened with other stars, although it’s never been seen to leave just oxygen,” he says.

The World’s Most Boring Job

Dox was discovered in a mountain of data: 4.5 million individual star observations, collected over the last 15 years by a New Mexico observatory in a project called the Sloan Digital Sky Survey. It was found by way of a process so grueling that its initial discoverer—one of Kepler’s undergraduate students, Gustavo Ourique—deserves a mention.

Ourique was looking for strange, new types of white dwarfs in a data pile of 300,000 possible observations. These observations are simple graphs of which colors of light came from each pinpoint source (called spectral graphs). Because a computer isn’t easily programmed with a task as vague as “find something weird and cool,” Ourique was left with the grunt-work task of physically looking at printed-out pages of all 300,000 graphs.

“After a few months he could filter through one or two thousand each day, like reading a book,” says Kepler. Yeah, but what a heartbreakingly boring book. That is, at least until it gets thrilling, because after half a year of scanning, toward the end of the 300,000 graphs, Ourique came across Dox. Because of its oxygen atmosphere, Dox’s spectral graph looked truly unique, and he brought it to Kepler.

Ourique, man, you are a hero.

Source

April 25, 2016

Air Force maglev sled breaks record at 633 mph

Filed under: Cool, Government Policies, Military — bferrari @ 8:49 am

In the New Mexico desert last month, a rocket-powered magnetically-levitated sled broke a world record after it blasted down a track at 633 miles per hour, faster than the cruising speed of a 747.

The test occurred at Holloman Air Force Base on a special 2100-foot track on March 4. Air Force video shows the one-ton vehicle rocketing down the track, a fiery, dusty plume behind it.

“We have a magnetically-levitated sled, where we use a very cold liquid helium to essentially levitate the sled via superconducting magnetics,” Lt. Col. Shawn Morgenstern, the commander of the 846th Test Squadron, said in the video.

“The test today was significantly faster than any test that we’ve previously done,” Morgenstern added.

The Air Force said that the sled reached 928 feet per second, which is the same 633 mph expressed in different units. Before this test, the sled had reached 513 mph.
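As a quick unit check, 633 mph converts to almost exactly 928 feet per second, using only the figures quoted above:

```python
FEET_PER_MILE = 5280
SECONDS_PER_HOUR = 3600

speed_mph = 633
speed_fps = speed_mph * FEET_PER_MILE / SECONDS_PER_HOUR  # mph -> ft/s

print(round(speed_fps))  # 928
```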

Magnetic levitation systems allow vehicles to travel in a very low-friction environment, permitting incredibly fast speeds — last year, a Japanese maglev train traveled at 374 mph. And Elon Musk, the CEO of Tesla Motors and SpaceX, has proposed a system called the Hyperloop that would use a related technology to move people or cargo at breathtaking speeds.

Source

April 21, 2016

Why a Chip That’s Bad at Math Can Help Computers Tackle Harder Problems

Filed under: Cool, Gadgets — bferrari @ 1:54 pm

DARPA funded the development of a new computer chip that’s hardwired to make simple mistakes but can help computers understand the world.

Your math teacher lied to you. Sometimes getting your sums wrong is a good thing.

This chip can’t get its arithmetic right, but could make computers more efficient at tricky problems like analyzing images.

by Tom Simonite, April 14, 2016
So says Joseph Bates, cofounder and CEO of Singular Computing, a company whose computer chips are hardwired to be incapable of performing mathematical calculations correctly. Ask it to add 1 and 1 and you will get answers like 2.01 or 1.98.

The Pentagon research agency DARPA funded the creation of Singular’s chip because that fuzziness can be an asset when it comes to some of the hardest problems for computers, such as making sense of video or other messy real-world data. “Just because the hardware is sucky doesn’t mean the software’s result has to be,” says Bates.

A chip that can’t guarantee that every calculation is perfect can still get good results on many problems but needs fewer circuits and burns less energy, he says.
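The trade-off Bates describes can be sketched in a few lines: make every addition slightly wrong, then see how little an aggregate over noisy data cares. This toy model is purely illustrative and has nothing to do with Singular's actual S1 circuit design:

```python
import random

random.seed(42)

def sloppy_add(a, b, rel_err=0.01):
    """Add two numbers, with up to 1% random error on the incoming operand."""
    return a + b * (1 + random.uniform(-rel_err, rel_err))

print(sloppy_add(1, 1))  # close to, but not exactly, 2

# Sum 10,000 noisy-world samples with the sloppy adder, compare to exact sum.
data = [random.random() for _ in range(10_000)]
total = 0.0
for x in data:
    total = sloppy_add(total, x)

exact = sum(data)
rel_gap = abs(total - exact) / exact
print(f"relative gap vs. exact sum: {rel_gap:.4%}")
```

Because the per-operation errors are independent and roughly symmetric, they largely cancel in the aggregate, which is the intuition behind applying approximate hardware to noisy data.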

Bates has worked with Sandia National Lab, Carnegie Mellon University, the Office of Naval Research, and MIT on tests that used simulations to show how the S1 chip’s inexact operations might make certain tricky computing tasks more efficient. Problems with data that comes with built-in noise from the real world, or where some approximation is needed, are the best fits. Bates reports promising results for applications such as high-resolution radar imaging, extracting 3-D information from stereo photos, and deep learning, a technique that has delivered a recent burst of progress in artificial intelligence.

In a simulated test using software that tracks objects such as cars in video, Singular’s approach was capable of processing frames almost 100 times faster than a conventional processor restricted to doing correct math—while using less than 2 percent as much power.

Bates is not the first to pursue the idea of using hand-wavy hardware to crunch data more efficiently, a notion known as approximate computing (see “10 Breakthrough Technologies 2008: Probabilistic Chips”). But DARPA’s investment in his chip could give the fuzzy math dream its biggest tryout yet.

Bates is building a batch of error-prone computers that each combine 16 of his chips with a single conventional processor. DARPA will get five such machines sometime this summer and plans to put them online for government and academic researchers to play with. The hope is that they can prove the technology’s potential and lure interest from the chip industry.

DARPA funded Singular’s chip as part of a program called Upside, which is aimed at inventing new, more efficient ways to process video footage. Military drones can collect vast quantities of video, but the footage can’t always be downloaded during flight, and the computer power needed to process it in the air would be too bulky.

It will take notable feats of software and even cultural engineering for imprecise hardware to take off. It’s not easy for programmers used to the idea that chips are always super-precise to adapt to ones that aren’t, says Christian Enz, a professor at the Swiss Federal Institute of Technology in Lausanne who has built his own approximate computing chips. New tools will be needed to help them do that, he says.

But Deb Roy, a professor at the MIT Media Lab and Twitter’s chief media scientist, says that recent trends in computing suggest approximate computing may find a readier audience than ever. “There’s a natural resonance if you are processing any kind of data that is noisy by nature,” he says. That’s become more and more common as programmers look to extract information from photos and video or have machines make sense of the world and human behavior, he adds.


The Curious Link Between the Fly-By Anomaly and the “Impossible” EmDrive Thruster

The same theory that explains the puzzling fly-by anomalies could also explain how the controversial EmDrive produces thrust.

About 10 years ago, a little-known aerospace engineer called Roger Shawyer made an extraordinary claim. Take a truncated cone, he said, bounce microwaves back and forth inside it and the result will be a thrust toward the narrow end of the cone. Voila … a revolutionary thruster capable of sending spacecraft to the planets and beyond. Shawyer called it the EmDrive.


Shawyer’s announcement was hugely controversial. In one respect the system is unremarkable: it converts one type of energy (microwave radiation) into kinetic energy, and there are plenty of other systems that do something similar.

The conceptual problems arise with momentum. The system’s total momentum increases as it begins to move. But where does this momentum come from? Shawyer had no convincing explanation, and critics said this was an obvious violation of the law of conservation of momentum.

Shawyer countered with experimental results showing the device worked as he claimed. But his critics were unimpressed. The EmDrive, they said, was equivalent to generating a thrust by standing inside a box and pushing on the sides. In other words, it was snake oil.

Since then, something interesting has happened. Various teams around the world have begun to build their own versions of the EmDrive and put them through their paces. And to everyone’s surprise, they’ve begun to reproduce Shawyer’s results. The EmDrive, it seems, really does produce thrust.

In 2012, a Chinese team said it had measured a thrust produced by its own version of the EmDrive. In 2014, an American scientist built an EmDrive and persuaded NASA to test it with positive results.

And last year, NASA conducted its own tests in a vacuum to rule out movement of air as the origin of the force. NASA, too, confirmed that the EmDrive produces a thrust. In total, six independent experiments have backed Shawyer’s original claims.

That leaves an important puzzle—how to explain the seeming violation of conservation of momentum.

Today we get an answer of sorts thanks to the work of Mike McCulloch at Plymouth University in the U.K. McCulloch’s explanation is based on a new theory of inertia that makes startling predictions about the way objects move under very small accelerations.

First some background. Inertia is the resistance of all massive objects to changes in motion or accelerations. In modern physics, inertia is treated as a fundamental property of massive objects subjected to an acceleration. Indeed, mass can be thought of as a measure of inertia. But why inertia exists at all has puzzled scientists for centuries.

McCulloch’s idea is that inertia arises from an effect predicted by quantum field theory called Unruh radiation. This is the notion that an accelerating object experiences black body radiation. In other words, the universe warms up when you accelerate.

According to McCulloch, inertia is simply the pressure the Unruh radiation exerts on an accelerating body.

That’s hard to test at the accelerations we normally observe on Earth. But things get interesting when the accelerations involved are smaller and the wavelength of Unruh radiation gets larger.

At very small accelerations, the wavelengths become so large they can no longer fit in the observable universe. When this happens, inertia can take only certain whole-wavelength values and so jumps from one value to the next. In other words, inertia must be quantized at small accelerations.
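The scale of "very small" here can be estimated. Taking the peak Unruh wavelength as roughly 8c^2/a (a standard back-of-envelope relation; the size assumed for the observable universe below is likewise a round illustrative number), the wavelength outgrows the cosmos once accelerations drop to around 1e-9 m/s^2:

```python
C = 2.998e8      # speed of light, m/s
COSMOS = 8.8e26  # rough diameter of the observable universe, m (assumed)

def unruh_wavelength(a):
    """Approximate peak Unruh wavelength (m) for acceleration a (m/s^2)."""
    return 8 * C ** 2 / a

# Even at Earth gravity the wavelength is enormous (several light-years):
print(f"{unruh_wavelength(9.8):.1e} m")

# Acceleration below which the wavelength no longer fits the cosmos:
a_min = 8 * C ** 2 / COSMOS
print(f"{a_min:.1e} m/s^2")
```

That threshold is comparable to the tiny anomalous accelerations seen in deep-space and galactic-rotation puzzles, which is why McCulloch looks for the effect there rather than in the lab.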

McCulloch says there is observational evidence for this in the form of the famous fly-by anomalies. These are the strange jumps in momentum observed in some spacecraft as they fly past Earth toward other planets. That’s exactly what his theory predicts.

Testing this effect more carefully on Earth is hard because the accelerations involved are so small. But one way to make it easier would be to reduce the size of allowed wavelengths of Unruh radiation. “This is what the EmDrive may be doing,” says McCulloch.

The idea is that if photons have an inertial mass, they must experience inertia when they reflect. But the Unruh radiation in this case has a tiny wavelength, so small, in fact, that it can interact with its immediate environment. In the case of the EmDrive, this is the truncated cone.

The cone allows Unruh radiation of a certain size at the large end but only a smaller wavelength at the other end. So the inertia of photons inside the cavity must change as they bounce back and forth. And to conserve momentum, this must generate a thrust.

McCulloch puts this theory to the test by using it to predict the forces it must generate. The precise calculations are complex because of the three-dimensional nature of the problem, but his approximate results match the order of magnitude of thrust in all the experiments done so far.

Crucially, McCulloch’s theory makes two testable predictions. The first is that placing a dielectric inside the cavity should enhance the effectiveness of the thruster.

The second is that changing the dimensions of the cavity can reverse the direction of the thrust. That would happen when the Unruh radiation better matches the size of the narrow end than the large end. Changing the frequency of the photons inside the cavity could achieve a similar effect.

McCulloch says there is some evidence that exactly this happens. “This thrust reversal may have been seen in recent NASA experiments,” he says.

That’s an interesting idea. Shawyer’s EmDrive has the potential to revolutionize spaceflight because it requires no propellant, the biggest limiting factor in today’s propulsion systems. But in the absence of any convincing explanation for how it works, scientists and engineers are understandably wary.

McCulloch’s theory could help to change that, although it is hardly a mainstream idea. It makes two challenging assumptions. The first is that photons have inertial mass. The second is that the speed of light must change within the cavity. That won’t be easy for many theorists to stomach.

But as more experimental confirmations of Shawyer’s EmDrive emerge, theorists are being forced into a difficult position. If not McCulloch’s explanation, then what?

Ref: arxiv.org/abs/1604.03449 : Testing Quantized Inertia on the EmDrive

April 19, 2016

Large Hadron Collider results may hint at a new era of physics

Filed under: Big Bang, Black Holes, Cool, Weird — bferrari @ 2:00 pm
The LHC (Large Hadron Collider) tunnel. (REUTERS/Denis Balibouse)

The LHC (Large Hadron Collider) tunnel. (REUTERS/Denis Balibouse)

Are we about to enter a new era of physics? Data collected by the Large Hadron Collider in Switzerland may have identified particle activity that doesn’t fit the standard laws of physics.

The analysis by scientists including physicists at the Institute of Nuclear Physics at the Polish Academy of Sciences (IFJ PAN) could have huge scientific implications.

“There are some indications that physicists working at the LHC accelerator at the European Organization for Nuclear Research (CERN) near Geneva may see the first traces of physics beyond the current theory which describes the structure of matter,” said the IFJ PAN, in a recent press release.

The structure of matter is described by a theoretical framework called the Standard Model, which identifies the roles played by different particles. Boson particles are the carriers of forces; the photon, for example, carries the electromagnetic interaction. Matter itself is formed by particles called fermions.

However, scientists, analyzing data collected by the LHCb experiment in 2011 and 2012, noticed an anomaly in the decay of a particle called a B Meson. According to the research, the traditional method for determining the particle’s decay may lead to false results.

Related: Science breakthrough? Physicists may have discovered Higgs boson relative

Could the anomaly hint at a new understanding of the Universe? Scientists are certainly intrigued. “To put it in terms of the cinema, where we once only had a few leaked scenes from a much-anticipated blockbuster, the LHC has finally treated fans to the first real trailer,” said Professor Mariusz Witek of IFJ PAN, in the release.

Witek notes that the framework used to describe the structure of matter poses plenty of questions for scientists. “The Standard Model cannot explain all the features of the Universe,” he said. “It doesn’t predict the masses of particles or tell us why fermions are organized in three families. How did the dominance of matter over antimatter in the universe come about? What is dark matter? Those questions remain unanswered.”

To further illustrate his point, the professor notes that gravity isn’t even included in the Standard Model.

However, scientists caution that more research is needed on the B Meson anomaly. “We can’t call it a discovery. Not yet,” said the IFJ PAN.

CERN spokesman Arnaud Marsollier told FoxNews.com that the B Meson data, which first emerged last year, are not conclusive. “More data are needed before we can tell anything significant on this, so we will have to wait for the LHC to restart (soon),” he explained via email, noting the importance of patience when recording and analyzing data. “Science needs time!” he added.

Related: Revamped Large Hadron Collider set to restart

CERN is currently starting powering tests on the huge particle accelerator. “Beams should be back by the end of the month or early April, and collisions sometime next month if everything goes as planned,” said Marsollier.

Oxford University Physics Professor Guy Wilkinson, who serves as the spokesman for the LHCb experiment, told FoxNews.com that CERN’s B Meson data is “extremely interesting,” but noted that it could be a couple of years before scientists perform a full analysis. “When we analyse this new sample in a year or two we will be able to make a fresh and, I hope, more categorical statement on this topic,” he explained, via email.

The 17-mile LHC was built between 1998 and 2008 to help scientists test some theories of particle and high-energy physics and advance understanding of physical laws.

In 2012 the Collider won global acclaim with the discovery of the long-sought Higgs boson particle, which explains why other particles have mass. Physicists Peter Higgs and Francois Englert were subsequently awarded the 2013 Nobel Prize in Physics.

Source

April 12, 2016

A Visionary Project Aims for Alpha Centauri, a Star 4.37 Light-Years Away

Filed under: Uncategorized — bferrari @ 1:49 pm

Can you fly an iPhone to the stars?

In an attempt to leapfrog the planets and vault into the interstellar age, a bevy of scientists and other luminaries from Silicon Valley and beyond, led by Yuri Milner, the Russian philanthropist and Internet entrepreneur, announced a plan on Tuesday to send a fleet of robots no bigger than iPhones to Alpha Centauri, the nearest star system, 4.37 light-years away.

If it all worked out — a cosmically big “if” that would occur decades and perhaps $10 billion from now — a rocket would deliver a “mother ship” carrying a thousand or so small probes to space. Once in orbit, the probes would unfold thin sails and then, propelled by powerful laser beams from Earth, set off one by one like a flock of migrating butterflies across the universe.

Within two minutes, the probes would be more than 600,000 miles from home — as far as the lasers can maintain a tight beam — and moving at a fifth of the speed of light. But it would still take 20 years for them to get to Alpha Centauri. Those that survived would zip past the stars, making measurements and beaming pictures back to Earth.
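The 20-year figure is straightforward arithmetic on the numbers quoted above: 4.37 light-years at a fifth of light speed works out to just under 22 years.

```python
distance_ly = 4.37          # distance to Alpha Centauri, light-years
speed_fraction_of_c = 0.20  # "a fifth of the speed of light"

# At a fraction f of light speed, covering d light-years takes d / f years.
travel_years = distance_ly / speed_fraction_of_c
print(f"{travel_years:.2f} years")  # a shade under 22
```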

Much of this plan is probably half a lifetime away. Mr. Milner and his colleagues estimate that it could take 20 years to get the mission off the ground and into the heavens, 20 years to get to Alpha Centauri and another four years for the word from outer space to come home. And there is still the matter of attracting billions of dollars to pay for it.

Alpha Centauri, the closest star system to Earth’s solar system. An effort led by the billionaire Yuri Milner aims to send a fleet of small probes there. (European Southern Observatory)

“I think you and I will be happy to see the launch,” Mr. Milner, 54, said in an interview, adding that progress in medicine and longevity would determine whether he would live to see the results.

“We came to the conclusion it can be done: interstellar travel,” Mr. Milner said. He announced the project, called Breakthrough Starshot, in a news conference in New York on Tuesday, 55 years after Yuri Gagarin — for whom Mr. Milner is named — became the first human in space.

In a statement released by Breakthrough Starshot, the English cosmologist and author Stephen Hawking said: “Earth is a beautiful place, but it might not last forever. Sooner or later we must look to the stars.”

Dr. Hawking is one of three members of the board of directors for the mission, along with Mr. Milner and Mark Zuckerberg, the Facebook founder.

The project will be directed by Pete Worden, a former director of NASA’s Ames Research Center and the chairman of the Breakthrough Prize Foundation, which Mr. Milner founded and of which the new venture is an offshoot. He has a prominent cast of advisers, including the Harvard astronomer Avi Loeb as chairman; the British astronomer royal Martin Rees; the Nobel Prize-winning astronomer Saul Perlmutter, of the University of California, Berkeley; Ann Druyan, producer of the TV show “Cosmos” and widow of Carl Sagan; and the mathematician and author Freeman Dyson, of the Institute for Advanced Study in Princeton, N.J.

“There are about 20 key challenges we are asking the world’s scientific experts to help us with — and we are willing to financially support their work,” Dr. Worden said in an email.

A detailed technical description of the project will appear on the project’s website.

Estimating that the project could cost $5 billion to $10 billion, Mr. Milner is initially investing $100 million for research and development. He said he was hoping to lure other investors, especially from international sources.

Most of that money would go toward a giant laser array, which could be used to repeatedly send probes toward any star (as long as the senders were not looking for return mail anytime soon) or around the solar system, perhaps to fly through the ice plumes of Saturn’s moon Enceladus, which might contain microbes — tiny forms of life.

In a sense, the start of this space project reflects the make-it-break-it mode of Silicon Valley. Rather than send one big, expensive spacecraft on a journey of years, send thousands of cheap ones. If some break or collide with space junk, others can take their place.

Interstellar travel is a daunting and humbling notion. Alpha Centauri is an alluring target for such a trip: It is the closest star system to our own, and there might be planets in the system. The system consists of three stars: Alpha Centauri A and Alpha Centauri B, sunlike stars that circle each other, and Proxima Centauri, which may be circling the other two. In recent years, astronomers have amassed data suggesting the possibility of an Earth-size planet orbiting Alpha Centauri B.

It would take Voyager 1, humanity’s most distant space probe, more than 70,000 years to reach Alpha Centauri if it were headed in that direction, which it is not.
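That figure checks out against Voyager 1's cruising speed of roughly 17 kilometers per second (an assumed round value, not given in the article):

```python
LIGHT_YEAR_KM = 9.461e12  # kilometers per light-year
SECONDS_PER_YEAR = 3.156e7
VOYAGER_KM_S = 17.0       # rough cruising speed, assumed round value

distance_km = 4.37 * LIGHT_YEAR_KM
years = distance_km / VOYAGER_KM_S / SECONDS_PER_YEAR

print(f"{years:,.0f} years")  # on the order of 75,000
```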

Over the years, a variety of propulsion schemes have been hatched to cross the void more quickly. In 1962, shortly after lasers had been invented, Robert Forward, a physicist and science fiction author, suggested that they could be used to push sails in space.

In 2011, Darpa, the Defense Advanced Research Projects Agency, got into the act with 100 Year Starship, a contest to develop a business plan for interstellar travel.

By all accounts, Mr. Milner was initially skeptical of an interstellar probe.

But three trends seemingly unrelated to space travel — advances in nanotechnology and lasers and the relentless march of Moore’s Law, making circuits ever smaller and more powerful — have converged in a surprising way.

It is now possible to pack a probe’s computers, cameras and electrical power into a package with a mass of only one gram, a thirtieth of an ounce.

That, Dr. Loeb said, is about what the guts of an iPhone, stripped of its packaging and displays, amount to.

April 11, 2016

NASA races to save planet-hunting Kepler spacecraft

Filed under: Uncategorized — bferrari @ 10:29 am

NASA is trying to resuscitate its planet-hunting Kepler spacecraft, in a state of emergency nearly 75 million miles away.

The treasured spacecraft — responsible for detecting nearly 5,000 planets outside our solar system — slipped into emergency mode sometime last week. The last regular contact was April 4; everything seemed normal then.

Ground controllers discovered the problem Thursday, right before they were going to point Kepler toward the center of the Milky Way as part of a new kind of planetary survey. Kepler was going to join ground observatories in surveying millions of stars in the heart of our galaxy, in hopes of finding planets far from their suns, like our own outer planets, as well as stray planets that might be wandering between stars.

This is the latest crisis in the life of Kepler.

Launched in 2009, the spacecraft completed its primary mission in 2012. Despite repeated breakdowns, Kepler kept going on an extended mission dubbed K2 — until now. The vast 75 million-mile distance between Kepler and Earth makes it all the harder to fix.

“Even at the speed of light, it takes 13 minutes for a signal to travel to the spacecraft and back,” mission manager Charlie Sobeck said in a weekend web update from NASA’s Ames Research Center in Mountain View, Calif.
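Sobeck's 13-minute figure follows directly from the distance quoted and the speed of light:

```python
LIGHT_SPEED_MILES_S = 186_282  # speed of light, miles per second

distance_miles = 75e6          # Kepler's distance from Earth
one_way_s = distance_miles / LIGHT_SPEED_MILES_S
round_trip_min = 2 * one_way_s / 60

print(f"{round_trip_min:.1f} minutes")  # ~13.4
```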

Recovering from this emergency condition “is the team’s priority at this time,” Sobeck said.

More than 1,000 of Kepler’s detected 5,000 exoplanets have been confirmed to date, according to NASA.

Kepler is named after the 17th century German astronomer and mathematician Johannes Kepler.

Source

February 26, 2016

Explaining EmDrive, the ‘physics-defying’ thruster even NASA is puzzled over

Even if you don’t keep up with developments in space propulsion technology, you’ve still probably heard about the EmDrive. You’ve probably seen headlines declaring it the key to interstellar travel, and claims that it will drastically reduce travel time across our solar system, making our dreams of people walking on other planets even more of a reality. There have even been claims that this highly controversial technology is the key to creating warp drives.

These are bold claims, and as the great cosmologist and astrophysicist Carl Sagan once said, “extraordinary claims require extraordinary evidence.” With that in mind, we thought it’d be helpful to break down what we know about the enigmatic EmDrive, and whether it is, in fact, the key to mankind exploring the stars.

So without further ado, here’s absolutely everything you need to know about the world’s most puzzling propulsion device.

 

What is the EmDrive?

See, the EmDrive is a conundrum. First designed in 2001 by aerospace engineer Roger Shawyer, the technology can be summed up as a propellantless propulsion system, meaning the engine doesn’t use fuel to cause a reaction. Removing the need for fuel makes a craft substantially lighter, and therefore easier to move (and cheaper to make, theoretically). In addition, the hypothetical drive is able to reach extremely high speeds — we’re talking potentially getting humans to the outer reaches of the solar system in a matter of months.

The issue is, the entire concept of a reactionless drive is inconsistent with Newton’s conservation of momentum, which states that within a closed system, linear and angular momentum remain constant regardless of any changes that take place within said system. More plainly: Unless an outside force is applied, an object will not move.

 

Reactionless drives are named as such because they lack the “reaction” defined in Newton’s third law: “For every action there is an equal and opposite reaction.” But this goes against our current fundamental understanding of physics: An action (propulsion of a craft) taking place without a reaction (ignition of fuel and expulsion of mass) should be impossible. For such a thing to occur, it would mean an as-yet-undefined phenomenon is taking place — or our understanding of physics is completely wrong.
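To make the momentum argument concrete, here’s a minimal sketch (using made-up illustrative numbers, not figures from any real drive test) of why expelling mass is what moves a rocket, and why a truly closed box cannot move itself:

```python
# Sketch: why a closed system cannot self-accelerate.
# A rocket gains forward momentum only by expelling mass backward;
# the total momentum of (rocket + exhaust) stays constant.
# All values below are illustrative assumptions.

m_craft = 1000.0      # kg, craft after the burn
m_exhaust = 10.0      # kg, propellant expelled
v_exhaust = -3000.0   # m/s, exhaust velocity (negative = backward)

# Conservation of momentum: 0 = m_craft*v_craft + m_exhaust*v_exhaust
v_craft = -(m_exhaust * v_exhaust) / m_craft
print(f"craft velocity: {v_craft:.1f} m/s")   # 30.0 m/s forward

# With no mass expelled (a "reactionless" drive), the same balance
# reads 0 = (m_craft + m_exhaust) * v_craft, forcing v_craft = 0.
```

This is the whole objection in two lines of algebra: remove the exhaust term and the craft’s velocity is pinned at zero.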

How does the EmDrive “work?”

Setting aside the potentially physics-breaking improbabilities of the technology, let’s break down in simple terms how the proposed drive operates. The EmDrive is what is called an RF resonant cavity thruster, one of several hypothetical machines that use this model. These designs work by having a magnetron pump microwaves into a closed truncated cone; the microwaves supposedly push against the short end of the cone and propel the craft forward.

This is in contrast to the form of propulsion current spacecraft use, which burn large quantities of fuel to expel a massive amount of energy and mass to rocket the craft into the air. An often-used metaphor for why the EmDrive shouldn’t work compares the particles pushing against their own enclosure to the act of sitting in a car and pushing on the steering wheel to move the car forward.

While tests have been done on experimental versions of the drive — with low energy inputs resulting in a few micronewtons of thrust (a minuscule force: a penny’s weight, for comparison, is about 25,000 micronewtons) — none of the findings have ever been published in a peer-reviewed journal. That means that any and all purportedly positive test results, and the claims of those who have a vested interest in the technology, should be taken with a very big grain of skepticism-flavored salt. It’s likely that the thrust recorded was due to interference or an unaccounted-for error in the equipment.

Until the tests have been verified through the proper scientific and peer-reviewed processes, one can assume the drive does not yet work. Still, it’s interesting to note the number of people who have tested the drive and reported achieving thrust:

  • In 2001, Shawyer was given a £45,000 grant from the British government to test the EmDrive. His test reportedly achieved 0.016 newtons of force and required 850 watts of power, but the tests were never peer reviewed. It’s worth noting, however, that the measured force was small enough to be potential experimental error.
  • In 2008, Yang Juan and a team of Chinese researchers at the Northwestern Polytechnical University allegedly verified the theory behind RF resonant cavity thrusters, and subsequently built their own version in 2010, testing the drive multiple times from 2012 to 2014. Test results were purportedly positive, achieving up to 750 mN (millinewtons) of thrust while requiring 2,500 watts of power.
  • In 2014, NASA researchers tested their own version of an EmDrive, including in a hard vacuum. Once again, the group reported thrust (about 1/1,000 of Shawyer’s claims), and once again, the data was never published through peer-reviewed sources. Other NASA groups are skeptical of the researchers’ claims, but their paper clearly states that the findings neither confirm nor refute the drive, instead calling for further tests.
  • In 2015, that same NASA group tested a version of chemical engineer Guido Fetta’s Cannae Drive (née Q Drive), and reported positive net thrust. Similarly, a research group at Dresden University of Technology also tested the drive, again reporting thrust, both predicted and unexpected.
  • Yet another test by a NASA research group, Eagleworks, in late 2015 seemingly confirmed the validity of the EmDrive. The test corrected errors that had occurred in the previous tests, and surprisingly, the drive achieved thrust. However, the group has not yet submitted its findings for peer review. It’s possible that other unforeseen errors in the experiment may have caused thrust (the most likely being that the vacuum was compromised, causing heat to expand air within the testing environment and move the drive). Whether the findings are ultimately published or not, more tests need to be done. That’s exactly what Glenn Research Center in Cleveland, Ohio, NASA’s Jet Propulsion Laboratory, and Johns Hopkins University Applied Physics Laboratory intend to do. For EmDrive believers, there seems to be some hope.
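For a sense of scale, the claimed figures above can be turned into thrust-to-power ratios and compared against a photon rocket — the thrust you’d get by simply radiating the input power out the back. (These are the unverified claims as reported, not established results.)

```python
# Thrust-to-power ratios implied by the reported (never peer-reviewed)
# test figures listed above: (thrust in newtons, input power in watts).
tests = {
    "Shawyer 2001": (0.016, 850),
    "Yang 2010-14": (0.750, 2500),
}
for name, (thrust_n, power_w) in tests.items():
    ratio = thrust_n / power_w * 1e6  # micronewtons per watt
    print(f"{name}: {ratio:.1f} uN/W")
# Shawyer 2001: 18.8 uN/W
# Yang 2010-14: 300.0 uN/W

# For comparison, a pure photon rocket gives thrust/power = 1/c,
# about 0.0033 uN/W, so the claims are thousands of times beyond
# what radiation pressure alone could deliver.
```

That gap — three to five orders of magnitude above the photon-rocket limit — is part of why physicists treat equipment error as the default explanation.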

Implications of a working EmDrive

It’s easy to see why many in the scientific community are wary of the EmDrive and RF resonant cavity thrusters altogether. But on the other hand, the wealth of studies raises a few questions: Why is there such an interest in the technology, and why do so many people wish to test it? What exactly are the claims being made about the drive that make it such an attractive idea? While everything from atmospheric temperature-controlling satellites to safer and more efficient automobiles has been drummed up as a potential application for the drive, the real draw of the technology — and the impetus for its creation in the first place — is the implications for space travel.


Spacecraft equipped with a reactionless drive could potentially make it to the moon in just a few hours, Mars in two to three months, and Pluto within two years. These are extremely bold claims, but if the EmDrive does turn out to be a legitimate technology, they may not be all that outlandish. And with no need to pack several tons’ worth of fuel, spacecraft become lighter, cheaper, and easier to produce.
For NASA and other such organizations, including the numerous private space corporations like SpaceX, lightweight, affordable spacecraft that can travel to remote parts of space fast are something of a unicorn. Still, for that to become a reality, the science has to add up.

Shawyer is adamant that there is no need for pseudoscience or quantum theories to explain how EmDrive works. Instead, he believes that current models of Newtonian physics offer an explanation, and has written papers on the subject, one of which is currently being peer reviewed. He expects the paper to be published sometime this year. While in the past Shawyer has been criticized by other scientists for incorrect and inconsistent science, if the paper does indeed get published, it may begin to legitimize the EmDrive and spur more testing and research.


His insistence that the drive behaves within the laws of physics hasn’t prevented Shawyer from making bold assertions regarding the EmDrive. He has gone on record saying that the drive produces warp bubbles which allow it to move, claiming that this is how NASA’s test results were likely achieved. Assertions such as these have garnered much interest online, but have no clear supporting data and will (at the very least) require extensive testing and debate in order to be taken seriously by the scientific community — the majority of which remains skeptical of Shawyer’s claims.

Colin Johnston of the Armagh Planetarium wrote an extensive critique of the EmDrive and the inconclusive findings of numerous tests. Similarly, Corey S. Powell of Discover wrote his own indictment of both Shawyer’s EmDrive and Fetta’s Cannae Drive, as well as the recent fervor over NASA’s findings. Both point out the need for greater discretion when reporting on such instances. Mathematical physicist John C. Baez expressed his exhaustion at the conceptual technology’s persistence in debates and discussions, calling the entire notion of a reactionless drive “baloney.” His impassioned dismissal echoes the sentiments of many others.

Shawyer’s EmDrive has been met with enthusiasm elsewhere, including the website NASASpaceFlight.com — where information about the most recent Eagleworks tests was first posted — and the popular science magazine New Scientist, which published a favorable and optimistic article on the EmDrive. (The editors later issued a statement that, despite enduring excitement over the idea, they should have shown more tact when writing on the controversial subject.)

Clearly, the EmDrive and RF resonant cavity thruster technology have a lot to prove. There’s no denying that the technology is exciting, and that the number of “successful” tests is intriguing, but one must keep in mind the physics standing in the EmDrive’s way and the rather curious lack of peer-reviewed studies on the subject. If the EmDrive is so groundbreaking (and works), surely people like Shawyer would be clamoring for peer-reviewed verification.

A demonstrably working EmDrive could open up exciting possibilities for both space and terrestrial travel — not to mention call into question our entire understanding of physics. However, until that comes to pass, it will remain nothing more than science fiction.

Read more: http://www.digitaltrends.com/cool-tech/emdrive-news-rumors/#ixzz41JSPv7jZ

February 11, 2016

Scientists find evidence of gravitational waves predicted by Einstein

Filed under: Big Bang, Black Holes, Cool, Cosmology, Gamma Ray Bursts — bferrari @ 11:54 am
File image – An image from a simulation showing how matter might be moved around in the extreme environment around a black hole. (Özel/Chan)

After decades of searching, scientists announced Thursday that they have detected gravitational waves, ripples in the fabric of space-time that Einstein predicted a century ago.

 

An international team of astrophysicists said that they detected the waves from the distant crash of two black holes, using a $1.1 billion instrument. The LIGO Collaboration was behind the discovery, which has been accepted for publication in the journal Physical Review Letters.

Related: Meteorite probably didn’t kill man in India, NASA says

“We have detected gravitational waves,” Caltech’s David H. Reitze, executive director of the LIGO Laboratory, told journalists at a news conference in Washington DC.

The news, according to the Associated Press, is being compared by at least one theorist to Galileo first taking up a telescope and looking at the planets, and called the biggest discovery since the Higgs particle. It has stunned the world of physics and astronomy, prompting scientists to say it marks the beginning of a new era in physics, one that could lead to scores more astrophysical discoveries and the exploration of the warped side of the universe.

“Our observation of gravitational waves accomplishes an ambitious goal set out over five decades ago to directly detect this elusive phenomenon and better understand the universe, and, fittingly, fulfills Einstein’s legacy on the 100th anniversary of his general theory of relativity,” Reitze said in a statement.

Related: Hundreds of hidden galaxies glimpsed behind Milky Way

The discovery confirms a major prediction of Albert Einstein’s 1915 general theory of relativity. Gravitational waves carry information about their dramatic origins and about the nature of gravity that cannot be obtained elsewhere.

Gravitational waves have not only fascinated scientists but also found their way into pop culture, namely through movies such as “Back to the Future,” where the space-time continuum served as the medium for the DeLorean time machine’s trips through time. They also featured in the “Terminator” series.

Their existence was first demonstrated in the 1970s and 1980s by Joseph Taylor, Jr., and colleagues. In 1974, Taylor and Russell Hulse discovered a binary system composed of a pulsar in orbit around a neutron star. Taylor and Joel M. Weisberg in 1982 found that the orbit of the pulsar was slowly shrinking over time because of the release of energy in the form of gravitational waves. For discovering the pulsar and showing that it would make possible this particular gravitational wave measurement, Hulse and Taylor were awarded the 1993 Nobel Prize in Physics.

Related: White House proposes $19 billion NASA budget

In the latest breakthrough, the gravitational waves were detected on Sept. 14, 2015 by both of the twin Laser Interferometer Gravitational-wave Observatory (LIGO) detectors, located in Livingston, Louisiana, and Hanford, Washington.

Based on the observed signals, LIGO scientists estimate that the black holes for this event were about 29 and 36 times the mass of the sun, and the event took place 1.3 billion years ago. About three times the mass of the Sun was converted into gravitational waves in a fraction of a second — with a peak power output about 50 times that of the whole visible universe.

By looking at the time of arrival of the signals — the detector in Livingston recorded the event 7 milliseconds before the detector in Hanford — scientists can say that the source was located in the Southern Hemisphere.
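The 7-millisecond figure can be unpacked with nothing more than the speed of light. The straight-line detector separation used below is an approximate value assumed for illustration; the arrival-time difference then fixes the angle between the source direction and the line joining the two detectors:

```python
import math

# Sketch: how a 7 ms arrival-time difference constrains sky position.
# The baseline below is an assumed approximate Hanford-Livingston
# straight-line separation, used only for illustration.
c = 299_792.458           # km/s, speed of light
baseline_km = 3002        # approx. detector separation (assumption)
dt = 0.007                # s, observed arrival-time difference

# Largest possible delay: source lying along the baseline itself.
max_dt = baseline_km / c
print(f"max possible delay: {max_dt * 1e3:.1f} ms")   # 10.0 ms

# The measured delay fixes the angle between baseline and source,
# tracing out a ring of possible positions on the sky.
theta = math.degrees(math.acos(c * dt / baseline_km))
print(f"angle from baseline: {theta:.0f} degrees")    # 46 degrees
```

A single detector pair only narrows the source to a ring on the sky, which is why knowing the delay’s sign (Livingston first) places the event in the Southern Hemisphere rather than at an exact point.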

Related: New star puts on a show in stunning image

According to general relativity, a pair of black holes orbiting around each other lose energy through the emission of gravitational waves, causing them to gradually approach each other over billions of years, and then much more quickly in the final minutes. In a final fraction of a second, the two black holes collide and form one massive black hole. A portion of their combined mass is converted to energy, according to Einstein’s formula E=mc², and this energy is emitted as a final strong burst of gravitational waves.

These are the gravitational waves that LIGO observed.
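The “three solar masses converted into gravitational waves” figure follows directly from E=mc², and a quick back-of-the-envelope calculation shows just how staggering the energy release was:

```python
# Back-of-the-envelope check of the energy radiated by the merger:
# roughly three solar masses converted via E = m * c^2.
M_sun = 1.989e30        # kg, mass of the Sun
c = 2.998e8             # m/s, speed of light

E = 3 * M_sun * c**2    # joules
print(f"energy radiated: {E:.2e} J")   # ~5.4e47 J

# For scale: the Sun's luminosity is ~3.8e26 W, so shining at full
# brightness it would need on the order of 1e13 years (far longer
# than the age of the universe) to radiate this much energy.
years = E / 3.8e26 / 3.156e7
print(f"Sun-equivalent time: {years:.1e} years")
```

Squeezing that energy into a fraction of a second is what produced a peak power output briefly exceeding that of all the stars in the visible universe combined.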

“With this discovery, we humans are embarking on a marvelous new quest: the quest to explore the warped side of the universe — objects and phenomena that are made from warped spacetime. Colliding black holes and gravitational waves are our first beautiful examples,” Caltech’s Kip Thorne said.

 

Source

February 1, 2016

Surprise! Monster Black Hole Found in Dwarf Galaxy

Filed under: Big Bang, Black Holes, Cool — bferrari @ 11:10 am

Size doesn’t matter…

This image shows a huge galaxy, M60, with the small dwarf galaxy that is expected to eventually merge with it. (NASA/Space Telescope Science Institute/European Space Agency)

Astronomers have just discovered the smallest known galaxy that harbors a huge, supermassive black hole at its core.

The relatively nearby dwarf galaxy may house a supermassive black hole at its heart equal in mass to about 21 million suns. The discovery suggests that supermassive black holes may be far more common than previously thought.

A supermassive black hole millions to billions of times the mass of the sun lies at the heart of nearly every large galaxy like the Milky Way. These monstrously huge black holes have existed since the infancy of the universe, some 800 million years or so after the Big Bang. Scientists are uncertain whether dwarf galaxies might also harbor supermassive black holes.

“Dwarf galaxies usually refer to any galaxy less than roughly one-fiftieth the brightness of the Milky Way,” said lead study author Anil Seth, an astronomer at the University of Utah in Salt Lake City. These galaxies span only several hundreds to thousands of light-years across, much smaller than the Milky Way’s 100,000-light-year diameter, and they “are much more abundant than galaxies like the Milky Way,” Seth said.

The researchers investigated a rarer kind of dwarf galaxy known as an ultra-compact dwarf galaxy; such galaxies are among the densest collections of stars in the universe. “These are found primarily in galaxy clusters, the cities of the universe,” Seth told Space.com.

This is an illustration of the supermassive black hole located in the middle of the very dense galaxy M60-UCD1. It weighs as much as 21 million times the mass of our Sun. Lying about 50 million light-years away, M60-UCD1 is a tiny galaxy with a diameter of 300 light-years — just 1/500th of the diameter of the Milky Way! Despite its size it is pretty crowded, containing some 140 million stars. Because no light can escape from the black hole, it appears simply in silhouette against the starry background. The black hole’s intense gravitational field warps the light of the background stars to form ring-like images just outside the dark edges of the black hole’s event horizon. (Combined observations by the NASA/ESA Hubble Space Telescope and NASA’s Gemini North telescope determined the presence of the black hole inside M60-UCD1.)

Now, Seth and his colleagues have discovered that an ultra-compact dwarf galaxy may possess a supermassive black hole, which would make it the smallest galaxy known to contain such a giant.

The astronomers investigated M60-UCD1, the brightest ultra-compact dwarf galaxy currently known, using the Gemini North 8-meter optical-and-infrared telescope on Hawaii’s Mauna Kea volcano and NASA’s Hubble Space Telescope. M60-UCD1 lies about 54 million light-years away from Earth. The dwarf galaxy orbits M60, one of the largest galaxies near the Milky Way, at a distance of only about 22,000 light-years from the larger galaxy’s center, “closer than the sun is to the center of the Milky Way,” Seth said.

The scientists calculated the size of the supermassive black hole that may lurk inside M60-UCD1 by analyzing the motions of the stars in that galaxy, which helped the researchers deduce the amount of mass needed to exert the gravitational field seen pulling on those stars. For instance, the stars at the center of M60-UCD1 zip at speeds of about 230,000 mph (370,000 km/h), much faster than stars would be expected to move in the absence of such a black hole.
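The logic of that mass estimate can be sketched with a simple order-of-magnitude relation: stars orbiting at speed v at radius r imply an enclosed mass of roughly M ~ v²r/G. The radius below is an assumed characteristic value chosen only to illustrate the scaling; the actual analysis fits the full distribution of stellar motions.

```python
# Order-of-magnitude sketch of a dynamical mass estimate:
# enclosed mass M ~ v^2 * r / G for stars orbiting at speed v
# at radius r. The radius is an assumption for illustration.
G = 6.674e-11            # m^3 kg^-1 s^-2, gravitational constant
M_sun = 1.989e30         # kg, mass of the Sun
ly = 9.461e15            # meters per light-year

v = 370_000 / 3.6        # 370,000 km/h converted to m/s
r = 25 * ly              # assumed radius of the black hole's influence

M = v**2 * r / G
print(f"enclosed mass: {M / M_sun:.1e} solar masses")   # ~1.9e7
```

With these assumed inputs the estimate lands right around the reported 21 million solar masses, showing how stellar speeds alone can betray an otherwise invisible central mass.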

The supermassive black hole at the core of the Milky Way has a mass of about 4 million suns, taking up less than 0.01 percent of the galaxy’s estimated total mass of about 50 billion suns. In comparison, the supermassive black hole that may lie in the core of M60-UCD1 appears to be five times more massive than the Milky Way’s, and seems to make up about 15 percent of the dwarf galaxy’s mass of about 140 million suns.

“That is pretty amazing, given that the Milky Way is 500 times larger and more than 1,000 times heavier than the dwarf galaxy M60-UCD1,” Seth said in a statement.

Astronomers have debated the nature of ultra-compact dwarf galaxies for years — whether they were extremely massive clusters of stars that were all born together, or whether they were the centers or nuclei of large galaxies that had their outer layers stripped away during collisions with other galaxies. These new findings hint that ultra-compact dwarf galaxies are the stripped nuclei of larger galaxies, because star clusters do not host supermassive black holes.

The researchers suggest M60-UCD1 was once a very large galaxy, with maybe 10 billion stars, “but then it passed very close to the center of an even larger galaxy, M60, and in that process, all the stars and dark matter in the outer part of the galaxy got torn away and became part of M60,” Seth said in a statement. “That was maybe as much as 10 billion years ago. We don’t know.”

Eventually, M60-UCD1 “may merge with the center of M60, which has a monster black hole in it, with 4.5 billion solar masses — more than 1,000 times bigger than the supermassive black hole in our galaxy,” Seth said in a statement. “When that happens, the black hole we found in M60-UCD1 will merge with that monster black hole.”

The astronomers suggest the way stars move in many other ultra-compact dwarf galaxies hints that they may host supermassive black holes, as well. All in all, the scientists suggest that ultra-compact dwarf galaxies could double the number of supermassive black holes known in the nearby regions of the universe. The researchers are participating in ongoing projects that may provide conclusive evidence for supermassive black holes in four other ultra-compact dwarfs.

The scientists detailed their findings in the Sept. 18 issue of the journal Nature.

Source

 

 

 
