SpaceJibe

April 30, 2012

Maybe black holes don’t really exist

Filed under: Big Bang, Cool, Gamma Ray Bursts, Weird — bferrari @ 5:48 pm
Artist's rendition of a Black Hole

On March 28, 2011, the Swift Burst Alert Telescope detected a gamma-ray event that, in contrast with any previously observed gamma-ray burst, remained bright and highly variable for 48 hours. The gamma-ray emission was accompanied by bright x-ray emission that continued for two weeks. Astrophysicists attributed this event to the tidal disruption of a star by a black hole in the center of a distant galaxy. I would argue, however, that it would have been more accurate to describe this event as the tidal disruption of a star by a compact object. This distinction is important because the black-hole model has serious problems. The March event lends support to a heretical idea: that black holes do not exist.

The brightness of the gamma-ray and x-ray emissions suggests they are coming from a jet of charged particles moving at nearly the speed of light, but there is no obvious reason why the tidal disruption of a star by a black hole should give rise to such a jet. In fact, the astrophysical community has been struggling to explain the observed ubiquity of jets. A leading idea is that, in the presence of an external magnetic field, electromagnetic energy is extracted from a rotating black hole and used to accelerate charged particles. The source of the field could be the disk of material swirling around the black hole. Yet disks do not generate magnetic fields with the right shape to produce well-collimated beams of particles.

More deeply, there are fundamental reasons why no compact object can be a black hole. The problem is that solutions of Einstein’s general-relativity equations that contain event horizons are inconsistent with quantum mechanics. For example, these spacetimes do not possess a universal time, which is required for quantum mechanics to make sense. Astrophysicists came to accept the idea of black holes because the gravitational collapse of sufficiently large masses cannot be stopped by ordinary means. But Pawel Mazur and I realized some time ago that quantum gravitational effects modify the collapse process.

Ordinary matter will be converted into vacuum energy when it is compacted to the point where general relativity predicts that an event horizon would begin to form. In contrast with ordinary mass-energy, vacuum energy is gravitationally repulsive, so it would act to stop the collapse and stabilize the object. At the surface of such objects, there is a transition layer between the large vacuum energy of the interior and the very small cosmological vacuum energy. In 2000 my colleagues and I suggested that this transition layer represents a continuous quantum phase transition of the vacuum. In 2003 George Musser wrote in Scientific American about the concept and suggested the name “crystal stars”. But I prefer the name “dark energy stars”.

Low-energy particles entering a dark energy star do not disappear, but follow a curved trajectory and emerge from the surface in much the same way that light does in a defocusing lens. On the other hand, the surface is opaque to elementary particles with energies exceeding a certain threshold. This is because near a continuous phase transition there are large fluctuations in the energy density, which in the case of a dark energy star occur in the vicinity of the surface. Because the quarks inside protons and neutrons have energies exceeding the threshold for opaqueness, protons and neutrons falling onto the surface of a dark energy star will decay into positrons, electrons, and gamma-rays. In fact, one can use quantum chromodynamics (QCD) to predict the energy spectrum of these decay products [4]. The result is that for both the leptons and the gamma-rays the spectrum extends up to energies of several MeV. Thus the model predicts that matter falling onto the surface of a dark energy star will produce high-speed electrons and positrons and gamma-rays. The March 28 Swift event is perhaps the clearest evidence to date of this process.

Dark energy stars can readily explain jets. Their angular momentum is carried by spacetime vortices concentrated near the axis of rotation (arXiv.org/abs/gr-qc/0407033). As a result, an external magnetic field will be wrapped around this vortex core in a barber-pole pattern. Injecting nucleon-decay electrons and positrons into a rotating dark energy star will therefore result in a highly collimated lepton jet, with a structure very similar to that seen in the jets emerging from the centers of many distant galaxies. What is unique about the March 28 Swift event is that, for the first time, we can see that the formation of this kind of jet is completely in accordance with what would be expected for a dark energy star.

We arrive at the following picture. When matter from a nearby star hits the surface of a dark energy star, it is instantaneously converted into gamma-rays, electrons and positrons, the majority of which have energies in the 100 keV to few MeV range. It takes about a minute for these particles to fill the interior of the compact object and form a jet. Because the gamma-rays can scatter off the magnetically guided positrons and electrons, a burst of gamma-rays directed along the axis of rotation will initially accompany the jet. After the supply of gamma-rays is exhausted, a beamed emission of x-rays will persist as long as the supply of electrons and positrons lasts.
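As a rough sanity check on that one-minute figure, here is a minimal Python sketch (mine, not Chapline's) of the light-crossing time for a compact object with the gravitational radius of 10^7 solar masses; the mass is an assumed round number, since the article does not state one.

# Light-crossing time of a compact object whose size is set by the
# gravitational (Schwarzschild) radius of an assumed 10^7 solar masses.
G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8         # speed of light, m/s
M_SUN = 1.989e30    # solar mass, kg

mass = 1e7 * M_SUN                  # assumed mass, not from the article
r_g = 2 * G * mass / C**2           # ~3e10 m
print(f"r_g = {r_g:.2e} m, crossing time = {r_g / C:.0f} s")
# ~100 s: filling an object of this size takes on the order of a minute,
# consistent with the article's estimate.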

I doubt that this event alone will dislodge black holes as the astrophysical community’s standard model for compact objects. On the other hand, the unique properties of the March 28 event, together with other ways the dark energy star theory might be tested in the near future, such as direct millimeter VLBI observations of the massive compact objects at the centers of our own and nearby galaxies, may soon allow the astrophysical community to see that black holes are really crystal stars.

About the Author: George Chapline is a theoretical physicist at the Lawrence Livermore National Laboratory. He led the team that demonstrated the first working x-ray laser, developed the concept of a “gossamer metal,” and has contributed to string theory.

Source

April 27, 2012

Astronomers find new planet capable of supporting life

Filed under: Cool, Exoplanets, Extraterrestrial Life, Gadgets, Life, Space Exploration — bferrari @ 8:34 pm

Astronomers have discovered their “holy grail” – a planet capable of supporting life outside our solar system.

New 'life in space' hope after billions of 'habitable planets' found in Milky Way (NASA)

9:41AM BST 27 Apr 2012
The planet lies in what they describe as a ‘habitable zone’: neither too near its sun, which would dry it out, nor too far away, which would freeze it.
And the discovery could help answer the question of whether we are alone in the universe, which has plagued astronomers and alien fanatics for years.
Scientists found the planet, Gliese 667Cc, orbiting a red dwarf star 22 light years from Earth.
Red dwarf stars are the most common stars in the neighbourhood of the Sun, and they usually host gas giants, planets which are not composed of rock.
Re-analysing data from the European Southern Observatory, the astronomers found Gliese 667Cc is a solid planet with roughly four and a half times the mass of Earth.
Scientists from the University of Göttingen and the University of California have calculated that the planet receives ten per cent less light from its red dwarf star than the Earth gets from the Sun.
As the light is in the infrared area, the planet still receives nearly the same amount of energy as the Earth, meaning water could be liquid and surface temperatures could be similar to ours.
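For the curious, the ‘ten per cent less light’ figure can be sanity-checked with the inverse-square law. The luminosity and orbital distance below are assumed illustrative values (the article quotes neither), so this is a ballpark check rather than the researchers’ calculation.

# Incident flux relative to Earth from the inverse-square law:
# F / F_earth = (L / L_sun) / (a / 1 AU)^2
L_STAR = 0.0137    # luminosity of Gliese 667C in solar units (assumed)
A_PLANET = 0.123   # orbital distance of Gliese 667Cc in AU (assumed)

relative_flux = L_STAR / A_PLANET**2
print(f"Incident flux relative to Earth: {relative_flux:.2f}")
# ~0.91, i.e. roughly ten per cent less light, as the article reports.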
Astronomers are hailing the planet as the ‘Holy Grail’ of discoveries, as 20 years ago scientists were still arguing about the existence of planets beyond our solar system.
Since the discovery of the first extrasolar planet in 1995, astronomers have confirmed the existence of more than 760 planets beyond the solar system, with only four believed to be in a habitable zone.
One of the most successful tools of planet hunters is the High Accuracy Radial velocity Planet Searcher (HARPS), an instrument which measures the radial velocity of a star.
Scientists using this instrument analyse the small wobbles in a star’s motion caused by the gravitational pull of a planet, determining the position and size of the planet indirectly.
Currently, they can detect planets which are 3-5 times the mass of the Earth but, in the future, they could detect planets which are smaller than twice the mass of Earth.
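Here is a short sketch of the wobble measurement described above, using the standard radial-velocity semi-amplitude formula for a circular, edge-on orbit. The stellar and planetary parameters are assumed ballpark values for the Gliese 667C system, not figures from the article.

import math

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30       # kg
M_EARTH = 5.972e24     # kg
DAY = 86400.0          # s

def rv_semi_amplitude(m_star_suns, m_planet_earths, period_days):
    """Stellar wobble speed (m/s) for a circular, edge-on orbit."""
    m_star = m_star_suns * M_SUN
    m_planet = m_planet_earths * M_EARTH
    period = period_days * DAY
    return (2 * math.pi * G / period) ** (1 / 3) * m_planet / m_star ** (2 / 3)

# Assumed values: a ~4.5 Earth-mass planet on a ~28-day orbit
# around a ~0.31 solar-mass red dwarf.
print(f"K = {rv_semi_amplitude(0.31, 4.5, 28.0):.2f} m/s")
# ~2 m/s, a tiny wobble, which is why metre-per-second precision
# instruments like HARPS are needed.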
Steven Vogt, an astronomer from the University of California, said: “It’s the Holy Grail of exo-planet research to find a planet orbiting around a star at the right distance so it’s not too close where it would lose all its water and not too far where it would freeze.
“It’s right there in the habitable zone – there’s no question or discussion about it. It is not on the edge. It is right in there.”
Guillem Anglada-Escudé, of the University of Göttingen, Germany, said: “With the advent of a new generation of instruments, researchers will be able to survey many dwarf stars for similar planets and eventually look for spectroscopic signatures of life in one of these worlds.”
Source

April 25, 2012

Apollo Computer Guidance Emulator on a Smart Phone

Filed under: Cool, Gadgets, Space Exploration, Space Ships — bferrari @ 8:51 am

Probably the coolest thing you’ll see today!

This Project

The purpose of this project is to provide a computer simulation of the onboard guidance computers used in the Apollo Program’s lunar missions, and to generally allow you to learn about these guidance computers.  Since this can be quite intimidating, we invite you to look at our “kinder and gentler” introductory page before immersing yourself in the full, gory detail presented by the bulk of the website.

The video clip above (courtesy of user Dean Koska and YouTube) illustrates some of the cute things you can do with Virtual AGC if you’re of a mind to do so.  Dean compiled our simulated AGC CPU to run on a Palm Centro—explaining that a regular Palm was too slow.  He created his own simulated display/keypad (DSKY), presumably aided by the developer info we provide for just such a desire.  (And sorry, Dean’s Palm port isn’t provided from our downloads page.  Dean has indicated that he may be able to provide it in the future, so if you want it you’ll just have to be patient.)

 

Source

April 23, 2012

Pentagon explains why hypersonic, Mach 20 drone failed

Filed under: Cool, Gadgets, Space Ships — bferrari @ 8:55 am

WASHINGTON – The Pentagon has finally released a report about what went wrong when its Hypersonic Technology Vehicle (HTV-2) failed just minutes into a test flight last year and barreled into the Pacific Ocean.
The unmanned, arrowhead-shaped aircraft, which one day could allow the US to strike anywhere across the globe in less than 60 minutes, was strapped to a rocket and launched from California’s Vandenberg Air Force Base last August.
The drone coasted at speeds of 13,000 mph (21,000 km/h) — 20 times the speed of sound — through the Earth’s atmosphere for less than three minutes before ultimately failing and switching into abort mode just nine minutes into the flight. It splashed down short of its intended target near the Kwajalein Atoll in the Pacific.
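For reference, the Mach 20 and 13,000 mph figures only line up once altitude is taken into account: the speed of sound drops as the air thins. A quick sketch, with an assumed high-altitude sound speed:

SOUND_SPEED_HIGH_ALT = 295.0  # m/s, typical for 11-20 km altitude (assumed)
MPS_TO_MPH = 2.23694

speed_mps = 20 * SOUND_SPEED_HIGH_ALT
print(f"Mach 20 at altitude: {speed_mps * MPS_TO_MPH:,.0f} mph")
# ~13,200 mph, consistent with the reported 13,000 mph.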
Defense Advanced Research Projects Agency (DARPA) said an analysis of the crash showed that high speeds peeled off larger-than-expected portions of the vehicle’s skin.
‘HTV-2’s first flight test corrected our models regarding aerodynamic design.’
– Air Force Maj. Chris Schulz
Officials anticipated some of the outer shell would gradually wear away, but rapidly-forming gaps on the skin created strong shock waves around the HTV-2 and caused it to roll abruptly, the report said.
Military researchers, however, were hopeful that they could learn from the mistakes of the failed flight, especially after the first HTV-2 mission in April 2010 — which also terminated early — prompted successful adjustments to the craft’s aerodynamic design.
“HTV-2’s first flight test corrected our models regarding aerodynamic design within this flight regime,” Air Force Maj. Chris Schulz, DARPA program manager, said in a statement. “We applied that data in flight test two, which ultimately led to stable aerodynamically controlled flight.”
Schulz added that data collected during the second test flight “revealed new knowledge about thermal-protective material properties and uncertainties” for flights at such a high speed in our atmosphere. Going forward, that data will be used to modify how the vehicle’s outer shell responds to heat stress, DARPA said.

Source

April 22, 2012

Pentagon releases results of 13,000-mph test flight over Pacific

Filed under: Cool, Gadgets, Military, Space Ships — bferrari @ 6:39 pm
An artist's rendering of the Falcon Hypersonic Technology Vehicle 2. (Defense Advanced Research Projects Agency / April 20, 2012)

By W.J. Hennigan
April 20, 2012, 5:45 p.m.
The results are in from last summer’s attempt to test new technology that would provide the Pentagon with a lightning-fast vehicle, capable of delivering a military strike anywhere in the world in less than an hour.

In August the Pentagon’s research arm, known as the Defense Advanced Research Projects Agency, or DARPA, carried out a test flight of an experimental aircraft capable of traveling at 20 times the speed of sound.

The arrowhead-shaped unmanned aircraft, dubbed Falcon Hypersonic Technology Vehicle 2, blasted off from Vandenberg Air Force Base, northwest of Santa Barbara, into the upper reaches of the Earth’s atmosphere aboard an eight-story Minotaur IV rocket made by Orbital Sciences Corp.

After reaching an undisclosed altitude, the aircraft jettisoned from its protective cover atop the rocket, then nose-dived back toward Earth, leveled out and glided above the Pacific at 20 times the speed of sound, or Mach 20.

The plan was for the Falcon to speed westward for about 30 minutes before plunging into the ocean near Kwajalein Atoll, about 4,000 miles from Vandenberg.

But the flight ended about nine minutes in, for reasons unknown at the time. The launch had received worldwide attention and much fanfare, but officials didn’t provide much information on why it failed.

On Friday, DARPA said in a statement that the searing high speeds caused portions of the Falcon’s skin to peel from the aerostructure. The resulting gaps created strong shock waves around the vehicle as it traveled nearly 13,000 mph, causing it to roll abruptly.

The Falcon, which is built by Lockheed Martin Corp., is made of durable carbon composite material, which was expected to keep the aircraft’s crucial internal electronics and avionics — only a few inches away from the surface — safe from the fiery hypersonic flight. Surface temperatures on the Falcon were expected to reach more than 3,500 degrees, hot enough to melt steel.

“The initial shock wave disturbances experienced during second flight, from which the vehicle was able to recover and continue controlled flight, exceeded by more than 100 times what the vehicle was designed to withstand,” DARPA Acting Director Kaigham J. Gabriel said in a statement. “That’s a major validation that we’re advancing our understanding of aerodynamic control for hypersonic flight.”

The flight successfully demonstrated stable aerodynamically controlled flight at speeds up to Mach 20 for nearly three minutes.

Sustaining hypersonic flight has been an extremely difficult task for aeronautical engineers over the years. While supersonic means that an object is traveling faster than the speed of sound, or Mach 1, “hypersonic” refers to an aircraft going five times that speed or more.

The Falcon hit Mach 20. At that speed, an aircraft could zoom from Los Angeles to New York in less than 12 minutes — 22 times faster than a commercial airliner.
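That coast-to-coast claim is easy to check; the great-circle distance and airliner speed below are assumed round figures.

DISTANCE_MILES = 2450.0   # LA to NYC, approximate great-circle distance
FALCON_MPH = 13000.0
AIRLINER_MPH = 590.0      # typical commercial cruise speed (assumed)

falcon_minutes = DISTANCE_MILES / FALCON_MPH * 60
print(f"Falcon: {falcon_minutes:.1f} minutes")                    # ~11.3
print(f"Speedup: {FALCON_MPH / AIRLINER_MPH:.0f}x an airliner")   # ~22x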

The August launch was the second flight of the Falcon technology. The first flight, which took place in April 2010, also ended prematurely with only nine minutes of flight time.

There aren’t any more flights scheduled for the Falcon program, which began in 2003 and cost taxpayers about $320 million.

Source

April 21, 2012

Cosmic ray source mystery now even more mysterious

Filed under: Big Bang, Cool, Cosmology, Gamma Ray Bursts, Supernova, Weird — bferrari @ 10:18 am

Eggheads stumped after killer gamma rays ruled out. Probably

Boffins are now even more puzzled about where high-energy cosmic rays come from after a new study showed that gamma ray bursts are probably not to blame.

Cosmic rays hitting Earth. (NSF/J. Yang)

Astroboffins only had two theories about what causes cosmic rays, which regularly penetrate Earth’s atmosphere: huge explosions out in space or supermassive black holes.

Now an international group, made up of no fewer than 250 physicists and engineers, says that the suspected gamma radiation bursts are unlikely to be the source for cosmic rays because they couldn’t find any neutrinos emitted from the mother-of-all space bangs they observed.

Cosmic rays are electrically charged subatomic particles with energies up to one hundred million times those created in manmade accelerators such as the Large Hadron Collider. A gamma ray, meanwhile, is high-energy electromagnetic radiation that’s harmful to life.

Using the IceCube Neutrino Observatory at the South Pole in Antarctica, the boffins watched 300 gamma ray bursts (GRBs) while searching for the neutrinos that are believed to be linked with cosmic ray generation, and found none.

“The result of this neutrino search is significant because for the first time we have an instrument with sufficient sensitivity to open a new window on cosmic ray production and the interior processes of GRBs,” said IceCube spokesperson and University of Maryland physics professor Greg Sullivan in a canned statement.

“The unexpected absence of neutrinos from GRBs has forced a re-evaluation of the theory for production of cosmic rays and neutrinos in a GRB fireball and possibly the theory that high energy cosmic rays are generated in fireballs.”
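To see why a null result carries so much weight, note that rare detections follow Poisson statistics: if a model predicts a mean of mu events and you observe zero, the model survives with probability exp(-mu). A minimal sketch (the interpretation at the end assumes negligible background, which is my assumption, not a detail from the article):

import math

def poisson_upper_limit(confidence=0.90):
    """Upper limit on the Poisson mean when zero events are observed."""
    return -math.log(1.0 - confidence)

print(f"90% CL upper limit: {poisson_upper_limit():.2f} expected events")
# ~2.30: any model predicting more than ~2.3 neutrinos from the 300
# observed GRBs is disfavored at 90% confidence by a null search.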

IceCube neutrino detector

The IceCube observatory spots neutrinos by the faint blue light they produce when they interact with ice. It’s basically a cubic kilometre of glacial ice equipped with more than 5,000 optical sensors.

This is the best way to “see” neutrinos because they can easily pass through other matter, including people or the whole planet, without leaving a trace.

Of course, being scientists, this study’s boffins aren’t willing to completely bin the GRB theory yet – instead they’re getting the IceCube detector to collect more data first.

“Although we have not discovered where cosmic rays come from, we have taken a major step towards ruling out one of the leading predictions,” said IceCube’s principal investigator Francis Halzen of the University of Wisconsin.

Source

April 11, 2012

Amazing photo captures robot cargo ship’s Space Station arrival

Filed under: Cool, Earth, Gadgets, Inner Solar System, Space Ships — bferrari @ 2:49 pm
Mar. 28, 2012: The European Space Agency's Automated Transfer Vehicle-3 (ATV-3) approaches the International Space Station. (NASA)

Astronauts aboard the International Space Station captured an extraordinary photo of an unmanned European cargo ship as it docked to the orbiting outpost last week.

The European Space Agency’s third Automated Transfer Vehicle (ATV-3) launched into orbit on March 23, and arrived at the space station five days later, on March 28. The robotic cargo ship delivered about 7 tons of supplies, including water, oxygen, food, clothing, experiments and propellant.

The robotic ATV vehicles are designed to automatically dock to the space station. In this photo, the ATV-3 is approaching its parking spot at the Zvezda service module on the Russian segment of the orbiting complex.

The photo shows the ATV-3’s distinctive x-wing solar arrays bathed in light from the spacecraft’s sophisticated laser guidance system. The starry night sky and the glow of lights from Earth below make up the remarkable backdrop, as the two vehicles fly 240 miles (386 kilometers) above the planet.
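As an aside, the quoted 240-mile altitude fixes how fast the pair are moving. A minimal sketch using the circular-orbit relation v = sqrt(GM/r):

import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_EARTH = 5.972e24   # mass of Earth, kg
R_EARTH = 6.371e6    # mean radius of Earth, m
ALTITUDE = 386e3     # 240 miles, as quoted in the article, in m

r = R_EARTH + ALTITUDE
v = math.sqrt(G * M_EARTH / r)          # circular orbital speed
period_min = 2 * math.pi * r / v / 60   # orbital period
print(f"Orbital speed: {v/1000:.1f} km/s, period: {period_min:.0f} min")
# ~7.7 km/s and ~92 minutes, the familiar ISS numbers.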

American astronaut Don Pettit shared a version of the docking photo on Twitter the following day.

“ATV docks, breathing fire and bringing good stuff,” Pettit wrote under the name @astro_Pettit.

The cylindrical spacecraft is 35 feet (10.7 meters) long and 14.7 feet (4.5 meters) wide. The 13-ton cargo freighter is disposable and will remain attached to the space station for up to six months before it is loaded with garbage and sent to deliberately burn up as it re-enters Earth’s atmosphere.

While NASA reported a power issue on the ATV-3 last weekend and a communications glitch on Thursday (April 5), both issues have since been resolved, and all systems are functioning normally, agency officials confirmed.

The European ATV vehicles are typically named after prominent historical figures in astronomy or space exploration. The ATV-3 is named “Edoardo Amaldi,” after the famed Italian physicist who helped create the European Space Agency.

Source

April 6, 2012

Atom smasher collides particles at record energies

Filed under: Big Bang, Black Holes, Cool, Gadgets — bferrari @ 10:20 am
A simulation of a particle collision inside the Large Hadron Collider, the world's largest particle accelerator near Geneva, Switzerland. When two protons collide inside the machine, they create an energetic explosion that gives rise to new and exotic particles.

Physicists have started running the world’s largest particle accelerator at a new record energy and taking the first data from these ultra-powerful collisions.
Protons zoom around the 17 mile (27 kilometer) underground loop of the Large Hadron Collider below Switzerland and France, and then crash into each other, dissolving into new and sometimes exotic particles. Scientists have now sped up those protons a bit more, sending them speeding toward each other at energies of 4 teraelectron volts (TeV), creating a collision energy of 8 TeV — a new world record.
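Two details worth unpacking: for head-on beams of equal energy the collision (center-of-mass) energy is simply the sum, and at 4 TeV each proton is astonishingly close to light speed. A minimal sketch:

PROTON_REST_ENERGY_GEV = 0.938  # proton rest energy in GeV
beam_energy_gev = 4000.0

cm_energy_tev = 2 * beam_energy_gev / 1000   # head-on, equal-energy beams
gamma = beam_energy_gev / PROTON_REST_ENERGY_GEV
beta = (1 - 1 / gamma**2) ** 0.5             # speed as a fraction of c

print(f"Collision energy: {cm_energy_tev:.0f} TeV")
print(f"gamma ~ {gamma:.0f}, v/c ~ {beta:.9f}")
# gamma ~ 4264; the protons travel at about 0.999999973 c.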
“The increase in energy is all about maximizing the discovery potential of the LHC,” Sergio Bertolucci, director of research at the LHC’s home lab CERN, said in a statement. “And in that respect, 2012 looks set to be a vintage year for particle physics.”
‘The increase in energy is all about maximizing the discovery potential of the LHC.’
– Sergio Bertolucci, director of research
Ramping up to higher energies means the LHC has a better chance of creating the rare and highly sought particles it was designed to search for. These include the long-theorized, but not-yet-detected Higgs boson particle, as well as the particles predicted by a physics theory called supersymmetry. If supersymmetric particles are discovered, they may offer an explanation for the mystery of dark matter, the invisible stuff thought to make up most of the matter in the universe.
The LHC opened in September 2008, but shut off nine days later after an accident damaged a number of its superconducting magnets. The accelerator was repaired and got back up and running a little over a year later, and has been operating steadily since. Starting in March 2010, the proton beams collided at energies of 3.5 TeV each, creating a smash of 7 TeV.
“The experience of two good years of running at 3.5 TeV per beam gave us the confidence to increase the energy for this year without any significant risk to the machine,” said Steve Myers, CERN’s director for accelerators and technology. “Now it’s over to the experiments to make the best of the increased discovery potential we’re delivering them!”
The increased energy should mean more Higgs boson particles are produced, if they exist. Already, scientists at two of the LHC’s experiments, ATLAS and CMS, have seen promising indications of an excess of particles weighing around 125 GeV (gigaelectron volts) — potentially a sign of the Higgs. Yet physicists say they don’t have enough data to confirm a discovery with certainty.
The increased energy should up the chances of creating Higgs particles inside the machine, though it will also create more of the “background” particles that produce similar signatures, and must be weeded out from the data.
Ultimately, scientists plan to run particle beams through the LHC at an astounding 7 TeV each, producing collisions of a whopping 14 TeV. To do that, they’ll need to refurbish the accelerator during a planned shutdown at the end of this year.

Source

April 3, 2012

Decommissioning the Space Shuttles

Filed under: Cool, Gadgets, Government Policies, Military, Space Ships, Stupidity — bferrari @ 7:51 am

Starting next month, NASA will begin delivering its four Space Shuttle orbiters to their final destinations. After an extensive decommissioning process, the fleet — which includes three former working spacecraft and one test orbiter — is nearly ready for public display. On April 17, the shuttle Discovery will be attached to a modified 747 Jumbo Jet for transport to the Smithsonian’s National Air and Space Museum in Virginia. Endeavour will go to Los Angeles in mid-September, and in early 2013, Atlantis will take its place on permanent display at Florida’s Kennedy Space Center. Test orbiter Enterprise will fly to New York City next month. Gathered here are images of NASA’s final days spent processing the Space Shuttle fleet.

In Orbiter Processing Facility-2 at NASA's Kennedy Space Center in Florida, the flight deck of space shuttle Atlantis is illuminated one last time during preparations to power down Atlantis during Space Shuttle Program transition and retirement activities, on December 22, 2011. Atlantis is being prepared for public display in 2013 at the Kennedy Space Center Visitor Complex. (NASA/Jim Grossmann)

Inside NASA's Orbiter Processing Facility-1 in Florida, among hundreds of signatures, technicians transfer seats to the middeck of space shuttle Discovery for installation, on February 14, 2012. (NASA/Jim Grossmann)

 

In Orbiter Processing Facility-2, Lord Stanley's Cup sits in the flight deck of space shuttle Atlantis, on January 18, 2012. The Stanley Cup was awarded to the Boston Bruins after winning the 2011 National Hockey League Championship. Jeremy Jacobs, chairman and chief executive officer of Delaware North Companies and owner of the Boston Bruins, had brought the cup to Florida for Kennedy and Delaware North employees to view and take photographs. (NASA/Kim Shiflett)

 

Space shuttles Discovery and Endeavour stop outside Orbiter Processing Facility-3 (OPF-3) for a unique photo opportunity, on August 11, 2011. (NASA/Jim Grossmann)

 

See the other 35 images here.

IBM to study mysteries of Big Bang with world’s biggest telescope

Radio dishes that make up one part of South Africa's KAT-7 array, a prototype for its 64-dish "MeerKAT" telescope, a precursor to the full SKA. (SKA)

A telescope so massive that it spans a continent won’t be any better than a pair of binoculars unless you can find a way to carry and sift through its data.

The Square Kilometer Array (SKA) is planet Earth’s next big science project. It won’t be operational until the next decade, and the group planning it hasn’t even settled on a location yet, shortlisting South Africa and Australia. But they’ve already hit a problem: It will generate on its own as much data as the entire Internet carries on a regular day.
What do you do with 1 billion gigabytes of data?

IBM on Monday announced Project Dome, a €32.9 million, five-year collaboration with Astron, the Netherlands Institute for Radio Astronomy, toward “exascale” computing. The pair will work together to design new tech to analyze data from the massive telescope, which will effectively double the amount of data carried on the Internet. And they’re in a rush to find the answers, said Ronald Luijten of IBM Research in Zurich.

“We only have four years before we have to begin building the hardware,” he told FoxNews.com. “We know what needs to be done, we know how it needs to be done. But today we don’t know how to build it in an economic way.”
Gulp.

The technical challenges faced by the immense science project are mindboggling. After all, it will create 1 exabyte of data every day — 1 million terabytes, or 10^18 bytes, enough raw data to fill 15 million 64GB iPods. How to carry it? How to sift through it? Where to store it?
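The iPod figure checks out; a quick sketch of the arithmetic:

EXABYTE = 10**18          # bytes
TERABYTE = 10**12         # bytes
IPOD_BYTES = 64 * 10**9   # a 64GB iPod

print(f"{EXABYTE / TERABYTE:,.0f} terabytes")             # 1,000,000
print(f"{EXABYTE / IPOD_BYTES / 1e6:.1f} million iPods")  # ~15.6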

“If you take the current global daily Internet traffic and multiply it by two, you are in the range of the data set that the Square Kilometer Array radio telescope will be collecting every day,” said Ton Engbersen, IBM Research — Zurich. “This is Big Data analytics to the extreme.”

The telescope will generate that data from thousands of receptors, spaced roughly 1 kilometer apart and linked across an entire continent. They’ll be arranged in five spiral arms like a galaxy: 3,000 50-foot-wide dishes that extend from a central core out to at least 1,860 miles (3,000 kilometers) — about the distance from New York City to Albuquerque, N.M.
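As a back-of-the-envelope aside, the dishes alone do not add up to the square kilometre of the name; the SKA counts other receptor types toward its total collecting area. A minimal sketch, assuming the 50-foot dishes are roughly 15 meters across:

import math

N_DISHES = 3000
DISH_DIAMETER_M = 15.0  # ~50 feet (assumed conversion)

area_m2 = N_DISHES * math.pi * (DISH_DIAMETER_M / 2) ** 2
print(f"Dish collecting area: {area_m2/1e6:.2f} km^2")  # ~0.53 km^2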

To carry around the staggering amount of information this beast will generate, IBM will likely turn to fiber optic cables and advanced photonics, Luijten said.

“We’re going to look to see if photonics can be used in a much more innovative way. Electromagnetic waves must be converted to an optical signal for transmission to fibers. Can we sample those electromagnetic waves directly in optical form?” he told FoxNews.com.
Moving data around is one challenge; powering up radio dishes set in the remote reaches of Australia or Africa, where there are neither data nor power lines, is another problem.

“It can’t be so expensive that no one can afford to turn on the instrument,” he joked. And off-the-shelf computer chips — which are often lumped together by the hundreds or even thousands to power supercomputers — simply won’t work.
“We don’t expect that with commodity CPUs we can actually do a solution that will be good enough from a power viewpoint,” Luijten told FoxNews.com. One solution may lie in dedicated signal processors, chips from companies like nVidia designed initially for graphics, but very effective at math.

The other solution is so simple, you have to wonder why no one thought of it already: Build upwards.
“We can stack almost a hundred of these chips on top of each other,” Chris Sciacca, a spokesman for IBM Zurich, told FoxNews.com.
“Ninety-eight percent of energy is consumed in moving data within high-end servers today,” Sciacca said. One way to solve that challenge is moving from a 2D to a 3D world, putting chips not next to each other on a circuit board, but stacking them like an Oreo cookie.

The challenge of Big Data is common to science; in fact, it’s a problem already addressed by one of the world’s other biggest science projects. The Large Hadron Collider (LHC), a giant atom smasher run by CERN in Europe, is already operating — and creates tremendous rafts of data. It’s not quite the same thing, Luijten said.

“At CERN, they create many big bangs within their proton accelerator … [whereas] the SKA guys are looking at the real thing,” Luijten pointed out. But both science projects faced similar challenges. Why not reuse CERN’s solution?
“They have a similar type of issue,” he admitted. “But, for instance, the amount of data produced at CERN is 15 petabytes per year, or 10 to 100 times less than what the SKA will produce.”

The LHC also operates in a convenient, 17-mile ring in Switzerland. The SKA will be spread across an entire continent to boost precision and sensitivity.

“They don’t have this global networking issue,” Luijten told FoxNews.com.

If everything comes together, the project has incredible potential, helping scientists gain a fundamental understanding of what happened at the Big Bang, about 13.7 billion years ago. CERN’s atom smasher aims to recreate those conditions. The SKA aims to study them: The signals created at the Big Bang are still travelling at light speed in the universe, after all.

“This is a very exciting opportunity — probably a once-in-a-lifetime opportunity. It will have a fundamental impact on what we understand about how the universe was formed,” he said.

Source
