
Newswire

September 8, 2012

 

With military at ‘turning point,’ defense chief Leon Panetta avoids bold moves

Washington Post

By Greg Jaffe, Published: September 3

For most of the past year, Defense Secretary Leon E. Panetta has stressed that the vast military complex over which he presides is at a “strategic turning point.”

A decade of grinding guerrilla war is drawing to a close. Defense budgets are shrinking. The implication is that major changes are coming to the military.

So far, however, Panetta has cut few major weapons programs and steered clear of any bold moves aimed at remaking the military for this new era. The watchword for Panetta’s tenure, senior defense officials said, has been “humble.”

“He’s told the service chiefs to be humble in their predictions of warfare,” one senior official said.

Panetta’s approach reflects a management style that throughout his career has placed a premium on consensus over major reforms and collegiality over bold thinking, said officials who have worked with him. “He has always run a happy, productive shop,” said former ambassador James Dobbins, who worked closely with Panetta in the Clinton White House.

Because he has not spent his career in the national security realm, Panetta has tended to rely more heavily on the Pentagon’s top generals for advice than his predecessors did, senior military officials said.

In an interview describing his defense strategy, Panetta said he has helped craft an approach that hedges bets against a range of potential enemies. “It really does provide maximum flexibility,” he said. “The military is going to be smaller, but it is going to be more agile, more flexible and more deployable so that it moves fast and stays on the cutting edge of technology.”

Panetta’s vision is notable for some of the big questions left unanswered. A highly touted promise to shift the military’s focus to Asia has produced little in the way of major new deployments. Nine months after it was unveiled, there is scant evidence of how it will be implemented.

“This is a time when you would expect an intense focus on where we want to go and what we want to be,” said Andrew Hoehn, a senior vice president at the Rand Corp. and a former Pentagon strategist. Hoehn said such a debate does not appear to be happening inside the Pentagon or in the presidential campaigns, which have largely ignored national security issues.

Although the war in Iraq has ended and troops are being withdrawn from Afghanistan, Panetta has not pressed the ground forces to conduct a tough and detailed examination of their performance in the two long and costly wars, said Eliot Cohen, a military historian at Johns Hopkins University and an adviser to Mitt Romney’s campaign.

In recent years, Army and Marine Corps officers have tended to blame their struggles on the State Department and other federal agencies, which were unable to provide the necessary help to rebuild the war-torn countries’ governments and economies.

Cohen said the finger-pointing has prevented the ground services from acknowledging their own shortcomings, such as their inability to produce a core of experts in the culture, politics, history and languages of the two countries where they have spent most of the past decade fighting.

Panetta said he would like to see the military do more in this area. “I think we have to look at the lessons that we draw, particularly from these last 10 years of war,” he said. “I’m not satisfied. I think more needs to be done.”

The Obama administration’s defense strategy, meanwhile, plays down the likelihood of the military fighting major counterinsurgency wars in the coming years. To that end, Panetta has ordered the Army to shrink to about 490,000 soldiers by 2017, a reduction of about 80,000 that will leave the force slightly larger than it was before Sept. 11, 2001.

A leadership contrast

A surprise pick to run the CIA in 2009, Panetta had spent most of his career as a congressman from California and a deficit hawk in the Clinton administration, including a stint as White House chief of staff.

Even after 2½ years at the CIA and 14 months at the Pentagon, Panetta’s speeches tend to steer clear of the kinds of detailed policy prescriptions and tough questions that were routine under Defense Secretary Robert M. Gates, his immediate predecessor.

“Do we really need 11 [aircraft] carrier strike groups for another 30 years when no other country has more than one?” Gates asked a Navy audience in 2010. He also challenged the Marines to consider whether, in an era of increasingly precise cruise missiles, they would be called upon again to storm an enemy’s shore — a question that cuts to the core of the Marines’ identity. Gates’s goal was to encourage lower-ranking officers to challenge military pieties.

By contrast, Panetta sometimes sounds more like a congressman representing the “Pentagon district” than the leader of the world’s largest military. He talks frequently about his parents, who immigrated from Italy. And he regularly rails against the possibility that the Pentagon will have to absorb $500 billion in automatic cuts if Congress cannot agree on how to trim $1.2 trillion in government spending. The cuts, triggered under an arcane process known as sequestration, would come on top of an already mandated $487 billion in reductions.

“It’s mindless, and it will . . . do incredible damage to our national defense,” Panetta said last month in a speech in New York.

As he did during his days as a congressman, Panetta spends most weekends in California, commuting home on a military jet at a cost of more than $800,000 as of this spring, the latest figures available.

In the interview, Panetta played down the value of speeches that question the military’s most prized programs or press its officers to embrace new or unpopular ideas.

“I don’t think it’s smart to challenge the services publicly,” he said. “My style has always been to basically work with a team and have everybody feel part of the team.”

Generals approve

The Pentagon’s top generals say they appreciate Panetta’s less-confrontational style.

“Gates was trying to challenge all of us,” Gen. Ray Odierno, the Army’s chief of staff, said in an interview. “Secretary Panetta just operates differently. He has been very transparent with the chiefs and allowed us to participate much more than we have in the past.”

Panetta’s backers note that he is leading the Pentagon at a different time than Gates, who presided over a growing defense budget and could afford to raise tough questions without worrying that they would be used as fodder for cuts.

Panetta also faces a greater array of threats than did Gates, whose time in the Pentagon was dominated by Iraq and Afghanistan.

The current list of crises stretches from growing unrest in Syria and Iran’s nuclear ambitions to a new leader in North Korea and rising tensions between China and its neighbors around the South China Sea. Panetta is also overseeing the withdrawal of U.S. forces from Afghanistan amid unrelenting Taliban attacks.

“For the first time in a decade, the urgent priority mission is not staring us in the face,” said Michele Flournoy, who recently stepped down as the Pentagon’s top policy official.

Panetta acknowledged the challenge of confronting myriad threats in an era of diminished resources. But he rejected criticism from those who say he lacks a vision or hasn’t pressed hard enough for change.

“What kind of world are we going to be dealing with in the future?” he asked. “I think it’s going to look a lot like what we’re looking at now.”

 

ORS Office organizing three new programs

by Michael P. Kleiman

377th Air Base Wing Public Affairs

 

8/30/2012 – KIRTLAND AIR FORCE BASE, N.M. — Perhaps the Operationally Responsive Space Office should consider changing its motto of “Employ. Deploy. Develop” to “Reaching the Reality of Rapid Response.”

In the past 14 months, the organization has seen success with its initial satellite, ORS-1, which transitioned to operational use by U.S. Central Command in January, approximately six months after launch.

It also funded the liftoff and first 180 days of flight of the experimental Tactical Satellite-4, slated for transfer to U.S. Strategic Command less than one year after reaching orbit.

Not resting on its laurels, the ORS Office has implemented an aggressive strategy in designing, developing, launching and operating low-cost, rapid-reaction spacecraft through the ORS-2, ORS-3 and ORS-4 programs.

“Although ORS-1’s and TacSat-4’s operational performance in providing joint force commanders with enhanced battlespace awareness has exceeded expectations, our team’s focus has remained on producing affordable and quickly assembled and launched (within days to a week after receiving the warfighter’s request) satellites,” said Dr. Peter Wegner, ORS Office director. “Our three current programs (ORS-2, ORS-3 and ORS-4) will make huge strides in rapid response becoming a reality.”

Currently in development, ORS-2 will consist of a modular, rapidly configurable bus employing plug-and-play technology, as well as radar and electronic tactical support payloads.

The satellite, with a projected $20 million mission cost, will operate in a 45-degree inclination orbit 311 miles above the Earth.

Much of the ORS-2 program schedule has yet to be determined, but delivery of the modular bus has been planned for summer 2012.

Like its predecessor, ORS-2 will provide enhanced battlespace awareness to the JFC during its planned one-year mission.

On the other hand, project specifics for ORS-3 have been decided. Planned to launch from the Mid-Atlantic Regional Spaceport at NASA’s Wallops Flight Facility, Wallops Island, Va., in August 2013, the satellite will feature a primary payload, the Air Force Space and Missile Systems Center’s Space Test Program Satellite-3, plus 27 additional experiments comprising free-flying systems and non-separating components. ORS-3 will employ CubeSat wafers, which enable secondary payloads to take advantage of excess lift capacity beyond what the primary mission requires.

In addition, the third ORS mission, costing $27 million, will demonstrate the launch vehicle flight safety architecture of the future through the Autonomous Flight Safety System payload, which uses launch vehicle orbital targeting and range safety planning processes to protect the public from an errant launch vehicle during flight.

“I am really excited about the Autonomous Flight Safety System payload onboard ORS-3 because the apparatus will allow for real-time range safety calculations. For example, if the launch vehicle encounters an emergency immediately following liftoff, the AFSS will assist in immediately aborting the rocket, which ultimately prevents loss of life and property,” Wegner said. “Another unique aspect of ORS-3 is the 27 secondary experiments consisting of both separating (free-flying) and non-separating. Several universities and businesses have expressed interest in participating in the ORS-3 flight, which will use an integrated payload stack.”
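To make the AFSS concept concrete, the core range-safety check can be pictured as a simple comparison of the vehicle’s tracked deviation against a pre-approved flight corridor, with a terminate command issued the moment a powered vehicle strays outside it. The following sketch is a hypothetical simplification; the threshold, inputs and logic are invented for illustration and are not ORS-3’s actual design.

public class RangeSafetySketch {
    // Hypothetical corridor half-width in meters, not an ORS-3 figure
    static final double MAX_CROSSTRACK_M = 5000.0;

    // Terminate only while the vehicle is under power and outside the corridor
    static boolean shouldTerminate(double crosstrackErrorM, boolean thrusting) {
        return thrusting && Math.abs(crosstrackErrorM) > MAX_CROSSTRACK_M;
    }

    public static void main(String[] args) {
        System.out.println(shouldTerminate(6200.0, true));  // true: abort
        System.out.println(shouldTerminate(1300.0, true));  // false: continue flight
    }
}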

Finally, the fourth ORS mission also has a definitive timetable with most pre-liftoff milestone dates established and the launch window no earlier than September 2013.

A three-stage sounding rocket known as Super Strypi will deliver to orbit an integrated payload stack of approximately 419 pounds. Launched from the U.S. Navy’s Pacific Missile Range Facility, Barking Sands, near Kekaha, Kauai County, Hawaii, the satellite will operate in a 97-degree orbit approximately 280 to 326 miles above the planet.

As on ORS-3, onboard payloads will include the AFSS and CubeSat wafers, which will accommodate an undetermined number of secondary experiments.

With a $10 million to $12 million price tag, ORS-4 will operate with reduced infrastructure comprising the AFSS, Global Positioning System metric tracking, space-based telemetry relay, and automated flight planning.

“Since its inception in May 2007, the ORS Office has been committed 24/7/365 to providing space power focused on the timely satisfaction of Joint Force Commanders’ needs. Both ORS-1 and TacSat-4 have been resounding successes for us to build upon in achieving rapid, responsive and affordable space flight,” said Dr. Wegner. “I believe the next three ORS missions will enable us to accomplish that milestone. We are prepared to do so.”

 

Researchers find critical vulnerability in Java 7 patch hours after release

 

The new vulnerability allows a complete Java Virtual Machine sandbox escape in Java 7 Update 7, researchers from Security Explorations say

By Lucian Constantin

August 31, 2012 12:08 PM ET

IDG News Service – Security researchers from Poland-based security firm Security Explorations claim to have discovered a vulnerability in the Java 7 security update released Thursday that can be exploited to escape the Java sandbox and execute arbitrary code on the underlying system.

 

Security Explorations sent a report about the vulnerability to Oracle on Friday, together with a proof-of-concept exploit, Adam Gowdiak, the security company’s founder and CEO, said Friday via email.

The company doesn’t plan to release any technical details about the vulnerability publicly until Oracle addresses it, Gowdiak said.

Oracle broke out of its regular four-month patching cycle on Thursday to release Java 7 Update 7, an emergency security update that addressed three vulnerabilities, including two that attackers had been exploiting since the previous week to infect computers with malware.

Java 7 Update 7 also patched a “security-in-depth issue” which, according to Oracle, was not directly exploitable, but could have been used to aggravate the impact of other vulnerabilities.

The patching of that “security-in-depth issue,” which Gowdiak calls an “exploitation vector,” rendered ineffective all of the proof-of-concept (PoC) Java Virtual Machine (JVM) security bypass exploits previously submitted to Oracle by the Polish security firm.

According to Gowdiak, Security Explorations privately reported 29 vulnerabilities in Java 7 to Oracle back in April, including the two that are now actively exploited by attackers.

The reports were accompanied by a total of 16 proof-of-concept exploits that combined those vulnerabilities to fully bypass the Java sandbox and execute arbitrary code on the underlying system.

The removal of the getField and getMethod methods from the implementation of the sun.awt.SunToolkit class in Java 7 Update 7 disabled all of Security Explorations’ PoC exploits, Gowdiak said.

However, this only happened because the “exploitation vector” was removed, not because all vulnerabilities targeted by the exploits were patched, Gowdiak said.

The new vulnerability discovered by Security Explorations in Java 7 Update 7 can be combined with some of the vulnerabilities left unpatched by Oracle to achieve a full JVM sandbox bypass again.

“Once we found that our complete Java sandbox bypass codes stopped working after the update was applied, we looked again at POC codes and started to think about the possible ways of how to fully break the latest Java update again,” Gowdiak said. “A new idea came, it was verified and it turned out that this was it.”
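For readers who want the mechanics, public analyses of the August attacks described the pattern at issue: sun.awt.SunToolkit exposed helper methods such as getField that performed setAccessible(true) inside a privileged block, letting sandboxed applet code reach private fields that plain reflection would refuse it. The sketch below illustrates only that contrast; it is not Security Explorations’ proof-of-concept, and the java.beans.Statement field shown is simply the one cited in public write-ups of the earlier exploits.

import java.lang.reflect.Field;

public class VectorSketch {
    public static void main(String[] args) throws Exception {
        // Under the applet sandbox's security manager, plain reflection
        // on a private field is blocked with a SecurityException:
        Field acc = Class.forName("java.beans.Statement")
                         .getDeclaredField("acc");
        acc.setAccessible(true); // fails in a sandbox; succeeds standalone

        // Before Update 7, a trusted helper did the privileged work instead:
        //   Field acc = sun.awt.SunToolkit.getField(
        //           Class.forName("java.beans.Statement"), "acc");
        // Swapping such a field for an all-permissions AccessControlContext
        // was one building block of the published sandbox escapes; removing
        // getField and getMethod cut off that route without fixing the
        // underlying bugs the exploits chained together.
    }
}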

Gowdiak doesn’t know when Oracle plans to address the remaining vulnerabilities reported by Security Explorations in April or the new one submitted by the security company on Friday.

It’s not clear if Oracle will release a new Java security update in October as it previously planned. Oracle declined to comment.

Security researchers have long warned that if vendors take too much time to address a reported vulnerability, the bad guys may discover it independently in the meantime, if they don’t already know about it.

On multiple occasions, different bug hunters have independently discovered the same vulnerability in the same product, and that may also be what happened with the two actively exploited Java vulnerabilities addressed by Java 7 Update 7.

“Independent discoveries can never be excluded,” Gowdiak said. “This specific issue [the new vulnerability] might be however a little bit more difficult to find.”

Based on the experience of Security Explorations researchers with hunting for Java vulnerabilities so far, Java 6 has better security than Java 7. “Java 7 was surprisingly much easier for us to break,” Gowdiak said. “For Java 6, we didn’t manage to achieve a full sandbox compromise, except for the issue discovered in Apple Quicktime for Java software.”

 

Study sees no nutritional edge in organic food

By Elizabeth Weise, USA TODAY

 

Organic products have no significant nutritional advantage over conventional foods, even though consumers typically pay more for them, a new study finds.

 

The findings suggest that a key reason many people buy organic products — a $28.6 billion market last year — may not be borne out by the science.

The four-year project began when two doctors wondered what advice they should give their families and patients about whether to buy organic or conventional foods.

“It became much larger than we expected,” says Crystal Smith-Spangler, a primary care doctor at Stanford University and lead author on the study appearing Tuesday in the journal Annals of Internal Medicine.

Eventually, 12 researchers looked at 240 studies conducted from 1966 to 2011 covering nutrient and contaminant levels in foods.

Among the findings:

•There were no significant differences in the vitamin content of organic and conventional fruits and vegetables. The studies looked specifically at vitamins A, C and E.

•Detectable pesticide residue was found in 7% of organic produce and 38% of conventional produce. However, only three studies found pesticide residue that exceeded maximum allowed limits in the European Union on organic or conventional produce.

As to how pesticides could turn up in organic produce, which by law must be grown without them, Smith-Spangler said the source could be either long-lasting, now-banned pesticides lingering in the soil or drift from nearby fields.

•Both organic and conventional foods were at similar risk for bacterial contamination.

A smaller study done in 2009 by Alan Dangour at the London School of Hygiene and Tropical Medicine with a European emphasis reached similar conclusions.

Organic produce often costs more than conventionally grown fruits and vegetables, but the differential varies widely. In 2009, it generally cost at least 25% more in Boston and San Francisco wholesale markets and sometimes was double the price, according to the most recent data from the U.S. Department of Agriculture. In rare cases, the organic food was equal or even a smidge lower in price.

Yet “there isn’t much difference between organic and conventional foods, if you’re an adult and making a decision based solely on your health,” said Dena Bravata, senior author of the paper and a physician at Stanford’s Center for Health Policy.

The data don’t fit well with consumers’ stated motivations for buying organic foods. A 2010 Nielsen study found that 76% bought them believing they are healthier, 53% because they allowed them to avoid pesticides and other toxins, 51% because they are more nutritious and 49% because organic farming is better for the environment.

Urvashi Rangan, a scientist at Consumers Union, publisher of Consumer Reports magazine, points out that organic farming started out as something designed to be better for the environment and for farmers.

“The health benefits really ended up being almost inadvertent, a nice fringe benefit” of farming in a sustainable way that benefits the planet, she says.

That once counter-culture trend is now mainstream. Organic foods make up 12% of all U.S. fruit and vegetable sales, according to Christine Bushway, CEO of the Organic Trade Association in Brattleboro, Vt. Organic products account for nearly 6% of the total U.S. market for dairy products, she says.

A statement from the association said the new findings confirm that organics reduce consumers’ exposure to pesticides, and that overuse of antibiotics can lead to higher levels of antibiotic-resistant bacteria in meat.

Different people will make different choices, Smith-Spangler says. “There may be some consumers who feel that any pesticide residue that’s detectable is unacceptable to them.” Others may trust government standards for allowable limits and be less concerned.

Pregnant women and families with small children are most likely to err on the side of caution, Rangan says. “Is it in some ways healthier to have less pesticides in your body, especially if you’re a kid? Absolutely,” she says. That fits with findings by the Organic Trade Association that 48% of parents say they believe organic foods “are healthier for me and my children.” Those surveys showed that 78% of U.S. families bought at least some organic foods in 2011.

Smith-Spangler says consumers should know there is overwhelming evidence that eating produce improves health — so whatever you choose to buy, load up on fruits and veggies.

 

A*STAR’s IME Develops Smallest Antenna That Can Increase WiFi Speed By 200 Times

August 28, 2012

Researchers from A*STAR’s Institute of Microelectronics (IME) have developed the first compact high-performance silicon-based cavity-backed slot (CBS) antenna that operates at 135 GHz. The antenna demonstrated 30 times stronger signal transmission than on-chip antennas at 135 GHz. At just 1.6mm x 1.2mm, approximately the size of a sesame seed, it is the smallest silicon-based CBS antenna reported to date for ready integration with active circuits. IME’s innovation will help realise a wireless communication system with a very small form factor that is almost two-thirds cheaper than a conventional CBS antenna. The antenna, in combination with other millimetre-wave building blocks, can support wireless speeds of 20 Gbps – more than 200 times faster than present-day Wi-Fi[1] – allowing ultra-fast point-to-point access to rich media content, relevant to online learning and entertainment.

On the research breakthrough, Dr Hu Sanming, a key researcher from IME leading the antenna project, said, “The novel use of polymer filling enables >70% antenna size shrinkage and a record high gain of 5.68 dBi at 135 GHz. By filling the antenna cavity with polymer instead of air, we can achieve a flat surface for subsequent processing by standard technology that is amenable to mass production.”

“The team has also designed a three-dimensional (3D) architecture to integrate the antenna with active circuits to form a fully integrated wireless millimetre-wave system-in-package solution with high performance, reduced footprint and low electromagnetic interference,” commented Dr Je Minkyu, Principal Investigator of the Integrated Circuits and Systems Laboratory at IME.

Professor Dim-Lee Kwong, Executive Director of IME, said, “IME’s silicon-based 135 GHz integrative antenna technology and the proposed 3D architecture have immense commercial potential as it combines form with function that can be realised with standard mass production infrastructure. These salient features make our technology extremely attractive to product developers who are looking to capture emerging markets in millimetre-wave applications.”

[1] Wi-Fi technology for wireless communication based on the newest IEEE standard, 802.11n, which supports data rates of more than 100 Mbps; at that baseline, 20 Gbps is 200 times faster.

About The Institute of Microelectronics (IME)
The Institute of Microelectronics (IME) is a research institute of the Science and Engineering Research Council of the Agency for Science, Technology and Research (A*STAR). Positioned to bridge the R&D between academia and industry, IME’s mission is to add value to Singapore’s semiconductor industry by developing strategic competencies, innovative technologies and intellectual property; enabling enterprises to be technologically competitive; and cultivating a technology talent pool to inject new knowledge to the industry. Its key research areas are in integrated circuits design, advanced packaging, bioelectronics and medical devices, MEMS, nanoelectronics, and photonics. For more information, visit http://www.ime.a-star.edu.sg.

About The Agency for Science, Technology and Research (A*STAR)
The Agency for Science, Technology and Research (A*STAR) is the lead agency for fostering world-class scientific research and talent for a vibrant knowledge-based and innovation-driven Singapore. A*STAR oversees 14 biomedical sciences and physical sciences and engineering research institutes, and seven consortia and centres, which are located in Biopolis and Fusionopolis as well as their immediate vicinity. A*STAR supports Singapore’s key economic clusters by providing intellectual, human and industrial capital to its partners in industry. It also supports extramural research in the universities, hospitals and research centres, and with other local and international partners.

SOURCE: The Agency for Science, Technology and Research (A*STAR)

http://www.rfglobalnet.com/doc.mvc/a-star-ime-smallest-antenna-increase-wifi-speed-by-times-0001?sectionCode=News&templateCode=EnhancedStandard&user=2753709&source=nl:34907

 

 

Does a Cyber-9/11 Loom?

http://www.cio.com

Jason Bloomberg, CIO

September 05, 2012

 

It should come as no surprise to anyone who follows the news in the U.S. that Congress hasn’t moved on cybersecurity. Forces on both sides of the aisle watered down, and then eventually nixed, this summer’s cybersecurity bill (also known as the Lieberman/Collins bill, sponsored by Sens. Joe Lieberman, I-Conn., and Susan Collins, R-Maine). Sen. John McCain (R-Ariz.) and others proposed an alternative, the SECURE IT Act, which now languishes as Congress gears up for the November election.

For his part, President Barack Obama has been doing what he can to prepare the country for the possibility of an attack on our critical infrastructure. In July, he wrote an opinion piece in The Wall Street Journal calling for better exchange of information between government and industry in the event of a cyberattack.

It’s also possible, in the likely event of Congressional inaction, that the President might issue an Executive Order calling for such information exchange. Such an order, however, comes with its own political risks, as some consider such a move to impinge on the purview of Congress. The end result of all this political maneuvering, therefore, may be little or no action by the U.S. government on cybersecurity, at least until sometime in 2013.

Parties United in Cybersecurity Bill Opposition

The obvious question is whether such legislation would simply be too little, too late. The unfortunate fact of the matter is that we are already in the midst of a cyberwar. Corporations as well as government agencies are under constant attack from a range of opponents, both economic and political. Furthermore, the 2010 Stuxnet attack on Iranian nuclear infrastructure and the more recent Gauss attack on the Lebanese banking system show that the U.S. (or parties aligned with U.S. interests, Israel in particular) is willing and able to take an offensive posture in this cyberwar.

The fact that the U.S. is willing to take such an offensive role raises the stakes for the defensive side of this battle. Not only do criminals continue to infiltrate our financial networks, as they have for years, but now we’re courting retaliation from nations who might very well launch their own cyberattacks against us. Cyberwar is heating up—and instead of strengthening our defenses, Congress dawdles.

The appropriate course for Congress to take remains unclear, regardless of your political perspective. The right wing, in the form of the U.S. Chamber of Commerce, shot down the Lieberman/Collins bill, citing onerous regulation, an expansion of government and interference with the open market. But in a classic case of odd bedfellows, the left wing, in the person of Sen. Al Franken, D-Minn., also had issues with the cybersecurity bill, as it called for private industry and government to share potentially private information about US citizens, thus impinging on the civil rights of Americans.

As Franken eloquently puts it, “Once a company gives the government cyberthreat information, the government shouldn’t be able to say, ‘Hey, this email doesn’t have a virus. But it does say that Michael is late on his taxes. I’m going to send that to the IRS.'”

Both sides present valid points. Appropriate, balanced regulation might be efficacious, but no one wants to see layers of expensive governmental bureaucracy or unnecessary interference with day-to-day commerce. No one wants to give up civil rights to improve security, either, especially when there’s no guarantee we will truly become more secure for having made such a deal with the devil.

While Republicans intend for SECURE IT to address the flaws of the Lieberman/Collins cybersecurity bill, it’s not clear whether the new bill will solve more problems than it causes. It goes out of its way to avoid introducing any new regulations that might be burdensome on the private sector and calls for no new regulatory authority—good for the private sector, perhaps, but at the risk of being toothless. As Sen. Ron Johnson (R-Wis.) puts it, “I have no faith that federal regulators should take the lead on cybersecurity. The regulatory process simply cannot keep up with the rapid pace of technology.”

If federal regulators shouldn’t take the lead on cybersecurity, then who should? The private sector—but only by voluntarily sharing classified information, not through regulation, the argument goes. Instead of relying on the government to address cyberthreats, SECURE IT lowers the liability that private sector companies would face, should they share information about potential threats with the government. This has the potential to lead to civil rights abuses, although the bill’s sponsors promise that won’t happen.

 

Cyber Villains Aren’t Waiting

From the perspective of the technology industry, all this political bickering comes across as dangerously parochial. The Internet, after all, knows no geographic borders, and the bad guys are all too willing to take advantage of the nationalist tunnel vision that all countries exhibit, including the U.S. While the government bickers over whether the Department of Homeland Security or the Department of Defense should take the lead in cybersecurity, villains unknown are planning…well, we don’t really know what they’re planning, do we?

By calling for greater protection for critical infrastructure such as power plants and water treatment plants via better communication between private industry and government, President Obama is doing all he can, given Congressional intransigence. Even if the President gets what he’s asking for, though, there’s still a serious concern that it won’t be enough, since there’s no way to know if an attacker is targeting the critical infrastructure on the President’s list.

After all, there are many different types of potential attackers with many different possible motives. Whether they are cybercriminals interested in financial gain, countries such as Iran or China mounting cyberespionage attacks, intellectual property thieves focusing on industrial espionage, or terrorists interested more in wreaking havoc than in any particular target, no single line of defense is sufficient. Furthermore, a cyberattacker might be a country, a company, a decentralized group of hackers (potentially spanning several countries) or even a single individual.

Their potential targets are similarly varied. A single, high-value target such as a power plant may come under attack, but protecting such infrastructure is an obvious priority. The result is that such attacks are difficult to mount, and thus are likely to be relatively rare. Far more common are attacks of convenience. Just as a burglar will avoid houses with alarm signs and instead seek homes with overgrown lawns sporting piles of newspapers, cybercriminals don’t really care whose money they steal. Most dangerous of all are the random terrorist attacks that are simply looking to cause mayhem.

Let’s also avoid the mistake of assuming that all cybertargets are technology targets. Perhaps the most effective cyberattacks have psychological effects that outstrip the intrinsic technological value of the target. For example, an attacker may not be able to take down Hoover Dam, but what if he could hack the Mars rover Curiosity? The result would be outrage, dismay and anger — which may in fact be the intended goal all along.

The most nefarious attack of all, however, is on the American way of life—or, to be less nationalistic, the way of life in any free country. This most dangerous attack need not even take place. If the mere threat of cyberattack causes us to reduce civil liberties for our citizens, we have lost something immensely precious. For any attackers seeking to target precisely those liberties that make free countries free, we risk Congress itself becoming their most powerful weapon.

Achieving an adequate level of cybersecurity while balancing business needs and civil liberties is therefore a complex, difficult challenge. We can only hope that our government, as well as other governments around the world, rises to the challenge in time. The alternative, we fear, is continued complacency until a single attack or cluster of attacks is so damaging, so traumatic that the entire world changes its perspective on the cyberwar in progress. Let’s not forget that the most valuable target for such a cyber-9/11 is our way of life itself.

 

Jason Bloomberg is the president of ZapThink, a Dovel Technologies company. Bloomberg focuses on enterprise architecture, SOA and cloud computing.

 

 

How to Check If Your Apple Device ID Was Compromised

September 4, 2012 by Samantha Murphy

 

A hacker group called AntiSec said it has compromised 12 million Apple iOS Unique Device IDs (UDIDs) and personal information from Apple product owners — and there’s a good chance your iPhone, iPad or iPod Touch could be at risk.

Apple Unique Device Identifiers (UDIDs) — each a sequence of 40 letters and numbers specific to an Apple device — don’t contain much information by themselves, but when coupled with other information such as iTunes passwords, billing addresses and payment data, they could pose risks for users.

AntiSec posted what it said were one million of the hacked IDs on the site Pastebin, along with a detailed description of how the hackers obtained the IDs from the FBI. The hackers said the data was taken from the laptop of an FBI staffer.

“During the second week of March 2012, a Dell Vostro notebook, used by Supervisor Special Agent Christopher K. Stangl from FBI Regional Cyber Action Team and New York FBI Office Evidence Response Team was breached using the AtomicReferenceArray vulnerability on Java, during the shell session some files were downloaded from his Desktop folder one of them with the name of ‘NCFTA_iOS_devices_intel.csv’ turned to be a list of 12,367,232 Apple iOS devices including Unique Device Identifiers (UDID), user names, name of device, type of device, Apple Push Notification Service tokens, zipcodes, cellphone numbers, addresses, etc,” claims AntiSec.

If the claim is true, it’s uncertain at this time what the FBI and the DOJ were doing with 12 million UDIDs. Note: The FBI has since denied that it was hacked and said it never obtained any files with Apple UDIDs.

According to a spokesperson for password security firm LastPass, identity theft could result from the situation.

“The biggest concern is that the Pastebin indicates that UDID, user name, name of the device, push tokens, zipcodes, cell phone numbers, address were part of the original leak. That’s not what was publicly posted, but it’s clear that user data is floating out there,” LastPass said. “Knowing this personal information, what Apple devices you have and their IDs, the most immediate concern is identity theft.”

Another possible threat is social engineering: “If someone knows you have an iPhone and has your cell phone number, address, the UDID of your device, a phone call from a person purporting to be from Apple with information only Apple should have asking for even more information could be an issue,” LastPass said.

As of May 2012 — a move originally announced in September 2011 — Apple started rejecting apps that track devices via the unique UDID, citing security concerns. This means the risk has tapered off significantly in the past few months.

“Since AntiSec removed all the personal data from the data they released, this hack doesn’t present much risk to end users,” said Andrew Storms, director of security operations for nCircle, a compliance auditing firm that works with companies such as Facebook and Mastercard. “UDIDs in isolation aren’t a big deal. In fact, Apple used to permit apps to spew UDIDs all over the place, so there’s a lot of UDID data already in the public domain. For a while, there were a lot of apps using UDID and personal data to track users’ activity and sell it to advertisers.”

The good news is that it’s easy to check if your Apple product is among those compromised. First, you will need to learn your Apple device’s UDID. To do so, plug your device into your computer and launch iTunes. On the left side of the screen, the device should pop up — click to open it. Specs such as iPhone name, capacity and a serial number should appear. Clicking on the serial number should make the UDID appear. The website WhatsMyUDID.com has a graphical tutorial for those who are confused. Users can also download various apps in the App Store to find and email their UDID.

Want to see if you’re affected? LastPass has set up a secure tool that lets you check whether your iPhone UDID was among the one million leaked.

However, if your device’s UDID doesn’t pop up, it could still be among the other millions compromised and not posted online. Users can also search for their device by entering the first five digits of the UDID at this site.
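For the technically inclined, the prefix lookup those tools perform is easy to reproduce offline against a saved copy of the published Pastebin data. The sketch below is hypothetical: the file name and the one-record-per-line, UDID-first format are assumptions, and, as with the web tools, only the first five characters of the UDID are used.

import java.io.BufferedReader;
import java.io.FileReader;

public class UdidCheck {
    // Returns true if any record in the dump begins with the given prefix
    static boolean udidLeaked(String udidPrefix, String dumpPath) throws Exception {
        String prefix = udidPrefix.trim().toLowerCase();
        try (BufferedReader reader = new BufferedReader(new FileReader(dumpPath))) {
            String line;
            while ((line = reader.readLine()) != null) {
                if (line.toLowerCase().startsWith(prefix)) {
                    return true;
                }
            }
        }
        return false;
    }

    public static void main(String[] args) throws Exception {
        // "leaked_udids.txt" stands in for a locally saved copy of the dump
        System.out.println(udidLeaked(args[0].substring(0, 5), "leaked_udids.txt"));
    }
}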

 

But according to Storms, you may be out of luck if your device’s UDID has been leaked: “If your UDID has been leaked in this hack, there’s not much you can do unless you want to spring for a new phone,” Storms told Mashable. “It’s pretty likely that your UDID is already in the public domain.”

Meanwhile, LastPass advises that if your Apple device ID was on the list, “you should strongly consider signing up for a credit monitoring service.”

 

Hillary Clinton In China, Talks Military In South China Sea

Forbes

9/04/2012 @ 4:55PM

Secretary of State Hillary Clinton arrived in Beijing on Tuesday, hoping cooler heads will prevail over rising tensions between the two countries due to a growing American military presence in the region.

China warned Washington this week not to get involved in South China Sea territorial disputes over waters loaded, of course, with oil and gas, disputes that have set China against the Philippines and Japan, both of which claim rights to the tiny, energy-rich islands in the sea. China says no-can-do to that. In China’s view, the South China Sea is named after China and belongs to it, including the natural resources under water.

Chinese news media over the last three days have ordered Washington to “shut up” and accused the U.S. of divide-and-conquer strategies in the region.

So Hillary Clinton has been parachuted in to make amends. Both Washington and Beijing want to get along, but are better frenemies than friends.

This is the Secretary of State’s second trip to China this year. On Tuesday evening, Clinton met with China’s Foreign Minister Yang Jiechi. Yang said during a press conference with local media that both countries strive for peace in the region.

China vows to enhance the partnership based on “mutual respect, mutual benefit and win-win cooperation”, Yang said.

For her part, Clinton also spoke kindly of her Chinese counterparts.

“Washington continues to stress the importance of the practical cooperation that underlies our comprehensive relationship. We’re committed to building a cooperative partnership with China, it is a key aspect of our rebalancing in the Asia-Pacific,” she was quoted saying in a China Daily story Tuesday.

 

 

Pentagon says ex-SEAL book contains secrets

ArmyTimes

By Kimberly Dozier and Lolita C. Baldor – The Associated Press

Posted : Tuesday Sep 4, 2012 14:09:30 EDT

 

A former Navy SEAL’s insider account of the raid that killed Osama bin Laden contains classified information, the Pentagon said Tuesday, and the admiral who heads the Naval Special Warfare Command said details in the book may provide enemies with dangerous insight into secretive U.S. operations.

Rear Adm. Sean Pybus told his force Tuesday that “hawking details about a mission” and selling other information about SEAL training and operations puts the force and their families at risk.

“For an elite force that should be humble and disciplined for life, we are certainly not appearing to be so,” Pybus wrote in a letter to the roughly 8,000 troops under his command. “We owe our chain of command much better than this.”

The letter was obtained by The Associated Press.

At the Pentagon, press secretary George Little said that an official review of the book, “No Easy Day,” determined that it reveals what he called “sensitive and classified” information. He was not more specific but said the author was required to submit the book to the Pentagon before publication for a formal review of potential disclosures of such information.

“When you have special operations units that perform these missions, there are tactics, techniques, and procedures, not to mention human life, that are in play,” Little said. “And it is the height of irresponsibility not to have this kind of material checked for the possible disclosure of classified information.”

He told reporters during a briefing that the Pentagon is still reviewing what legal options should be taken against the author.

If the Pentagon determines the bin Laden book does disclose classified secrets, the government could consider bringing federal criminal charges against the author. The potential charges and penalties would depend largely on what type of secrets were disclosed.

Pybus, in his letter, was more direct, saying that, “We must immediately reconsider how we properly influence our people in and out of uniform NOT to seek inappropriate monetary, political, or celebrity profit from their service” with the SEALs.

“We all have much to gain or lose,” he said. “In the weeks ahead, we will be taking actions to meet this challenge, and I appreciate your leadership and support of our community in this effort.”

Last week, Adm. William McRaven, head of U.S. Special Operations Command, warned his troops that he would take legal action against anyone found to have exposed sensitive information that could cause fellow forces harm. The Naval Special Warfare Command is a unit within U.S. Special Operations Command.

A lawyer for author Matt Bissonnette, who wrote under the pseudonym Mark Owen, has disputed that he was legally obliged to have the book screened before publication.

Bissonnette’s co-author Kevin Maurer said in a statement Tuesday that Bissonnette “was meticulous about adhering to his desire to never do anything to undermine the SEALs’ mission or put his former colleagues in harm’s way.”

Little would not say what damage may result from the book’s revelations and he declined to point to any specific portions of the book that contain material that would be considered a violation and a release of classified information.

He said the Pentagon did not try to stop the public release of the book this week in part because there wasn’t much time.

“Pre-release copies of the book were already being circulated around,” Little said. “So the practical effect of requesting that the publisher withhold release of the book just wasn’t an available option.”

He added that the Pentagon also has not taken steps to stop the book from being sold on military installations. It’s not the Pentagon’s practice, Little said, “to get into the business of deciding what and what does not go on bookshelves in military exchanges. But that doesn’t mean in any way, shape or form that we don’t have serious concerns about the fact that this process of pre-publication review was not followed.”

The book, which was published by Dutton, an imprint of Penguin Group (USA), was No. 1 on Amazon’s best seller list Tuesday, which was its official release day. The initial print run was 575,000 copies and publication of the book was moved up from Sept. 11 to Sept. 4 amid a flurry of reports about the book last week.

Jeh Johnson, the Pentagon’s top lawyer, notified Bissonnette last Thursday that the Pentagon believes he is in “material breach and violation” of two nondisclosure agreements and of a related document he signed upon leaving active duty in April 2012.

In response, Robert D. Luskin of the law firm Patton Boggs wrote to Johnson on Friday that his firm is representing Bissonnette and asserting that he is not in breach of his nondisclosure agreements.

The Justice Department could go after the profits of the book in a civil proceeding if it is determined that he violated the nondisclosure agreement by not getting the book pre-cleared.

Associated Press National Security Writer Robert Burns contributed to this report.

 

Good Riddance La Niña! Hello El Niño!

Agweb.Com

September 5, 2012

By: Fran Howard

With La Niña—often blamed for the worst drought in decades—now over, it’s time to look ahead at El Niño.

 

According to Klaus Wolter, research associate with the Cooperative Institute for Research in Environmental Sciences at the University of Colorado Boulder, El Niño has already begun. El Niño, which is La Niña’s opposite, occurs when the Pacific equatorial surface waters warm up.

Wolter expects the current El Niño to be a mild one, but that does not mean all of its impacts will necessarily be mild. “In 1997 when we had a very big El Niño, everyone was betting their future commodity money on Australia having a failed wheat crop,” says Wolter. “That didn’t happen.” The 1997-98 El Niño, one of the worst ever, was about three times the expected strength of the current El Niño.

During El Niño events, most of eastern Australia, where the bulk of the nation’s wheat is produced, is typically much drier than normal. Sections of eastern and northern Brazil, where some commercial crop enterprises are located, and India also tend to experience widespread drought conditions, whereas Argentina, Chile, and southern Brazil are more prone to flooding.

In the United States, the southern tier of the country as well as California tend to receive much heavier winter storms than usual in El Niño years, while the Upper Great Plains and Upper Midwest tend to be warmer than usual. A warmer-than-usual winter across the Corn Belt could help rebuild soil moisture if unfrozen soils can absorb precipitation.

“It’s not like we have an El Niño so everything will be fine,” says Mike Timlin, regional climatologist with the Midwestern Regional Climate Center, Champaign, Illinois. “It isn’t a cure-all for the drought.” He says that the second year of a multi-year La Niña is associated with drought in the middle of the United States, but that calling La Niña the cause of the drought is pretty shaky. “I would be more concerned if we were going into another year of La Niña. At least we are not saying, ‘Oh boy! We are having a bounce-back La Niña.’ For the drought to ease, though, the high-pressure ridge that has set up over the United States needs to break down.”

According to the National Weather Service, the drought is far from over. Drought is expected to persist or worsen across much of the United States into November. Kansas, Nebraska, Missouri, Oklahoma, West Texas, Wyoming, Colorado, Nevada, Utah, southern Iowa, western Illinois, and eastern New Mexico will all see worsening conditions, while Indiana and much of eastern Illinois will see improvement.

Elsewhere in the world, the Middle East from Iran to Afghanistan tends to be wetter than usual in an El Niño year, while impacts in China are mixed, Wolter says.

Over the centuries, experts have blamed El Niño for events like the sinking of the Titanic in 1912 and the heavy snowfall that explorers Lewis and Clark encountered while crossing Montana and Idaho’s Bitterroot Mountains in the winter of 1804-05.

The 1997-98 El Niño was blamed for the deaths of 24,000 people and economic damage that reached $34 billion. That El Niño caused drought in Southeast Asia, severe winter storms along the West Coast of the United States, floods in East Africa, and a mix of floods and drought in China.

 

Retirement applications increase in August

http://www.federaltimes.com/article/20120905/BENEFITS/120905001

Federal Times

Sep. 5, 2012 – 02:06pm |

By Stephen Losey |

 

Nearly 9,000 federal employees applied for retirement in August — the most since January, according to the Office of Personnel Management.

So far this year, 74,725 federal employees have applied for retirement. That is nearly 2 percent more than the 73,585 employees who applied for retirement in the first eight months of 2011.

By the end of 2011, federal retirements had increased 24 percent over the previous year.

OPM received 8,973 retirement claims in August — 12 percent more than the 8,000 it expected last month, and nearly 15 percent more than the number filed in August 2011. August had the second-highest number of retirements in 2012, behind only January, which traditionally records a massive number of retirements each year.

But even though retirements are increasing, OPM was able to keep its processing rate up last month and cut its backlog of unprocessed claims from 44,679 to 41,787 — a 6 percent decrease.

OPM processed 11,865 claims in August. That is a decrease from July’s 12,304 claims processed, but still above the 11,500 claims OPM expected to process last month. And August was the third most-productive processing month this year.

OPM earlier this year enacted several reforms to try to fix its sluggish pension processing system. Those changes included hiring and training 56 new legal administrative specialists to process more cases, overhauling workflow processes to give those specialists more time to work on cases, and working more closely with agencies to help them submit retirement application packages with fewer missing documents or other mistakes.

 

Earlier this month, OPM issued guidance to agencies with tips on how to submit “healthy” retirement packages. And on Sept. 12, OPM will hold a training session on preparing those packages for human resources officials who handle retirements.

 

 

Pentagon OK with selling US drones to 66 countries

NBC.com

By Doug Palmer and Jim Wolf, Reuters

 

WASHINGTON — As many as 66 countries would be eligible to buy U.S. drones under new Defense Department guidelines but Congress and the State Department, which have a final say, have not yet opened the spigots for exports, a senior Pentagon official said on Wednesday.

The 66 countries were listed in a Defense Department policy worked out last year to clear the way for wider overseas sales of unmanned aerial systems, as the Pentagon calls such drones, said Richard Genaille, deputy director of the Pentagon’s Defense Security Cooperation Agency. He did not name them.

“We don’t really have a comprehensive U.S. government policy” on such exports, he told an industry conference called ComDef 2012. “It hasn’t moved quite as fast as we would like, but we’re not giving up.”

Northrop Grumman Corp chief executive Wes Bush on Wednesday praised the Obama administration for what he described as significant moves to boost arms exports, but voiced frustration at delays in codifying them in a new export policy.

“I wish we were further along in getting that done. It’s slow, it’s painful, but we’re doing the right things to move in that direction,” Bush told Reuters.

U.S. arms makers are looking abroad to help offset Pentagon spending cuts spurred by U.S. deficit-reduction requirements.

Northrop Grumman’s ability to boost its overseas arms sales, which now account for less than 10 percent of its overall revenues, hinges largely on streamlined export controls, Bush said.


 

Complex web of regulations

U.S. defense and high-technology exporters have long complained about the complex web of regulations governing exports of weapons and “dual-use” goods that have both civilian and military applications. They believe the rules disadvantage them versus foreign competitors.

Of particular concern to Northrop Grumman are restrictions on exports such as the company’s high-altitude Global Hawk surveillance planes.


The administration last year began informally consulting Congress on plans to sell Global Hawk to South Korea before withdrawing the proposed sale for reasons that have not been publicly disclosed.

Japan, Singapore and Australia also have shown interest in acquiring the aircraft, a Northrop Grumman spokeswoman told Reuters last year.

Bush said that failure to allow such exports could spark a repeat of the 1990s, when strict curbs on U.S. commercial satellite sales prompted other countries to develop rival hardware and software. Those efforts eventually eroded the market share of U.S. satellite producers from more than 70 percent to just around 25 percent.

“The consequences of the decisions that were made in the early ’90s were devastating for the US industrial base, and ultimately did nothing to enhance security, and in fact, were detrimental to our security,” he said.

 

Overhaul of munitions list

The Obama administration, over the objections of some Republicans in Congress, is aiming to create a single list of items subject to export controls overseen by a single licensing agency, instead of the two separate lists now administered by the State Department and the Commerce Department.

Jim Hursch, director of the Defense Department’s Defense Technology Security Administration, speaking at the ComDef event, said the administration was well into the overhaul but still had significant work to do.

Government agencies, as interim steps toward creating the single unified list, have worked their way through the 21 categories of the U.S. Munitions List administered by the State Department to see which items can be moved to the Commerce Department’s Commerce Control List, Hursch said.

“We’ll see what happens in November and what the victors of that election want to do to move forward on that,” Hursch said.


Beth McCormick, deputy assistant secretary for defense trade and regional security, said she hoped the reforms would continue whether President Barack Obama is reelected on November 6 or Republican challenger Mitt Romney wins.

“Regardless of what happens in November, we should continue this work and bring it to closure,” McCormick said.

The Obama administration has already put proposed revisions to nine categories of the munitions lists out for public comment and faces some hard decisions moving ahead.

“There are some categories that by their basic nature are very, very difficult,” including one that encompasses both night-vision technology and fire control, she said.

In deciding what items to move to the commercial list, “we obviously have to think about the type of technology that we use on the battlefield, where obviously the control of the night has been something that’s been very, very important to us,” McCormick said.

Kevin Wolf, assistant secretary of Commerce for export administration, said moving an item from the munitions list to the commercial list did not mean it was “decontrolled.”

It does give the U.S. government more flexibility in allowing exports to close allies, while maintaining a strict arms embargo on other countries such as China, he said.

 

 

‘Extraordinary increase’: US arms sales make up most of global market

NY Times

By Thom Shanker

Washington

 

Weapons sales by the United States tripled in 2011 to a record high, driven by major arms sales to Persian Gulf allies concerned about Iran’s regional ambitions, according to a new study for Congress.

Overseas weapons sales by the United States totaled $66.3 billion last year, or more than three-quarters of the global arms market, valued at $85.3 billion in 2011. Russia was a distant second, with $4.8 billion in deals.

The American weapons sales total was an “extraordinary increase” over the $21.4 billion in deals for 2010, the study found, and was the largest single-year sales total in the history of United States arms exports. The previous high was in fiscal year 2009, when American weapons sales overseas totaled nearly $31 billion.

A worldwide economic decline had suppressed arms sales over recent years. But increasing tensions with Iran drove a set of Persian Gulf nations — Saudi Arabia, the United Arab Emirates and Oman — to purchase American weapons at record levels.

These Gulf states do not share a border with Iran, and their arms purchases focused on expensive warplanes and complex missile defense systems.


The report was prepared by the nonpartisan Congressional Research Service, a division of the Library of Congress. The annual study, written by Richard F. Grimmett and Paul K. Kerr and delivered to Congress on Friday, is considered the most detailed collection of unclassified arms sales data available to the public.

The agreements with Saudi Arabia included the purchase of 84 advanced F-15 fighters, a variety of ammunition, missiles and logistics support, and upgrades of 70 of the F-15 fighters in the current fleet.

Sales to Saudi Arabia last year also included dozens of Apache and Black Hawk helicopters, all contributing to a total Saudi weapons deal from the United States of $33.4 billion, according to the study.

The United Arab Emirates purchased a Terminal High Altitude Area Defense system, an advanced antimissile shield that includes radars and is valued at $3.49 billion, as well as 16 Chinook helicopters for $939 million.

Oman bought 18 F-16 fighters for $1.4 billion.

Most bought by developing nations

In keeping with recent trends, most of the weapons purchases, worth about $71.5 billion, were made by developing nations, with about $56.3 billion of that from the United States.

Other significant weapons deals by the United States last year included a $4.1 billion agreement with India for 10 C-17 transport planes and a $2 billion sale of Patriot antimissile batteries to Taiwan — an arms deal that outraged officials in Beijing.

 

To compare weapons sales over various years, the study used figures in 2011 dollars, with amounts for previous years adjusted for inflation to provide a consistent measurement.
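The adjustment itself is a one-line calculation: scale each year’s figure by the ratio of price-index levels. A minimal sketch in Python (the index values below are illustrative placeholders, not the study’s actual deflators):

```python
# Constant-dollar comparison in miniature: convert an earlier year's sales
# into 2011 dollars by the ratio of price-index levels.
cpi = {2009: 214.5, 2011: 224.9}               # placeholder index levels

def to_2011_dollars(amount_billion, year):
    return amount_billion * cpi[2011] / cpi[year]

print(round(to_2011_dollars(31.0, 2009), 1))   # 2009's ~$31 billion, restated
```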

A policy goal of the United States has been to work with Arab allies in the Persian Gulf to knit together a regional missile defense system to protect cities, oil refineries, pipelines and military bases from an Iranian attack.

The effort has included deploying radars to increase the range of early warning coverage across the Persian Gulf, as well as introducing command, control and communications systems that could exchange that information with new batteries of missile interceptors sold to the individual nations.

The missile shield in the Persian Gulf is being built on a country-by-country basis — with these costly arms sales negotiated bilaterally between the United States and individual nations.

This story, “U.S. Arms Sales Make Up Most of Global Market,” originally appeared in The New York Times.

 

 

U.S. says 49.33 percent oil output shut in U.S. Gulf due to Isaac

Wed Sep 5, 2012 2:20pm EDT

 

(Reuters) – U.S. regulators said 49.33 percent of daily oil production and 25.71 percent of daily natural gas output in U.S.-regulated areas of the Gulf of Mexico remained shut on Wednesday due to Hurricane Isaac, remnants of which re-entered the northern Gulf Wednesday.

Shut-in oil output was down 2.18 percentage points from a day earlier, and shut-in natural gas production fell 3.29 points.

The U.S. Bureau of Safety and Environmental Enforcement said 680,749 barrels per day (bpd) of oil production was still shut, down from 710,866 bpd on Tuesday, and 1.157 billion cubic feet (bcf) per day in natural gas output was closed, down from 1.309 bcf per day on Tuesday.

Those figures are expected to fall in the coming days.

Isaac came ashore as a hurricane on August 28 and its slow movement last week across Louisiana delayed the restart of production. (Reporting by Erwin Seba; Editing by Gerald E. McCormick)

 

Census mobile app a hit

Washington Post

By Carol Morello, Published: September 6

It may be premature to call it a runaway hit, but in the first month since the Census Bureau released its first mobile application, it’s been downloaded more than 32,000 times.

The app, called America’s Economy, provides updated statistics from the Census Bureau, the Bureau of Economic Analysis and the Bureau of Labor Statistics. It includes 16 monthly economic indicators, such as house sales, personal income, international trade, Gross Domestic Product and the unemployment rate.

“The economist in me finds this app a cool new tool,” wrote Tom Mesenbourg, acting director of the Census Bureau, in the blog he took over when his predecessor, Robert M. Groves, stepped down last month to become provost at Georgetown University.

The app is the latest step the census has taken to use the Internet more in both collecting and disseminating statistics. Last month, the White House cited the Census Bureau as an example of a federal agency that is “making great strides towards putting a solid foundation for a 21st Century digital government in place.”

In January, the census will offer people who receive the American Community Survey the option to answer over the Internet instead of filling out a form and dropping it in the mail. The ACS, which replaced the long form in previous census questionnaires, collects a wide range of data, such as incomes and commute times.

The ACS questionnaire will be mailed out in late December and will contain directions on how to complete the survey online.

The mobile app can be downloaded for both iPhones and Android smartphones, as well as for tablets.

 

New high-tech airships taking shape

By W.J. Hennigan

Los Angeles Times

Posted: 09/06/2012 06:42:21 PM PDT

 

LOS ANGELES — Not since the waning days of World War II have the mammoth wooden blimp hangars at the former military base in Tustin, Calif., seen as much airship manufacturing work as they do today.

Inside the 17-story structures that rise above southern Orange County, Worldwide Aeros Corp. is building a blimp-like airship designed for the military to carry tons of cargo to remote areas around the world.

“Nobody has ever tried to do what we’re doing here,” Chief Executive Igor Pasternak said of the 265-foot skeleton being transformed into the cargo airship. “This will revolutionize airship technology.”

In recent years, the affordability of airships as well as developments in high-definition cameras, high-powered sensors and other unmanned technologies have turned these oddball aircraft from curiosities of a bygone era to must-have items for today’s military. And airships increasingly are being used for civilian purposes.

The federal government is buying blimps, zeppelins and spy balloons, and many of these new-generation hybrid “lighter than air” aircraft are taking shape across California.

“So much is going on with airships in California now,” Pasternak said. “It wasn’t this way 10 years ago.”

Pasternak’s Montebello firm makes airships used for surveillance, advertising and transport. Lockheed Martin Corp. designs and builds airships for commercial use at its secretive Skunk Works facility in Palmdale, Calif.

Northrop Grumman Corp. does design work for airships around the Southland but is building them in Florida.

Although these steerable aircraft are sometimes known casually as blimps, there are differences. A blimp is shaped by the gas inside of it, whereas a zeppelin has a rigid skeleton inside. The helium-filled sky balloons, or aerostats, used over Afghanistan are neither blimps nor zeppelins. But they all fall under the term “airship.”

The importance of these next-generation airships became obvious to the Pentagon as increased use of drones highlighted the need for stationary aircraft that could provide constant surveillance, not just overhead flights for a few hours. That’s where these unmanned blimps came into play, with their ability to linger over an area for days at a time.

Using balloons, blimps and zeppelins in a war zone is not a new idea. The military used balloons for aerial observation posts during the Civil War, and the Germans used zeppelins to drop bombs on England during World War I. Massive blimps regularly patrolled the Pacific Ocean coastline looking for Japanese submarines and other warships during World War II.

After that, the military began opting for helicopters and sub-hunting aircraft, and the demand for blimps, balloons and other airships began to taper off. But they have played an expanded role in recent years in Mideast conflicts. Currently, there are more than 100 aerostats being used in Afghanistan, up from fewer than 10 in 2004.

Resembling small blimps, these aerostats are tethered to the ground and float thousands of feet above military bases and important roadways. They are big enough that gunfire below won’t take them down. Cameras on aerostats are similar to those on drones and can see for many miles at a fraction of the per-flight-hour cost of a drone. They’re also used to monitor the U.S.-Mexico border.

“It’s an affordable solution,” said Terry L. Mitchell, intelligence futures director at Army headquarters. “You can provide overwatch of the base or troops as they make their way on the ground.”

But these less-sophisticated aerostats don’t have nearly the size or the capability of the next-generation airships that are being designed and manufactured now.

Public perception of airships has been guarded ever since the giant Hindenburg burst into flames in 1937 in front of news cameras while mooring at Lakehurst, N.J. The explosion of the hydrogen-filled German zeppelin killed 36 people, shocked the public and deflated the chances of lighter-than-air ships becoming a popular mode of travel.

These days, airships are filled with nonflammable helium, but the Hindenburg tragedy remains vivid to many even today.

“It’s very easy to dismiss airship technology. There is a record of public failure, and it’s sometimes difficult to take seriously in the modern era,” said Bill Althoff, author of “Sky Ships: A History of the Airship in the United States Navy.”

Still, the new materials and technology used in today’s airships have greatly increased the vehicles’ capabilities, Althoff said. “The virtue of the platform has endured,” he said.

There is a belief among those in the airship business that the technology can take on more civilian roles. Airships already have provided surveillance over the Mexican border and flown disaster-control missions such as monitoring the 2010 BP oil spill on the Gulf Coast.

At Lockheed’s facility in Palmdale, work is underway on a 290-foot airship, called SkyTug, to be sold to the commercial market by Canada’s Aviation Capital Enterprises. The first SkyTug will be similar to a “super-sized helicopter” capable of carrying 20 tons of cargo.

Bob Boyd, Lockheed’s program manager for hybrid airships, said the company expects to start the Federal Aviation Administration certification process by the end of the year. A number of companies that need to get cargo to remote areas, such as oil and timber firms, are interested, he said.

But it takes some convincing to persuade customers that the airship is the answer to their problems.

 

“It’s hard for people to wrap their heads around,” Boyd said. “There are three ways to move cargo: by ship, by truck or by train. Suddenly there’s this fourth option.”

Lockheed has plans for a larger version, called SkyFreighter, that would be capable of carrying 70 tons of cargo, and an even larger version, called SkyLiner, that would measure 800 feet and be capable of lifting 500 tons of cargo.

Within three years, Boyd said, his company could be manufacturing as many as 30 airships a year.

“People don’t recognize it, but Southern California is the epicenter for hybrid airships,” he said. “We’ll certainly be filling the sky with something unusual in the coming years.”

 

 

Police chiefs urge limits on use of drones

AFTimes

By Kevin Johnson – USA Today

Posted: Thursday Sep 6, 2012 18:31:10 EDT

 

The nation’s largest consortium of police officials is calling for the limited use of unmanned drones in local law enforcement operations and urging that the controversial aircraft — now popular weapons on international battlefields — not be armed.

The first national advisory for the use of unmanned aircraft issued by the International Association of Chiefs of Police comes as federal lawmakers and civil rights advocates have expressed deep concerns about the vehicles’ use in domestic law enforcement, especially in aerial surveillance.

Only a handful of police agencies, including the Mesa County, Colo., Sheriff’s Department, are currently using unmanned aircraft. But Don Roby, chairman of the IACP’s aviation committee, said an increasing number of departments are considering unmanned aircraft for such things as search and rescue operations, traffic accident scene mapping and some surveillance activities.

In July, federal lawmakers, including Mississippi Rep. Bennie Thompson, the ranking Democrat on the House Homeland Security Committee, expressed concerns about the potential risk of arming the vehicles as they are being increasingly considered for use.

Some of the vehicles, Thompson said, have the capacity to “shoot (stun-gun) projectiles, tear gas and rubber balls from 300 feet above ground.”

Roby said the guidelines represent an “urgent” attempt to redefine the value of aerial drones away from the battlefields of Afghanistan, Iraq and Yemen. “It’s very important that people understand that we won’t be up there with armed Predator drones firing away,” said Roby, who also is a Baltimore Police Department captain. “Every time you hear someone talking about the use of these vehicles, it’s always in the context of a military operation. That’s not what we’re talking about.”

In cases in which a drone is to be used to collect evidence that would likely “intrude upon reasonable expectations of privacy,” the IACP’s new guidelines recommend that police secure search warrants prior to launching the vehicle.

On the question of arming drones, however, the IACP issued its most emphatic recommendation:

“Equipping the aircraft with weapons of any type is strongly discouraged. Given the current state of the technology, the ability to effectively deploy weapons from a small UA (un-manned aircraft) is doubtful … (and) public acceptance of airborne use of force is likewise doubtful and could result in unnecessary community resistance to the program.”

The American Civil Liberties Union said in a statement that it “applauded” the police group for “issuing recommendations that are quite strong in some areas.”

“At the same time, we don’t think these recommendations go far enough to ensure true protection of privacy from drones,” the ACLU said, adding that privacy protections needed to be enshrined in law “not merely promulgated by the police themselves.”

Some proposed legislation, including a bill from Sen. Rand Paul, R-Ky., would require authorities to secure warrants before any use, except when the aircraft is patrolling the borders, when there is a threat of a terror attack or when a life is threatened.

“Like other tools used to collect information in law enforcement, in order to use drones a warrant needs to be issued,” Paul said when introducing the legislation in June. “Americans going about their everyday lives should not be treated like criminals or terrorists and have their rights infringed upon by military tactics.”

Steve Ingley, executive director of the Airborne Law Enforcement Association, which advocates for the use of aviation in public safety missions, said it was necessary for police to respond quickly to the civil liberties concerns outlined in proposed legislation and by civil rights advocates.

“This (drone use) is a good potential tool for law enforcement,” Ingley said. “… But it’s important for people to know that this is not the Predator. This is very different.”

Ben Gielow, general counsel of the Association for Unmanned Vehicle Systems International, which represents manufacturers, said the aircraft used by police would be miniature counterparts to the drones used by the military and CIA.

The police drones, he said, would likely weigh as little as five pounds and could represent a more affordable aviation option at a cost of $30,000 to $50,000, rather than a $3 million helicopter.

“There is still a lot of education that needs to take place to determine how this can be used domestically,” Gielow said.

 

White House circulating draft of executive order on cybersecurity

The Hill

By Jennifer Martinez – 09/06/12 07:56 PM ET


The White House is circulating a draft of an executive order aimed at protecting the country from cyberattacks, The Hill has learned.

The draft proposal, which has been sent to relevant federal agencies for feedback, is a clear sign that the administration is resolved to take action on cybersecurity even as Congress remains gridlocked on legislation that would address the threat.

The draft executive order would establish a voluntary program where companies operating critical infrastructure would elect to meet cybersecurity best practices and standards crafted, in part, by the government, according to two people familiar with the document.

The concept builds off of a section in the cybersecurity bill from Sen. Joe Lieberman (I-Conn.) that was blocked last month by Senate Republicans, who called it a backdoor to new regulations.

The draft has undergone multiple revisions and is brief, spanning no more than five pages. It is still being worked on and is subject to change, the people familiar with the draft stressed.

It’s also unclear whether the final product will get the president’s approval to move forward.

A new draft of the executive order is expected to be shared with agencies next week.

White House counterterrorism adviser John Brennan first floated the idea of an executive order in a speech a few days after the Senate bill failed. He said the White House would consider taking action on the executive level to ensure key infrastructure such as the power grid, water supply and transportation networks are secure.

The momentum for cybersecurity legislation in Congress weakened after Lieberman’s bill failed to clear the Senate. Now industry groups and Congress are watching the White House for clues about what might be included in an executive order on cybersecurity.

A spokeswoman for the White House declined to comment on whether a draft of an executive order was being circulated, but said it is one of the options the administration is weighing.

“An executive order is one of a number of measures we’re considering as we look to implement the president’s direction to do absolutely everything we can to better protect our nation against today’s cyberthreats,” said White House spokeswoman Caitlin Hayden. “We are not going to comment on ongoing internal deliberations.”

Sponsors of Lieberman’s bill have urged the White House to issue an executive order to put measures in place that ensure key infrastructure is better protected from cyberattacks. Sens. Jay Rockefeller (D-W.Va.) and Dianne Feinstein (D-Calif.) both sent letters to the White House last month that urged the president to take action.

According to the people familiar with the draft, the executive order would set up an inter-agency council that would be led by the Department of Homeland Security (DHS). Members of the council would include the Department of Defense and the Commerce Department, and discussions are ongoing about including other agencies and officials, such as representatives from the Department of Energy and Treasury Department, as well as the attorney general and the director of national intelligence.

DHS would be responsible for the overall management of the program, but the Commerce Department’s National Institute of Standards and Technology (NIST) would work with industry to help craft the framework for it. NIST would also work with the private sector to develop the cybersecurity guidelines and best practices themselves.

DHS would receive the guidance from NIST and work with so-called sector coordinating councils to identify which industry sectors would be considered critical infrastructure, as well as determine what cybersecurity best practices and standards the industry participants in the voluntary program would follow.

Those coordinating councils are already in place, and fall under an arm of DHS that manages critical infrastructure protection. The councils are run and organized by industry members from each sector, such as financial services and electricity.

It would be left up to the companies to decide what steps they want to take to meet the standards, so the government would not dictate what type of technology or strategy they should adopt.

One of the main issues still under discussion involves the kinds of incentives the government will offer critical infrastructure operators to entice them into the program.

The executive branch is limited when it comes to the types of incentives it can offer companies, as much of that authority rests with Congress. For instance, the executive branch is barred from offering companies liability protection if they face lawsuits after a security breach.

“For many of these incentives, you need new legislative authority,” said James Lewis, director of the technology and public policy program at the Center for Strategic and International Studies, who has not seen a copy of the draft executive order.

To get industry participation in the program, Lewis argues that it’s key for the inter-agency council to include agencies that already regulate critical infrastructure, such as the Federal Energy Regulatory Commission.

Lewis also fears that it would take the government too long to get the voluntary program in the executive order up and running.

“The White House needs to step back and say, ‘Does this make a meaningful contribution in the near term?’ ” Lewis said.

Additionally, he cautioned that industry would balk at electing to join a program led by DHS, which has a spotty track record when it comes to leading national security efforts.

“Find me a company that says ‘I’m going to voluntarily agree to be regulated by DHS.’ Nobody is going to volunteer to have DHS regulate them,” Lewis said.

Source:
http://thehill.com/blogs/hillicon-valley/technology/248079-white-house-circulating-draft-of-executive-order-on-cybersecurity

 

Democratic platform diverges with GOP on cybersecurity

The Hill

By Jennifer Martinez 09/04/12 02:16 PM ET


The Democratic platform released late Monday touted “the unprecedented steps” the Obama administration has taken to defend the United States from cyber attacks and vowed to continue strengthening the nation’s cyber defenses.

Democrats noted that the first military command focused on cybersecurity, known as U.S. Cyber Command, was set up on President Obama’s watch and that the administration has completed a full review of the federal government’s efforts to protect key infrastructure and data from cyber threats. The Democratic Party said it would continue this push to boost the security of the nation’s critical computer systems and networks against hackers, terrorist networks and hostile countries looking to damage infrastructure that’s key to public safety and the economy.

“We will continue to take steps to deter, prevent, detect, and defend against cyber intrusions by investing in cutting-edge research and development, promoting cybersecurity awareness and digital literacy, and strengthening private sector and international partnerships,” the platform reads.

It’s a far cry from the GOP platform approved at the party’s convention last week. In their cybersecurity plank, Republicans argued that Obama’s approach to cybersecurity has been too regulatory and reliant on defensive capabilities.

The Democratic platform, which is slated to be approved this week, states that the president has pushed for comprehensive cybersecurity legislation that strikes a balance between computer security and the protection of Americans’ privacy rights. The White House has taken a tough stance against legislation that it believes will upset this balance and violate privacy and civil liberties. Notably, it threatened to veto the Cyber Intelligence Sharing and Protection Act before it cleared the House this past spring.

It also states that “going forward, the President will continue to take executive action to strengthen and update our cyber defenses.”

Since the Senate stalled on cybersecurity legislation last month, the White House has been considering an executive order that would put cybersecurity measures in place to better protect critical infrastructure from cyber attacks. However, the administration has kept quiet about what those possible measures would look like.

The GOP would likely bash such a cybersecurity executive order. In their platform, Republicans cautioned that the Obama administration’s cybersecurity approach would add another layer of bureaucracy and costs to an already bloated federal government and ultimately harm innovation.

Republicans also argued in their platform that improving the sharing of information about cyber threats between government and industry was the best way to boost the nation’s cybersecurity. However, the Obama administration has repeatedly said that it believes information sharing alone is not enough to defend against sophisticated cyber threats, and argued that companies operating key infrastructure should meet some sort of cybersecurity standards.

Source:
http://thehill.com/blogs/hillicon-valley/technology/247391-democratic-platform-diverges-with-gop-on-cybersecurity

 

 

The Weatherman Is Not a Moron

NY Times

By Nate Silver

September 7, 2012

 

From the inside, the National Centers for Environmental Prediction looked like a cross between a submarine command center and a Goldman Sachs trading floor. Twenty minutes outside Washington, it consisted mainly of sleek workstations manned by meteorologists working an armada of flat-screen monitors with maps of every conceivable type of weather data for every corner of the country. The center is part of the National Weather Service, which Ulysses S. Grant created under the War Department. Even now, it remains true to those roots. Many of its meteorologists have a background in the armed services, and virtually all speak with the precision of former officers.

 

They also seem to possess a high-frequency-trader’s skill for managing risk. Expert meteorologists are forced to arbitrage a torrent of information to make their predictions as accurate as possible. After receiving weather forecasts generated by supercomputers, they interpret and parse them by, among other things, comparing them with various conflicting models or what their colleagues are seeing in the field or what they already know about certain weather patterns — or, often, all of the above. From station to station, I watched as meteorologists sifted through numbers and called other forecasters to compare notes, while trading instant messages about matters like whether the chance of rain in Tucson should be 10 or 20 percent. As the information continued to flow in, I watched them draw on their maps with light pens, painstakingly adjusting the contours of temperature gradients produced by the computers — 15 miles westward over the Mississippi Delta or 30 miles northward into Lake Erie — in order to bring them one step closer to accuracy.

These meteorologists are dealing with a small fraction of the 2.5 quintillion bytes of information that, I.B.M. estimates, we generate each day. That’s the equivalent of the entire printed collection of the Library of Congress about three times per second. Google now accesses more than 20 billion Web pages a day; the processing speed of an iPad rivals that of last generation’s most powerful supercomputers. All that information ought to help us plan our lives and profitably predict the world’s course. In 2008, Chris Anderson, the editor of Wired magazine, wrote optimistically of the era of Big Data. So voluminous were our databases and so powerful were our computers, he claimed, that there was no longer much need for theory, or even the scientific method. At the time, it was hard to disagree.

But if prediction is the truest way to put our information to the test, we have not scored well. In November 2007, economists in the Survey of Professional Forecasters — examining some 45,000 economic-data series — foresaw less than a 1-in-500 chance of an economic meltdown as severe as the one that would begin one month later. Attempts to predict earthquakes have continued to envisage disasters that never happened and failed to prepare us for those, like the 2011 disaster in Japan, that did.

The one area in which our predictions are making extraordinary progress, however, is perhaps the most unlikely field. Jim Hoke, a director with 32 years’ experience at the National Weather Service, has heard all the jokes about weather forecasting, like Larry David’s jab on “Curb Your Enthusiasm” that weathermen merely forecast rain to keep everyone else off the golf course. And to be sure, these slick-haired and/or short-skirted local weather forecasters are sometimes wrong. A study of TV meteorologists in Kansas City found that when they said there was a 100 percent chance of rain, it failed to rain at all one-third of the time.

But watching the local news is not the best way to assess the growing accuracy of forecasting (more on this later). It’s better to take the long view. In 1972, the service’s high-temperature forecast missed by an average of six degrees when made three days in advance. Now it’s down to three degrees. More stunning, in 1940, the chance of an American being killed by lightning was about 1 in 400,000. Today it’s 1 in 11 million. This is partly because of changes in living patterns (more of our work is done indoors), but it’s also because better weather forecasts have helped us prepare.

Perhaps the most impressive gains have been in hurricane forecasting. Just 25 years ago, when the National Hurricane Center tried to predict where a hurricane would hit three days in advance of landfall, it missed by an average of 350 miles. If Hurricane Isaac, which made its unpredictable path through the Gulf of Mexico last month, had occurred in the late 1980s, the center might have projected landfall anywhere from Houston to Tallahassee, canceling untold thousands of business deals, flights and picnics in between — and damaging its reputation when the hurricane zeroed in hundreds of miles away. Now the average miss is only about 100 miles.

Why are weather forecasters succeeding when other predictors fail? It’s because long ago they came to accept the imperfections in their knowledge. That helped them understand that even the most sophisticated computers, combing through seemingly limitless data, are painfully ill equipped to predict something as dynamic as weather all by themselves. So as fields like economics began relying more on Big Data, meteorologists recognized that data on its own isn’t enough.

The I.B.M. Bluefire supercomputer in the basement of the National Center for Atmospheric Research in Boulder, Colo., is so large that it essentially creates its own weather. The 77 trillion calculations that Bluefire makes each second, in its mass of blinking lights and coaxial cable, generate so much radiant energy that it requires a liquid cooling system. The room where Bluefire resides is as drafty as a minor-league hockey rink, and it’s loud enough that hearing protection is suggested.

The 11 cabinets that hold the supercomputer are long and narrow and look like space-age port-a-potties. When I mentioned this to Rich Loft, the director of technology development for NCAR, he was not amused. To him, this computer represents the front line in an age-old struggle to predict our environment. “You go back to Chaco Canyon or Stonehenge,” Loft said, “and people realized they could predict the shortest day of the year and the longest day — that the moon moved in predictable ways. But there are things an ancient man couldn’t predict: ambush from some kind of animal, a flash flood or a thunderstorm.”

For centuries, meteorologists relied on statistical tables based on historical averages — it rains about 45 percent of the time in London in March, for instance — to predict the weather. But these statistics are useless on a day-to-day level. Jan. 12, 1888, was a relatively warm day on the Great Plains until the temperature dropped almost 30 degrees in a matter of hours and a blinding snowstorm hit. More than a hundred children died of hypothermia on their way home from school that day. Knowing the average temperature for a January day in Topeka wouldn’t have helped much in a case like that.

The holy grail of meteorology, scientists realized, was dynamic weather prediction — programs that simulate the physical systems that produce clouds and cold fronts, windy days in Chicago and the morning fog over San Francisco as they occur. Theoretically, the laws that govern the physics of the weather are fairly simple. In 1814, the French mathematician Pierre-Simon Laplace postulated that the movement of every particle in the universe should be predictable as long as meteorologists could know the position of all those particles and how fast they are moving. Unfortunately, the number of molecules in the earth’s atmosphere is perhaps on the order of 100 tredecillion, which is a 1 followed by 44 zeros. To make perfect weather predictions, we would not only have to account for all of those molecules, but we would also need to solve equations for all 100 tredecillion of them at once.

The most intuitive way to simplify the problem was to break the atmosphere down into a finite series of boxes, or what meteorologists variously refer to as a matrix, a lattice or a grid. The earliest credible attempt at this, according to Loft, was made in 1916 by an English physicist named Lewis Fry Richardson, who wanted to determine the weather over northern Germany on May 20, 1910. This was not technically a prediction, because the date was some six years in the past, but Richardson treated it that way, and he had a lot of data: a series of observations of temperature, barometric pressures and wind speeds that had been gathered by the German government. And as a pacifist serving in a volunteer ambulance unit in northern France, he also had a lot of time on his hands. So Richardson broke Germany down into a series of two-dimensional boxes, each measuring three degrees of latitude by three degrees of longitude. Then he went to work trying to solve the equations that governed the weather in each square and how they might affect weather in the adjacent ones.
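To make the grid idea concrete, here is a minimal sketch in Python (our own toy illustration with invented numbers, not Richardson’s actual equations), in which each box is repeatedly updated from the state of its four neighbors:

```python
# A toy grid computation: each box nudges toward the average of its four
# neighbors, the way temperature diffuses. All values are invented.
import numpy as np

nlat, nlon = 8, 8                               # grid boxes over the region
temps = 15.0 + np.random.randn(nlat, nlon)      # initial temperatures, deg C
coupling = 0.1                                  # how strongly boxes interact

def step(grid):
    """Advance one time step: mix each box with its four neighbors."""
    padded = np.pad(grid, 1, mode="edge")       # replicate edges outward
    neighbors = (padded[:-2, 1:-1] + padded[2:, 1:-1] +
                 padded[1:-1, :-2] + padded[1:-1, 2:]) / 4.0
    return grid + coupling * (neighbors - grid)

for _ in range(100):                            # march forward in time
    temps = step(temps)
```

Richardson did the equivalent square by square, by hand.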

Richardson’s experiment failed miserably. It “predicted” a dramatic rise in barometric pressure that hadn’t occurred and produced strange-looking weather patterns that didn’t resemble any seen in Germany before or since. Had he made a computational error? Were his equations buggy? It was hard to say. Even the most devoted weather nerds weren’t eager to solve differential equations for months on end to double-check his work for one day in one country six years in the past.

What Richardson needed, he thought, was more manpower. He envisioned a weather-forecasting center with some 64,000 meteorologists, all working simultaneously to have the computational speed to make accurate weather forecasts in real time. His dream came to fruition (sort of) in 1950, when the first computer weather forecast was tried by the mathematician John von Neumann and a team of scientists at the Institute for Advanced Study in Princeton, N.J. They used a machine that could make about 5,000 calculations a second, which was quite possibly as fast as 64,000 men. Alas, 5,000 calculations a second was no match for the weather. As it turned out, their forecast wasn’t much better than a random guess.

Our views about predictability are inherently flawed. Take something that is often seen as the epitome of randomness, like a coin toss. While it may at first appear that there’s no way to tell whether a coin is going to come up heads or tails, a group of mathematicians at Stanford is able to predict the outcome virtually 100 percent of the time, provided that they use a special machine to flip it. The machine does not cheat — it flips the coin the exact same way (the same height, with the same strength and torque) over and over again — and the coin is fair. Under those conditions, there is no randomness at all.

The reason that we view coin flips as unpredictable is because when we toss them, we’re never able to reproduce the exact same motion. A similar phenomenon applies to the weather. In the late 1950s, the renowned M.I.T. mathematician Edward Lorenz was toiling away in his original profession as a meteorologist. Then, in the tradition of Alexander Fleming and penicillin or the New York Knicks and Jeremy Lin, he made a major discovery purely by accident. At the time, Lorenz and his team were trying to advance the use of computer models in weather prediction. They were getting somewhere, or so they thought, until the computer started spitting out contradictory results. Lorenz and his colleagues began with what they believed were exactly the same data and ran what they thought was exactly the same code; still, the program somehow forecast clear skies over Colorado in one run and a thunderstorm in the next.

After spending weeks double-checking their hardware and trying to debug their code, Lorenz and his team discovered that their data weren’t exactly the same. The numbers had been rounded off in the third decimal place. Instead of having the barometric pressure in one corner of their grid read 29.5168, for example, it might instead read 29.517. This couldn’t make that much of a difference, could it? Actually, Lorenz realized, it could, and he devoted the rest of his career to studying strange behaviors like these by developing a branch of mathematics called chaos theory, the most basic tenet of which is described in the title of his breakthrough 1972 paper, “Predictability: Does the Flap of a Butterfly’s Wings in Brazil Set Off a Tornado in Texas?” In other words, a small change in initial conditions can produce a large and unexpected divergence in outcomes.
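Lorenz’s accident is easy to reproduce. The sketch below (our own toy integration of his published 1963 equations, not any forecasting code) starts two runs from values that differ only by a rounding-sized nudge, much like 29.5168 versus 29.517:

```python
# Lorenz's 1963 system, integrated twice with a crude Euler step. The only
# difference between the runs is a tiny nudge in one starting value, yet
# the final states bear no resemblance to each other.
import numpy as np

def deriv(s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = s
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def run(start, dt=0.01, steps=3000):
    s = np.array(start, dtype=float)
    for _ in range(steps):
        s = s + dt * deriv(s)        # forward Euler: rough but illustrative
    return s

print(run([1.0, 1.0, 1.0]))
print(run([1.0001, 1.0, 1.0]))       # rounded in the fourth decimal place
```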

Chaos theory does not imply that the behavior of the system is literally random. It just means that certain types of systems are very hard to predict. If you know the exact conditions of a coin as it leaves someone’s hand, you can — with the right laboratory equipment — predict, almost perfectly, which side it will land on. And yet the slightest disturbance to that motion can change a coin toss from being almost wholly predictable to almost wholly unpredictable.

The problem with weather is that our knowledge of its initial conditions is highly imperfect, both in theory and practice. A meteorologist at the National Oceanic and Atmospheric Administration told me that it wasn’t unheard-of for a careless forecaster to send in a 50-degree reading as 500 degrees. The more fundamental issue, though, is that we can observe our surroundings with only a certain degree of precision. No thermometer is perfect, and it isn’t physically possible to stick one into every molecule in the atmosphere.

Weather also has two additional properties that make forecasting even more difficult. First, weather is nonlinear, meaning that it abides by exponential rather than by arithmetic relationships. Second, it’s dynamic — its behavior at one point in time influences its behavior in the future. Imagine that we’re supposed to be taking the sum of 5 and 5, but we keyed in the second number as 6 by mistake. That will give us an answer of 11 instead of 10. We’ll be wrong, but not by much; addition, as a linear operation, is pretty forgiving. Exponential operations, however, extract a lot more punishment when there are inaccuracies in our data. If instead of taking 5^5 — which should be 3,125 — we instead take 5^6, we wind up with an answer of 15,625. This problem quickly compounds when the process is dynamic, because outputs at one stage of the process become our inputs in the next.
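The compounding is easy to see in miniature. In this sketch (our own illustration, using the textbook logistic map as a stand-in for weather dynamics), a slip in the third decimal place grows until the two runs no longer agree at all:

```python
# The article's arithmetic, then iterated. Addition forgives a one-unit
# slip; exponentiation punishes it; and feeding outputs back as inputs
# turns a third-decimal-place error into an order-one disagreement.
print(5 + 5, "vs", 5 + 6)        # 10 vs 11: off by 10 percent
print(5 ** 5, "vs", 5 ** 6)      # 3,125 vs 15,625: off by a factor of five

x, y = 0.400, 0.401              # two readings differing in the third decimal
for _ in range(20):              # dynamic: each output becomes the next input
    x, y = 3.9 * x * (1 - x), 3.9 * y * (1 - y)   # logistic map, chaotic regime
print(abs(x - y))                # the 0.001 gap is now of order 1
```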

Given how daunting the challenge was, it must have been tempting to give up on the idea of building a dynamic weather model altogether. A thunderstorm might have remained roughly as unpredictable as an earthquake. But by embracing the uncertainty of the problem, meteorologists’ predictions started to make progress. “What may have distinguished [me] from those that preceded,” Lorenz later reflected in “The Essence of Chaos,” his 1993 book, “was the idea that chaos was something to be sought rather than avoided.”

Perhaps because chaos theory has been a part of meteorological thinking for nearly four decades, professional weather forecasters have become comfortable treating uncertainty the way a stock trader or poker player might. When weather.gov says that there’s a 20 percent chance of rain in Central Park, it’s because the National Weather Service recognizes that our capacity to measure and predict the weather is accurate only up to a point. “The forecasters look at lots of different models: Euro, Canadian, our model — there’s models all over the place, and they don’t tell the same story,” Ben Kyger, a director of operations for the National Oceanic and Atmospheric Administration, told me. “Which means they’re all basically wrong.” The National Weather Service forecasters who adjusted temperature gradients with their light pens were merely interpreting what was coming out of those models and making adjustments themselves. “I’ve learned to live with it, and I know how to correct for it,” Kyger said. “My whole career might be based on how to interpret what it’s telling me.”
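In principle, that is also where a number like a 20 percent chance of rain comes from: perturb the imperfect observations, run the model once per perturbation and count. A hedged sketch, with an invented toy model standing in for the real thing:

```python
# Ensemble forecasting in miniature: jitter the imperfect observation,
# run the (invented) model once per ensemble member, report the fraction
# of members that produce rain.
import random

def toy_model(humidity):
    """Stand-in dynamics; call it 'rain' if the final state runs high."""
    x = humidity
    for _ in range(50):
        x = 3.9 * x * (1 - x)
    return x > 0.8

measured = 0.52                  # the best available (imperfect) reading
members = [toy_model(measured + random.gauss(0, 0.01)) for _ in range(1000)]
print(f"chance of rain: {sum(members) / len(members):.0%}")
```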

Despite their astounding ability to crunch numbers in nanoseconds, there are still things that computers can’t do, contends Hoke at the National Weather Service. They are especially bad at seeing the big picture when it comes to weather. They are also too literal, unable to recognize the pattern once it’s subjected to even the slightest degree of manipulation. Supercomputers, for instance, aren’t good at forecasting atmospheric details in the center of storms. One particular model, Hoke said, tends to forecast precipitation too far south by around 100 miles under certain weather conditions in the Eastern United States. So whenever forecasters see that situation, they know to forecast the precipitation farther north.

But there are literally countless other areas in which weather models fail in more subtle ways and rely on human correction. Perhaps the computer tends to be too conservative on forecasting nighttime rainfalls in Seattle when there’s a low-pressure system in Puget Sound. Perhaps it doesn’t know that the fog in Acadia National Park in Maine will clear up by sunrise if the wind is blowing in one direction but can linger until midmorning if it’s coming from another. These are the sorts of distinctions that forecasters glean over time as they learn to work around potential flaws in the computer’s forecasting model, in the way that a skilled pool player can adjust to the dead spots on the table at his local bar.

Among the National Weather Service’s detailed records is a thorough comparison of how well the computers are doing by themselves alongside the value that humans are contributing. According to the agency’s statistics, humans improve the accuracy of precipitation forecasts by about 25 percent over the computer guidance alone. They improve the temperature forecasts by about 10 percent. Humans are good enough, in fact, that when the organization’s Cray supercomputer caught fire in 1999, their high-temperature forecasts remained remarkably accurate. “You almost can’t have a meeting without someone mentioning the glory days of the Cray fire,” Kyger said, pointing to a mangled, half-burnt piece of the computer that was proudly displayed in the office where I met him. “If you weren’t here for that, you really weren’t part of the brotherhood.”

Still, most people take their forecasts for granted. Like a baseball umpire, a weather forecaster rarely gets credit for getting the call right. Last summer, meteorologists at the National Hurricane Center were tipped off to something serious when nearly all their computer models indicated that a fierce storm was going to be climbing the Northeast Corridor. The eerily similar results between models helped the center amplify its warning for Hurricane Irene well before it touched down on the Atlantic shore, prompting thousands to evacuate their homes. To many, particularly in New York, Irene was viewed as a media-manufactured nonevent, but that was largely because the Hurricane Center nailed its forecast. Six years earlier, the National Weather Service also made a nearly perfect forecast of Hurricane Katrina, anticipating its exact landfall almost 60 hours in advance. If public officials hadn’t bungled the evacuation of New Orleans, the death toll might have been remarkably low.

In a time when forecasters of all types make overconfident proclamations about political, economic or natural events, uncertainty is a tough sell. It’s much easier to hawk overconfidence, no matter if it’s any good. A long-term study of political forecasts conducted by Philip Tetlock, a professor at the University of Pennsylvania, found that when political experts described an event as being absolutely certain, it failed to transpire an astonishing 25 percent of the time.

The Weather Service has struggled over the years with how much to let the public in on what it doesn’t exactly know. In April 1997, Grand Forks, N.D., was threatened by the flooding Red River, which bisects the city. Snowfall had been especially heavy in the Great Plains that winter, and the service, anticipating runoff as the snow melted, predicted that the Red would crest at 49 feet, close to the record. Because the levees in Grand Forks were built to handle a flood of 52 feet, a small miss in the forecast could prove catastrophic. The margin of error on the Weather Service’s forecast — based on how well its flood forecasts had done in the past — implied about a 35 percent chance of the levees’ being topped.
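The 35 percent figure can be reconstructed once one assumes a shape for the forecast error. The sketch below treats the error as roughly normal with a spread of about eight feet; that spread is our assumption, chosen purely to make the published numbers come out, not a figure taken from the Weather Service:

```python
# With a 49-foot crest forecast, 52-foot levees and historical errors
# treated as normal with an assumed 7.8-foot standard deviation, the
# chance of the water topping the levees is about 35 percent.
from statistics import NormalDist

forecast_ft, levees_ft, error_sd_ft = 49.0, 52.0, 7.8
chance_topped = 1 - NormalDist(mu=forecast_ft, sigma=error_sd_ft).cdf(levees_ft)
print(f"{chance_topped:.0%}")    # -> 35%
```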

The waters, in fact, crested at 54 feet. It was well within the forecast’s margin of error, but enough to overcome the levees and spill more than two miles into the city. Cleanup costs ran into the billions of dollars, and more than 75 percent of the city’s homes were damaged or destroyed. Unlike a hurricane or an earthquake, the Grand Forks flood may have been preventable. The city’s flood walls could have been reinforced using sandbags. It might also have been possible to divert the overflow into depopulated areas. But the Weather Service had explicitly avoided communicating the uncertainty in its forecast to the public, emphasizing only the 49-foot prediction. The forecasters later told researchers that they were afraid the public might lose confidence in the forecast if they had conveyed any uncertainty.

Since then, the National Weather Service has come to recognize the importance of communicating the uncertainty in its forecasts as completely as possible. “Uncertainty is the fundamental component of weather prediction,” said Max Mayfield, an Air Force veteran who ran the National Hurricane Center when Katrina hit. “No forecast is complete without some description of that uncertainty.” Under Mayfield’s guidance, the National Hurricane Center began to pay much more attention to how it presents its forecasts. Instead of just showing a single track line for a hurricane’s predicted path, their charts prominently feature a cone of uncertainty, which many in the business call “the cone of chaos.”

Unfortunately, this cautious message can be undercut by private-sector forecasters. Catering to the demands of viewers can mean intentionally running the risk of making forecasts less accurate. For many years, the Weather Channel avoided forecasting an exact 50 percent chance of rain, which might seem wishy-washy to consumers. Instead, it rounded up to 60 or down to 40. In what may be the worst-kept secret in the business, numerous commercial weather forecasts are also biased toward forecasting more precipitation than will actually occur. (In the business, this is known as the wet bias.) For years, when the Weather Channel said there was a 20 percent chance of rain, it actually rained only about 5 percent of the time.
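A wet bias is exactly what a calibration check exposes: bucket past forecasts by the probability they announced, then compare that number with how often rain actually fell. A minimal sketch with invented records:

```python
# Calibration check: for each stated probability, how often did it rain?
# A well-calibrated forecaster's two columns match; a wet bias shows up
# as stated probabilities sitting well above observed frequencies.
from collections import defaultdict

records = [(0.2, False), (0.2, False), (0.2, False), (0.2, True),
           (0.6, True), (0.6, False), (0.6, True)]   # (stated prob, rained?)

buckets = defaultdict(list)
for stated, rained in records:
    buckets[stated].append(rained)

for stated, outcomes in sorted(buckets.items()):
    observed = sum(outcomes) / len(outcomes)
    print(f"said {stated:.0%}, rained {observed:.0%} of the time")
```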

People don’t mind when a forecaster predicts rain and it turns out to be a nice day. But if it rains when it isn’t supposed to, they curse the weatherman for ruining their picnic. “If the forecast was objective, if it has zero bias in precipitation,” Bruce Rose, a former vice president for the Weather Channel, said, “we’d probably be in trouble.”

The National Weather Service, on the other hand, takes plenty of blame when its cautious forecasts seem retrospectively unwarranted. I was reminded of this when I arrived in Tampa for the Republican National Convention. The city was briefly in Hurricane Isaac’s cone of chaos before the storm took a westward tack. The airport and roads were remarkably quiet, no doubt because some reporters and delegates (and thousands of tourists) heeded caution and stayed home. When I so much as mentioned the weather forecast, my taxi driver turned and launched into a series of obscenities.

This article is adapted from “The Signal and the Noise: Why So Many Predictions Fail — but Some Don’t,” to be published this month by Penguin Press.


Nate Silver runs the FiveThirtyEight blog. His book on forecasting and prediction, “The Signal and the Noise,” will be published this month.

Editor: Jon Kelly
