Washington Post

March 19, 2012

U.S. accelerating cyberweapon research

By Ellen Nakashima

The Pentagon is accelerating efforts to develop a new generation of cyberweapons capable of disrupting enemy military networks even when those networks are not connected to the Internet, according to current and former U.S. officials.

The possibility of a confrontation with Iran or Syria has highlighted for American military planners the value of cyberweapons that can be used against an enemy whose most important targets, such as air defense systems, do not rely on Internet-based networks. But adapting such cyberweapons can take months or even years of arduous technical work.

When U.S. military planners were looking for ways to disable Libya’s air defense system before NATO’s aerial attacks last year, they discussed using cybertechnology. But the idea was quickly dismissed because no effective option was available, said current and former U.S. officials.

They estimated that crafting a cyberweapon would have taken about a year, including the time needed to assess the target system for vulnerabilities.

“We weren’t ready to do that in Libya,” said a former U.S. official, who spoke on the condition of anonymity because of the sensitivity of the discussions. “We’re not ready to do that now, either.”

Last year, to speed up the development of cyberweapons, as well as defensive technology, then-Deputy Defense Secretary William J. Lynn III and Marine Corps Gen. James Cartwright, then vice chairman of the Joint Chiefs of Staff, placed $500 million over five years into the budget of the Defense Advanced Research Projects Agency, one of the Defense Department’s premier research organizations.

The agency also has launched new cyber-development initiatives, including a “fast-track” program.

“We need cyber options that can be executed at the speed, scale and pace” of other military weapons, Kaigham J. Gabriel, DARPA deputy director, said in testimony last month to Congress.

Pentagon officials, meanwhile, are developing a congressionally mandated strategy for the rapid acquisition of cyberweapons that can keep pace with threats and technology.

Officials are researching cyberweapons that can target “offline” military systems in part by harnessing emerging technology that uses radio signals to insert computer coding into networks remotely.

“To affect a system, you have to have access to it, and we have not perfected the capability of reaching out and accessing a system at will that is not connected to the Internet,” said Joel Harding, an independent consultant who is a former military officer and former director of the Information Operations Institute.

Even if an operator gains access, he said, “unless you already have custom-written code for a system, chances are we don’t have a weapon for that because each system has different software and updates.”

In some cases, as with command-and-control systems, military assets rely on Internet connections, making them theoretically easier to target.

Without that connectivity, an attacker would have to rely on other means — for instance, physically inserting into those systems portable devices such as thumb drives or computer components that have been altered.

But such approaches lack the control and predictability that military commanders desire, experts say.

The amount of disclosed spending by the Pentagon on cybersecurity and cybertechnology — offensive and defensive — is $3.4 billion this year. The U.S. Cyber Command, based at Fort Meade, was created in 2010 and has a budget of $154 million this year.

U.S. officials say that existing cyberweaponry has the potential to disable components of a weapon system, although it is not likely to destroy the system.

Cyber tools might be used in conjunction with other tactics and weapons. Cybertechnology might, for example, enable an attack by delaying enemy recognition of it until it is underway.

“It will probably never be just a standalone cyberattack on a network,” said Lt. Gen. Charles R. Davis, commander of the Electronic Systems Center at Hanscom Air Force Base, who buys the tools and software that support the Air Force’s offensive and defensive cyber activities.

Cybertechnology was not a significant factor in military operations 10 years ago, Gen. Martin Dempsey, chairman of the Joint Chiefs of Staff, said during an Atlantic Council discussion in December. “Cyber is a significant factor today.”

In Iraq, during the 2007 surge of U.S. combat forces, the National Security Agency used cyber tools to muddle the signals of the cellphones and laptop computers that insurgents used to coordinate their strikes, according to previously published reports confirmed by former U.S. officials. U.S. cyber operators used those techniques to deceive the enemy with false information, in some cases leading fighters into an ambush by U.S. troops.

But countering Libya’s air defenses was a different story. The operation arose quickly. Officials had not foreseen the Arab Spring uprising against Libyan strongman Moammar Gaddafi, and no intelligence and engineering work had been done to exploit the vulnerabilities of the Libyan air defense system.

Some experts believe that Israel may have used a cyberweapon to blind Syrian radar before bombing a suspected nuclear facility in September 2007, but several former U.S. officials say the technique used was more likely conventional electronic warfare, or radar jamming using signals emitted from an airplane.

The Stuxnet computer virus that reportedly disabled some 900 centrifuges in an Iranian uranium-enrichment plant in 2009 and 2010 — while it has been dubbed by control-system expert Ralph Langner as the world’s “first digital warhead” — lacked the precision, predictability and control that a military commander would need during combat, experts said.

“If I’m trying to knock down an air defense system, I have to know precisely what’s going to happen and when it will happen,” said a former military official. “It’s a fundamentally different approach than Stuxnet.”

DARPA plans to focus an increasing portion of its cyber research on “offensive capabilities to address military-specific needs,” Gabriel said recently in testimony before the House Armed Services subcommittee on emerging threats and capabilities.

Over the past decade, instances have been reported in which cyber tools were contemplated but not used because of concern they would result in collateral damage. For instance, defense and intelligence agencies discussed using cybertechnology to freeze money in Iraqi dictator Saddam Hussein’s bank accounts just before the U.S.-led invasion in March 2003 to blunt his efforts to mount a defense. The plan was aborted because of concern that the cyberattack could disrupt financial systems in Europe and beyond.

Within a war zone, the use of a cyberweapon may be limited by other considerations. There is the danger of collateral damage to civilian systems, such as disrupting a power supply to a hospital. A destructive computer code, once released, could be reverse-engineered and sent back at vulnerable U.S. targets or adapted for use by foreign spy agencies. Cybertechnology also is not always the most efficient way to attack a target — sometimes bombs or electronic warfare are easier or more reliable.

Within the Pentagon, more money is being spent on defending against cyberattacks than on preparing to deploy offensive cyber operations, officials say. That is appropriate, they say, when adversaries are trying to develop similar cyberweapons to use against U.S. military targets that may not be secure against attack and when Pentagon networks are probed thousands of times daily.

But more money needs to be spent on developing cyberweapons, say some former officials. “You’ve got to start moving investment to the offensive side,” Cartwright said.

Pentagon spending on cybertechnology is growing even as other areas of its budget are shrinking, officials say.

“I am still not remotely satisfied with where we are in cyber,” Deputy Secretary of Defense Ashton B. Carter said at the Credit Suisse and McAleese and Associates defense conference in Arlington this month.

“I dare say,” he said, “we’d spend a lot more if we could figure out where to spend it.”


Defense News

CYBERCOM Arming U.S. Combatant Commands

Mar. 21, 2012 – 04:15PM



Once entirely controlled by the U.S. National Security Agency (NSA), offensive cyber weapons are making their way into the hands of the U.S. military’s geographic combatant commanders.

The effort was alluded to by the NSA and the U.S. Cyber Command (CYBERCOM) chief, Army Gen. Keith Alexander, as part of congressional testimony March 20, and confirmed by sources. It means that combatant commanders will be able to employ the weapons as part of overall mission planning, pairing traditional kinetic attacks with newly developed cyber capabilities.

CYBERCOM will establish Cyber Support Elements (CSEs) at all six geographic combatant commands, Alexander said in a written statement delivered to the House Armed Services emerging threats and capabilities subcommittee as part of a routine budget hearing.

Thus far, U.S. Central Command is the only command with a fully operational deployment, while U.S. Pacific Command (PACOM) has a partial deployment, a CYBERCOM spokesman confirmed.

These support elements will provide both technical capability and expertise, part of an effort to improve the integration of cyber attack capabilities, a source with knowledge of the efforts said.

“We are currently working closely with two of the geographic combatant commanders,” Alexander wrote. “Our goal is to ensure that a commander with a mission to execute has a full suite of cyber-assisted options from which to choose, and that he can understand what effects they will produce for him.”

A CYBERCOM spokesman confirmed that these options include offensive capabilities as well as defensive capabilities designed to protect systems, but said the details of the offensive capabilities are classified.

A source with knowledge of the effort at PACOM said the process is in its infancy there, as the infrastructure is still being developed and the integration of CYBERCOM personnel into mission planning is still being determined.

Providing capability to combatant commanders will notably differ from the current operational structure, in which most commands must coordinate with CYBERCOM, which in turn deploys cyber capabilities.

Before CYBERCOM was stood up in 2010, offensive capabilities resided with the NSA. But the transition away from reliance on the intelligence agency and toward localized capability is a logical progression, said Chris Coleman, director of cybersecurity for the public sector at Cisco, as the NSA was never intended to engage in combat.


“The NSA is an intelligence agency, so the fact that they’re transitioning combat tools over to CYBERCOM and eventually the combatant commanders makes perfect sense,” he said. “It’s what they should be doing.”

Alexander did not refer to the effort during his oral testimony, nor was he asked about it by subcommittee members. Instead, Alexander described the continuing growth of cybersecurity threats, and the members focused on responsibility for protecting public companies, as the Department of Homeland Security is looking to shoulder more of this burden.


But in his written testimony, Alexander focused on the concept of deterrence through improved attack capability, while avoiding the terms “offense” and “offensive.” DoD officials have been reticent to use the terms, given some of the legal ambiguity surrounding the use of cyber weapons.

“Cyber Command exists to ensure that the President can rely on the information systems of the Department of Defense and has military options available for consideration when and if he needs to defend the nation in cyberspace,” he wrote. “Our capabilities represent key components of deterrence.”

Alexander testified with Teresa Takai, the Pentagon’s chief information officer, and Madelyn Creedon, the assistant secretary of defense for global strategic affairs. All three said DoD is in the process of developing rules of engagement for cyber and should conclude the process soon.

Alexander’s testimony also mentioned that CSEs are expected to be deployed at U.S. Africa Command and U.S. Southern Command within the next six months.



Microsoft Said to Finish Windows 8 in Summer, With October Debut

By Dina Bass – Mar 20, 2012

Microsoft Corp. (MSFT) will finish work on Windows 8 this summer, setting the stage for personal computers and tablets with the operating system to go on sale around October, according to people with knowledge of the schedule.

The initial rollout will include devices running Intel Corp. (INTC) and ARM Holdings Plc (ARM) chips, making good on Microsoft’s promise to support both standards, said the people, who declined to be named because the plans are confidential. In embracing ARM technology, Microsoft is using the same kind of processors as Apple Inc.’s iPad. Still, there will be fewer than five ARM devices in the debut, compared with more than 40 Intel machines.

The timing would let Microsoft target Christmas shoppers with the new software, which works with touch-screen devices as well as laptops and desktop PCs. The Redmond, Washington-based company, which hasn’t announced timing for the Windows 8 release, aims to take back sales lost to the iPad and reinvigorate the sluggish PC market. Apple (AAPL) released the third version of the iPad this month, posing an even stiffer challenge to Microsoft.

“If they miss the September-October time frame, they’re going to be stuck without being able to ship anything in 2012,” said Michael Gartenberg, an analyst at Stamford, Connecticut-based Gartner Inc. “The last thing Microsoft wants to have is a situation where there are no compelling Windows tablets at a time when the new iPad looks like it’s going to be a good seller for the holidays.”

April Event

Microsoft will host an event for its industry partners in early April, the people said. The company will spell out its release strategy for Windows 8, giving more details on timing and marketing, they said.

There will be fewer ARM-based devices in the rollout because Microsoft has tightly controlled the number and set rigorous quality-control standards, said one of the people. The new version of Windows will be the first to use ARM processors, which are most commonly found in smartphones. Windows 7, the current version, only works with Intel’s technology. Three of the Windows 8 ARM devices will be tablets, the people said.

Mark Martin, a spokesman for Microsoft, declined to comment, as did Intel’s Jon Carvill. Representatives of Nvidia Corp. (NVDA), Texas Instruments Inc. (TXN) and Qualcomm Inc. (QCOM), the three chip companies that are supplying ARM-based processors to the makers of the new computers, referred questions on the availability of devices to Microsoft.

More than 103 million tablet devices will be sold in 2012, with sales rising to 326.3 million in 2015, according to Gartner. For now, Apple remains dominant, accounting for two-thirds of the market in 2012, Gartner estimates. The company’s share will drop to 46 percent by 2015, while Microsoft’s percentage will climb to 11 percent.

Difficult Adjustment

Getting Windows 8 ready for Intel processors is the easier part — Windows has always run on Intel chips. The full version of Windows has never run on ARM products, meanwhile, so that process is taking longer. When Microsoft released a test version of Windows 8 for developers in September, the software only ran on Intel technology.

Steven Sinofsky, president of the Windows business, has said repeatedly that Microsoft plans to have both ARM and Intel-based systems available when Windows 8 is released.

“Our collective goal is for them to ship at the same time,” he said in an interview last month. “I wouldn’t be saying it’s a goal if I didn’t think we could do it.”




Yes, We Can Contain Iran

By ruling out the possibility of deterring a nuclear Iran, President Obama is needlessly increasing the risks of a ruinous war.


Foreign Policy



U.S. President Barack Obama, under pressure from Israel and American conservatives to take a harder line on Iran, keeps insisting that “all options are on the table.” That’s a diplomatic way of saying that the United States is willing to use force to keep Iran from getting nuclear weapons.

To buttress this thinly veiled threat, however, Obama recently took one important option off the table: deterrence. In an interview with the Atlantic, he ruled out “containing” a nuclear Iran in the same way the United States has contained other unfriendly nuclear powers — by threatening the country with massive retaliation if it attacks us or our allies.

This is a significant — and needless — change in U.S. foreign policy. It raises the likelihood of war with Iran, despite Obama’s preference for a diplomatic solution. And launching air strikes on Tehran’s nuclear facilities would undercut America’s ability to play the long game in Iran by abetting a “Persian Spring” that could eventually topple the Islamic Republic.

No sane person wants to see Iran’s theocrats get their hands on nuclear weapons. Nonetheless, the United States didn’t attack the Soviet Union or “Red” China — far more formidable adversaries — to keep them from getting the bomb. Later, when India, Pakistan and North Korea barged into the nuclear club, U.S. leaders expressed their displeasure with political and economic sanctions rather than military attacks. And we are safer for it.

So why should Washington now regard Iran’s nuclear ambitions as a casus belli? Some say that going nuclear would embolden Iran’s rulers to make good on their threats to “wipe Israel off the map.” Obama, however, doesn’t subscribe to the “crazy mullah” theory — in the same interview with the Atlantic, he made the case that Iran’s leaders “care about the regime’s survival” and would make pragmatic decisions to avoid its destruction. Obama’s biggest fear is a nuclear arms race breaking out in the world’s most volatile region.

In an age of terrorism inspired by religious fanaticism, checking the spread of weapons of mass destruction is a vital U.S. and global interest. But you’d think that, having just extricated the United States from Iraq, this administration would be leery of using nonproliferation as a rationale for another U.S. intervention in the Middle East.

By taking deterrence off the table, Obama is upping the stakes in this confrontation. He is saying, in effect, that the United States can’t live with a nuclear-armed Iran. This may have the tactical effect of turning up the heat on Tehran, but it also paints the United States into a corner. If diplomatic and economic pressures fail to curb Iran’s nuclear ambitions, Obama will be left with no option but to use force, or see his bluff called and America’s credibility shattered.

Some have interpreted Obama’s “no-containment” stance as a sop to Israel. Prime Minister Benjamin Netanyahu, warning that the clock is running out on stopping Iran’s nuclear program, pressed Obama last week to define clear “red lines” and deadlines for action against Tehran. What he got instead was Obama’s assurances that the United States aims to prevent Iran from making nuclear weapons, not to contain it afterwards — along with admonitions to give tightening economic sanctions more time to work. Meanwhile, American conservatives complained that Obama’s real strategy is to forestall an Israeli attack on Iran before the November presidential election.

Meanwhile, the GOP presidential aspirants (except the resolutely non-interventionist Ron Paul) have been whipping up war fever. They accuse Obama of being soft on Iran — “feckless,” writes the nouveau hard-liner Mitt Romney — and demand that he issue ultimatums to Tehran to surrender or take a pounding. Fortunately, there’s zero evidence that Americans are pining for a return to George W. Bush’s style of unilateral belligerence. On the contrary, the public gives Obama high marks for resetting U.S. relations with the rest of the world.

Still, Obama is sensitive to GOP claims that he’s been insufficiently supportive of Israel. In his speech this month before the American Israel Public Affairs Committee, he pointed to his success in orchestrating an international campaign to deprive Iran of access to the global financial system and reduce its oil exports. Tehran has vowed it won’t bow to economic pressure, even as its currency craters. Nonetheless, the regime last week agreed to reopen talks with the world’s major powers aimed at reaching a political settlement.

It’s quite possible the mullahs are stalling for time. In any case, the United States shouldn’t limit its options for dealing with Tehran. The history of nuclear proliferation shows that the United States has never forcibly stopped another country from going nuclear. U.S. airstrikes could set back Iran’s enrichment program, but America can’t stand watch over the country in perpetuity. What’s more, a U.S. attack could unite the regime and the opposition Green Movement, which also insists on Iran’s right to develop civilian nuclear energy.

This might be the worst outcome of all. In the long run, the best bet for defusing the threats posed by a nuclear Iran is a new government in Tehran, constrained by truly representative institutions and the rule of law. A firm policy of deterrence, unlike a fleeting military strike, could hasten such positive political change.



DHS to Conduct Resiliency Assessment of East Coast Pipelines

HSToday By: Mickey McCarter

03/20/2012 (12:00am)

An infrastructure risk analysis center at the Department of Homeland Security (DHS) soon will undertake a study of two major east coast pipelines to determine their vulnerability to disruptions due to terrorist attack or natural disasters, the center’s director told a House panel Monday.

The DHS Homeland Infrastructure Threat and Risk Analysis Center (HITRAC) will undertake the analysis in part due to the scheduled idling of three Pennsylvania refineries, Brandon Wales, HITRAC director, testified before a field hearing of the House Homeland Security counterterrorism subcommittee in Aston, Pa.

Two Philadelphia-area refineries have been idled and another will close by July, raising concerns from Rep. Patrick Meehan (R-Pa.), subcommittee chair, that the loss of production capacity for oil and natural gas could present homeland security challenges if the east coast were to experience a fuel shortage after a terrorist attack or natural disaster.

Wales assured Meehan that alternate sources of fuels, including the Colonial Pipeline, foreign shipments and extra capacity from the Gulf coast, would prevent supply shortages after closure of the three refineries. Sunoco has closed two Philadelphia-area refineries and ConocoPhillips has announced the closure of a third by July.

Although the three refineries combined represent 50 percent of east coast fuels capacity, their closings should not have a significant homeland security impact, according to a HITRAC analysis.

The interconnectedness of the oil and natural gas network of refineries, pipelines and terminals represents a network that could experience disruptions when faced with a terrorist attack or natural disaster, particularly as an interruption in one part of the system could cascade to other parts, Wales acknowledged.

But Wales expressed confidence that “the system is dynamic” and the flow of oil or natural gas could be diverted, companies could surge production, and consumers could limit consumption to offset disruptions.

In 2011, HITRAC completed a model of the national system for fuel transportation, Wales said. With the model, HITRAC is able to estimate the effects of damage or disruption to system components. An initial analysis of the three refinery closures suggests they present a negligible impact on overall fuel availability, as sufficient supply exists along the Gulf coast and other areas.

HITRAC tested its analysis against disruptions caused by a major hurricane hitting the Gulf coast. In that event, supply shortages would occur whether or not the three Pennsylvania refineries remained open. Supply disruptions were unlikely to occur north of Washington, DC, and shipments from Europe and other areas would compensate for disruptions in the short term. In addition, the Environmental Protection Agency waives certain rules governing gasoline additives and the like after some hurricanes to mitigate potential fuel shortages.

Localized shortages of specific fuels could occur, Wales said, but HITRAC analyses do not examine very localized availability of specific fuels. The center also does not project the effect of disruptions on the prices of individual commodities but rather examines the supply to specific regions after potential disruptions.

Meehan voiced concerns that US oil and natural gas infrastructure remains a favored target of Al Qaeda and other terrorists. Al Qaeda plots in 2006 and 2007 targeted US fuel infrastructure, and the group even supported a conspiracy to attack an oil pipeline in Pennsylvania. Furthermore, intelligence gathered from last year’s raid on the Pakistan compound of Osama Bin Laden suggested Al Qaeda has a continued interest in attacking oil and natural gas installations.

HITRAC has examined the potential threat to oil and natural gas infrastructure posed by international terrorist groups, Wales responded. The DHS Office of Infrastructure Protection has conducted more than 60 vulnerability assessments on national pipelines in the past several years. In addition, the infrastructure office has worked with states and localities on about 80 buffer zone protection plans that include integrated security planning related to pipelines.

Moreover, DHS has awarded up to $10 million in grant funds to local communities for pipeline security, Wales added.

HITRAC will build upon that work with its regional resiliency assessment of the Colonial Pipeline and the Plantation Pipeline throughout 2012 and 2013, Wales said. That assessment will include an examination of potential chokepoints such as pumping stations and a look at related infrastructure like electricity connectivity.

An initial assessment by HITRAC indicates that short-term disruptions to pipelines would not have a major impact. But prolonged disruptions of more than one or two weeks could sap excess inventory and begin to cause widespread fuel shortages, Wales said. The oil and natural gas industries have demonstrated the capability to repair damage to pipelines pretty quickly, however.

Rep. John Carney (D-Del.), who sat in on the hearing as a guest of the subcommittee, nonetheless pressed the point that a more distributed network is less vulnerable to disruption than a system concentrated in one geographic area, such as the Gulf coast. In addition, Carney questioned the US ability to secure fuels imported from overseas, where it cannot exercise protection over sovereign territory.

HITRAC has not yet analyzed such concerns, Wales acknowledged.

HITRAC, which analyzes risks to critical infrastructure and shares that information with the private sector, supports homeland security exercises, planning and responses.


The NSA Is Building the Country’s Biggest Spy Center (Watch What You Say)


By James Bamford

March 15, 2012


The spring air in the small, sand-dusted town has a soft haze to it, and clumps of green-gray sagebrush rustle in the breeze. Bluffdale sits in a bowl-shaped valley in the shadow of Utah’s Wasatch Range to the east and the Oquirrh Mountains to the west. It’s the heart of Mormon country, where religious pioneers first arrived more than 160 years ago. They came to escape the rest of the world, to understand the mysterious words sent down from their god as revealed on buried golden plates, and to practice what has become known as “the principle,” marriage to multiple wives.

Today Bluffdale is home to one of the nation’s largest sects of polygamists, the Apostolic United Brethren, with upwards of 9,000 members. The brethren’s complex includes a chapel, a school, a sports field, and an archive. Membership has doubled since 1978—and the number of plural marriages has tripled—so the sect has recently been looking for ways to purchase more land and expand throughout the town.

But new pioneers have quietly begun moving into the area, secretive outsiders who say little and keep to themselves. Like the pious polygamists, they are focused on deciphering cryptic messages that only they have the power to understand. Just off Beef Hollow Road, less than a mile from brethren headquarters, thousands of hard-hatted construction workers in sweat-soaked T-shirts are laying the groundwork for the newcomers’ own temple and archive, a massive complex so large that it necessitated expanding the town’s boundaries. Once built, it will be more than five times the size of the US Capitol.

Rather than Bibles, prophets, and worshippers, this temple will be filled with servers, computer intelligence experts, and armed guards. And instead of listening for words flowing down from heaven, these newcomers will be secretly capturing, storing, and analyzing vast quantities of words and images hurtling through the world’s telecommunications networks. In the little town of Bluffdale, Big Love and Big Brother have become uneasy neighbors.

The NSA has become the largest, most covert, and potentially most intrusive intelligence agency ever.

Under construction by contractors with top-secret clearances, the blandly named Utah Data Center is being built for the National Security Agency. A project of immense secrecy, it is the final piece in a complex puzzle assembled over the past decade. Its purpose: to intercept, decipher, analyze, and store vast swaths of the world’s communications as they zap down from satellites and zip through the underground and undersea cables of international, foreign, and domestic networks. The heavily fortified $2 billion center should be up and running in September 2013. Flowing through its servers and routers and stored in near-bottomless databases will be all forms of communication, including the complete contents of private emails, cell phone calls, and Google searches, as well as all sorts of personal data trails—parking receipts, travel itineraries, bookstore purchases, and other digital “pocket litter.” It is, in some measure, the realization of the “total information awareness” program created during the first term of the Bush administration—an effort that was killed by Congress in 2003 after it caused an outcry over its potential for invading Americans’ privacy.

But “this is more than just a data center,” says one senior intelligence official who until recently was involved with the program. The mammoth Bluffdale center will have another important and far more secret role that until now has gone unrevealed. It is also critical, he says, for breaking codes. And code-breaking is crucial, because much of the data that the center will handle—financial information, stock transactions, business deals, foreign military and diplomatic secrets, legal documents, confidential personal communications—will be heavily encrypted. According to another top official also involved with the program, the NSA made an enormous breakthrough several years ago in its ability to cryptanalyze, or break, unfathomably complex encryption systems employed by not only governments around the world but also many average computer users in the US. The upshot, according to this official: “Everybody’s a target; everybody with communication is a target.”

For the NSA, overflowing with tens of billions of dollars in post-9/11 budget awards, the cryptanalysis breakthrough came at a time of explosive growth, in size as well as in power. Established as an arm of the Department of Defense following Pearl Harbor, with the primary purpose of preventing another surprise assault, the NSA suffered a series of humiliations in the post-Cold War years. After the agency was caught off guard by an escalating series of terrorist attacks—the first World Trade Center bombing, the blowing up of US embassies in East Africa, the attack on the USS Cole in Yemen, and finally the devastation of 9/11—some began questioning its very reason for being. In response, the NSA has quietly been reborn. And while there is little indication that its actual effectiveness has improved—after all, despite numerous pieces of evidence and intelligence-gathering opportunities, it missed the near-disastrous attempted attacks by the underwear bomber on a flight to Detroit in 2009 and by the car bomber in Times Square in 2010—there is no doubt that it has transformed itself into the largest, most covert, and potentially most intrusive intelligence agency ever created.

In the process—and for the first time since Watergate and the other scandals of the Nixon administration—the NSA has turned its surveillance apparatus on the US and its citizens. It has established listening posts throughout the nation to collect and sift through billions of email messages and phone calls, whether they originate within the country or overseas. It has created a supercomputer of almost unimaginable speed to look for patterns and unscramble codes. Finally, the agency has begun building a place to store all the trillions of words and thoughts and whispers captured in its electronic net. And, of course, it’s all being done in secret. To those on the inside, the old adage that NSA stands for Never Say Anything applies more than ever.


When construction is completed in 2013, the heavily fortified $2 billion facility in Bluffdale will encompass 1 million square feet.

1 Visitor control center

A $9.7 million facility for ensuring that only cleared personnel gain access.

2 Administration

Designated space for technical support and administrative personnel.

3 Data halls

Four 25,000-square-foot facilities house rows and rows of servers.

4 Backup generators and fuel tanks

Can power the center for at least three days.

5 Water storage and pumping

Able to pump 1.7 million gallons of liquid per day.

6 Chiller plant

About 60,000 tons of cooling equipment to keep servers from overheating.

7 Power substation

An electrical substation to meet the center’s estimated 65-megawatt demand.

8 Security

Video surveillance, intrusion detection, and other protection will cost more than $10 million.

Source: U.S. Army Corps of Engineers Conceptual Site plan

A swath of freezing fog blanketed Salt Lake City on the morning of January 6, 2011, mixing with a weeklong coating of heavy gray smog. Red air alerts, warning people to stay indoors unless absolutely necessary, had become almost daily occurrences, and the temperature was in the bone-chilling twenties. “What I smell and taste is like coal smoke,” complained one local blogger that day. At the city’s international airport, many inbound flights were delayed or diverted while outbound regional jets were grounded. But among those making it through the icy mist was a figure whose gray suit and tie made him almost disappear into the background. He was tall and thin, with the physique of an aging basketball player and dark caterpillar eyebrows beneath a shock of matching hair. Accompanied by a retinue of bodyguards, the man was NSA deputy director Chris Inglis, the agency’s highest-ranking civilian and the person who ran its worldwide day-to-day operations.

A short time later, Inglis arrived in Bluffdale at the site of the future data center, a flat, unpaved runway on a little-used part of Camp Williams, a National Guard training site. There, in a white tent set up for the occasion, Inglis joined Harvey Davis, the agency’s associate director for installations and logistics, and Utah senator Orrin Hatch, along with a few generals and politicians in a surreal ceremony. Standing in an odd wooden sandbox and holding gold-painted shovels, they made awkward jabs at the sand and thus officially broke ground on what the local media had simply dubbed “the spy center.” Hoping for some details on what was about to be built, reporters turned to one of the invited guests, Lane Beattie of the Salt Lake Chamber of Commerce. Did he have any idea of the purpose behind the new facility in his backyard? “Absolutely not,” he said with a self-conscious half laugh. “Nor do I want them spying on me.”

For his part, Inglis simply engaged in a bit of double-talk, emphasizing the least threatening aspect of the center: “It’s a state-of-the-art facility designed to support the intelligence community in its mission to, in turn, enable and protect the nation’s cybersecurity.” While cybersecurity will certainly be among the areas focused on in Bluffdale, what is collected, how it’s collected, and what is done with the material are far more important issues. Battling hackers makes for a nice cover—it’s easy to explain, and who could be against it? Then the reporters turned to Hatch, who proudly described the center as “a great tribute to Utah,” then added, “I can’t tell you a lot about what they’re going to be doing, because it’s highly classified.”

And then there was this anomaly: Although this was supposedly the official ground-breaking for the nation’s largest and most expensive cybersecurity project, no one from the Department of Homeland Security, the agency responsible for protecting civilian networks from cyberattack, spoke from the lectern. In fact, the official who’d originally introduced the data center, at a press conference in Salt Lake City in October 2009, had nothing to do with cybersecurity. It was Glenn A. Gaffney, deputy director of national intelligence for collection, a man who had spent almost his entire career at the CIA. As head of collection for the intelligence community, he managed the country’s human and electronic spies.

Within days, the tent and sandbox and gold shovels would be gone and Inglis and the generals would be replaced by some 10,000 construction workers. “We’ve been asked not to talk about the project,” Rob Moore, president of Big-D Construction, one of the three major contractors working on the project, told a local reporter. The plans for the center show an extensive security system: an elaborate $10 million antiterrorism protection program, including a fence designed to stop a 15,000-pound vehicle traveling 50 miles per hour, closed-circuit cameras, a biometric identification system, a vehicle inspection facility, and a visitor-control center.

Inside, the facility will consist of four 25,000-square-foot halls filled with servers, complete with raised floor space for cables and storage. In addition, there will be more than 900,000 square feet for technical support and administration. The entire site will be self-sustaining, with fuel tanks large enough to power the backup generators for three days in an emergency, water storage with the capability of pumping 1.7 million gallons of liquid per day, as well as a sewage system and massive air-conditioning system to keep all those servers cool. Electricity will come from the center’s own substation built by Rocky Mountain Power to satisfy the 65-megawatt power demand. Such a mammoth amount of energy comes with a mammoth price tag—about $40 million a year, according to one estimate.

Given the facility’s scale and the fact that a terabyte of data can now be stored on a flash drive the size of a man’s pinky, the potential amount of information that could be housed in Bluffdale is truly staggering. But so is the exponential growth in the amount of intelligence data being produced every day by the eavesdropping sensors of the NSA and other intelligence agencies. As a result of this “expanding array of theater airborne and other sensor networks,” as a 2007 Department of Defense report puts it, the Pentagon is attempting to expand its worldwide communications network, known as the Global Information Grid, to handle yottabytes (10²⁴ bytes) of data. (A yottabyte is a septillion bytes—so large that no one has yet coined a term for the next higher magnitude.)

It needs that capacity because, according to a recent report by Cisco, global Internet traffic will quadruple from 2010 to 2015, reaching 966 exabytes per year. (A million exabytes equal a yottabyte.) In terms of scale, Eric Schmidt, Google’s former CEO, once estimated that the total of all human knowledge created from the dawn of man to 2003 totaled 5 exabytes. And the data flow shows no sign of slowing. In 2011 more than 2 billion of the world’s 6.9 billion people were connected to the Internet. By 2015, market research firm IDC estimates, there will be 2.7 billion users. Thus, the NSA’s need for a 1-million-square-foot data storehouse. Should the agency ever fill the Utah center with a yottabyte of information, it would be equal to about 500 quintillion (500,000,000,000,000,000,000) pages of text.
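As a rough sanity check, a few lines of Python reproduce the scale arithmetic above; the bytes-per-page figure (about 2,000 bytes of plain text per page) is an illustrative assumption, not a number from the article:

```python
# Sanity-check the storage-scale figures quoted above.
EXABYTE = 10**18
YOTTABYTE = 10**24

# "A million exabytes equal a yottabyte."
assert 1_000_000 * EXABYTE == YOTTABYTE

# Pages of text in one yottabyte, assuming ~2,000 bytes per page
# (an assumed figure, not one given in the article).
BYTES_PER_PAGE = 2_000
pages = YOTTABYTE // BYTES_PER_PAGE
print(f"{pages:.3e} pages")  # 5.000e+20, i.e. 500 quintillion
```

Under that page-size assumption, the arithmetic lands on the same 500 quintillion (5 × 10²⁰) pages the article cites.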

The data stored in Bluffdale will naturally go far beyond the world’s billions of public web pages. The NSA is more interested in the so-called invisible web, also known as the deep web or deepnet—data beyond the reach of the public. This includes password-protected data, US and foreign government communications, and noncommercial file-sharing between trusted peers. “The deep web contains government reports, databases, and other sources of information of high value to DOD and the intelligence community,” according to a 2010 Defense Science Board report. “Alternative tools are needed to find and index data in the deep web … Stealing the classified secrets of a potential adversary is where the [intelligence] community is most comfortable.” With its new Utah Data Center, the NSA will at last have the technical capability to store, and rummage through, all those stolen secrets. The question, of course, is how the agency defines who is, and who is not, “a potential adversary.”


Once it’s operational, the Utah Data Center will become, in effect, the NSA’s cloud. The center will be fed data collected by the agency’s eavesdropping satellites, overseas listening posts, and secret monitoring rooms in telecom facilities throughout the US. All that data will then be accessible to the NSA’s code breakers, data-miners, China analysts, counterterrorism specialists, and others working at its Fort Meade headquarters and around the world. Here’s how the data center appears to fit into the NSA’s global puzzle.—J.B.

1 Geostationary satellites

Four satellites positioned around the globe monitor frequencies carrying everything from walkie-talkies and cell phones in Libya to radar systems in North Korea. Onboard software acts as the first filter in the collection process, targeting only key regions, countries, cities, and phone numbers or email.

2 Aerospace Data Facility, Buckley Air Force Base, Colorado

Intelligence collected from the geostationary satellites, as well as signals from other spacecraft and overseas listening posts, is relayed to this facility outside Denver. About 850 NSA employees track the satellites, transmit target information, and download the intelligence haul.

3 NSA Georgia, Fort Gordon, Augusta, Georgia

Focuses on intercepts from Europe, the Middle East, and North Africa. Codenamed Sweet Tea, the facility has been massively expanded and now consists of a 604,000-square-foot operations building for up to 4,000 intercept operators, analysts, and other specialists.

4 NSA Texas, Lackland Air Force Base, San Antonio

Focuses on intercepts from Latin America and, since 9/11, the Middle East and Europe. Some 2,000 workers staff the operation. The NSA recently completed a $100 million renovation on a mega-data center here—a backup storage facility for the Utah Data Center.

5 NSA Hawaii, Oahu

Focuses on intercepts from Asia. Built to house an aircraft assembly plant during World War II, the 250,000-square-foot bunker is nicknamed the Hole. Like the other NSA operations centers, it has since been expanded: Its 2,700 employees now do their work aboveground from a new 234,000-square-foot facility.

6 Domestic listening posts

The NSA has long been free to eavesdrop on international satellite communications. But after 9/11, it installed taps in US telecom “switches,” gaining access to domestic traffic. An ex-NSA official says there are 10 to 20 such installations.

7 Overseas listening posts

According to a knowledgeable intelligence source, the NSA has installed taps on at least a dozen of the major overseas communications links, each capable of eavesdropping on information passing by at a high data rate.

8 Utah Data Center, Bluffdale, Utah

At a million square feet, this $2 billion digital storage facility outside Salt Lake City will be the centerpiece of the NSA’s cloud-based data strategy and essential in its plans for decrypting previously uncrackable documents.

9 Multiprogram Research Facility, Oak Ridge, Tennessee

Some 300 scientists and computer engineers with top security clearance toil away here, building the world’s fastest supercomputers and working on cryptanalytic applications and other secret projects.

10 NSA headquarters, Fort Meade, Maryland

Analysts here will access material stored at Bluffdale to prepare reports and recommendations that are sent to policymakers. To handle the increased data load, the NSA is also building an $896 million supercomputer center here.

Before yottabytes of data from the deep web and elsewhere can begin piling up inside the servers of the NSA’s new center, they must be collected. To better accomplish that, the agency has undergone the largest building boom in its history, including installing secret electronic monitoring rooms in major US telecom facilities. Controlled by the NSA, these highly secured spaces are where the agency taps into the US communications networks, a practice that came to light during the Bush years but was never acknowledged by the agency. The broad outlines of the so-called warrantless-wiretapping program have long been exposed—how the NSA secretly and illegally bypassed the Foreign Intelligence Surveillance Court, which was supposed to oversee and authorize highly targeted domestic eavesdropping; how the program allowed wholesale monitoring of millions of American phone calls and email. In the wake of the program’s exposure, Congress passed the FISA Amendments Act of 2008, which largely made the practices legal. Telecoms that had agreed to participate in the illegal activity were granted immunity from prosecution and lawsuits. What wasn’t revealed until now, however, was the enormity of this ongoing domestic spying program.

For the first time, a former NSA official has gone on the record to describe the program, codenamed Stellar Wind, in detail. William Binney was a senior NSA crypto-mathematician largely responsible for automating the agency’s worldwide eavesdropping network. A tall man with strands of black hair across the front of his scalp and dark, determined eyes behind thick-rimmed glasses, the 68-year-old spent nearly four decades breaking codes and finding new ways to channel billions of private phone calls and email messages from around the world into the NSA’s bulging databases. As chief and one of the two cofounders of the agency’s Signals Intelligence Automation Research Center, Binney and his team designed much of the infrastructure that’s still likely used to intercept international and foreign communications.

He explains that the agency could have installed its tapping gear at the nation’s cable landing stations—the more than two dozen sites on the periphery of the US where fiber-optic cables come ashore. If it had taken that route, the NSA would have been able to limit its eavesdropping to just international communications, which at the time was all that was allowed under US law. Instead it chose to put the wiretapping rooms at key junction points throughout the country—large, windowless buildings known as switches—thus gaining access to not just international communications but also to most of the domestic traffic flowing through the US. The network of intercept stations goes far beyond the single room in an AT&T building in San Francisco exposed by a whistle-blower in 2006. “I think there’s 10 to 20 of them,” Binney says. “That’s not just San Francisco; they have them in the middle of the country and also on the East Coast.”

The eavesdropping on Americans doesn’t stop at the telecom switches. To capture satellite communications in and out of the US, the agency also monitors AT&T’s powerful earth stations, satellite receivers in locations that include Roaring Creek and Salt Creek. Tucked away on a back road in rural Catawissa, Pennsylvania, Roaring Creek’s three 105-foot dishes handle much of the country’s communications to and from Europe and the Middle East. And on an isolated stretch of land in remote Arbuckle, California, three similar dishes at the company’s Salt Creek station service the Pacific Rim and Asia.

The former NSA official held his thumb and forefinger close together: “We are that far from a turnkey totalitarian state.”

Binney left the NSA in late 2001, shortly after the agency launched its warrantless-wiretapping program. “They violated the Constitution setting it up,” he says bluntly. “But they didn’t care. They were going to do it anyway, and they were going to crucify anyone who stood in the way. When they started violating the Constitution, I couldn’t stay.” Binney says Stellar Wind was far larger than has been publicly disclosed and included not just eavesdropping on domestic phone calls but the inspection of domestic email. At the outset the program recorded 320 million calls a day, he says, which represented about 73 to 80 percent of the total volume of the agency’s worldwide intercepts. The haul only grew from there. According to Binney—who maintained close contact with agency employees until a few years ago—the taps in the secret rooms dotting the country are actually powered by highly sophisticated software programs that conduct “deep packet inspection,” examining Internet traffic as it passes through the 10-gigabit-per-second cables at the speed of light.

The software, created by a company called Narus that’s now part of Boeing, is controlled remotely from NSA headquarters at Fort Meade in Maryland and searches US sources for target addresses, locations, countries, and phone numbers, as well as watch-listed names, keywords, and phrases in email. Any communication that arouses suspicion, especially one to or from the million or so people on agency watch lists, is automatically copied or recorded and then transmitted to the NSA.
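The filtering pattern described here—matching traffic against watch-listed numbers and keywords—can be sketched in miniature. Everything below (the watch-list entries, the message format, the `flag` function) is a hypothetical illustration of the general technique, not the actual Narus software:

```python
# Toy illustration (not Narus code): flag traffic whose payload or
# metadata matches a watch list, the basic pattern described above.
WATCH_LIST = {
    "keywords": {"example-keyword"},      # hypothetical watch-listed phrase
    "numbers": {"+1-555-0100"},           # hypothetical watch-listed number
}

def flag(message):
    """Return True if a message matches any watch-list entry."""
    text = message["body"].lower()
    return (
        any(kw in text for kw in WATCH_LIST["keywords"])
        or message["from"] in WATCH_LIST["numbers"]
        or message["to"] in WATCH_LIST["numbers"]
    )

msg = {"from": "+1-555-0100", "to": "+1-555-0199", "body": "hello"}
print(flag(msg))  # True: the sender is on the hypothetical watch list
```

Real deep-packet-inspection gear does this at line rate in hardware, against millions of selectors at once, but the logical operation is the same lookup.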

The scope of surveillance expands from there, Binney says. Once a name is entered into the Narus database, all phone calls and other communications to and from that person are automatically routed to the NSA’s recorders. “Anybody you want, route to a recorder,” Binney says. “If your number’s in there? Routed and gets recorded.” He adds, “The Narus device allows you to take it all.” And when Bluffdale is completed, whatever is collected will be routed there for storage and analysis.

According to Binney, one of the deepest secrets of the Stellar Wind program—again, never confirmed until now—was that the NSA gained warrantless access to AT&T’s vast trove of domestic and international billing records, detailed information about who called whom in the US and around the world. As of 2007, AT&T had more than 2.8 trillion records housed in a database at its Florham Park, New Jersey, complex.

Verizon was also part of the program, Binney says, and that greatly expanded the volume of calls subject to the agency’s domestic eavesdropping. “That multiplies the call rate by at least a factor of five,” he says. “So you’re over a billion and a half calls a day.” (Spokespeople for Verizon and AT&T said their companies would not comment on matters of national security.)

After he left the NSA, Binney suggested a system for monitoring people’s communications according to how closely they are connected to an initial target. The further away from the target—say you’re just an acquaintance of a friend of the target—the less the surveillance. But the agency rejected the idea, and, given the massive new storage facility in Utah, Binney suspects that it now simply collects everything. “The whole idea was, how do you manage 20 terabytes of intercept a minute?” he says. “The way we proposed was to distinguish between things you want and things you don’t want.” Instead, he adds, “they’re storing everything they gather.” And the agency is gathering as much as it can.
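Binney’s rejected proposal amounts to filtering by graph distance: retain only communications within a few hops of an initial target, and discard the rest. A minimal sketch of that idea, using a hypothetical call graph and a two-hop cutoff chosen purely for illustration:

```python
from collections import deque

def hops_from_target(graph, target):
    """Breadth-first search: hop distance of every contact from the target."""
    dist = {target: 0}
    queue = deque([target])
    while queue:
        node = queue.popleft()
        for neighbor in graph.get(node, []):
            if neighbor not in dist:
                dist[neighbor] = dist[node] + 1
                queue.append(neighbor)
    return dist

# Hypothetical call graph: who communicates with whom.
calls = {
    "target": ["friend"],
    "friend": ["target", "acquaintance"],
    "acquaintance": ["friend", "stranger"],
    "stranger": ["acquaintance"],
}

MAX_HOPS = 2  # keep only contacts within two hops of the target
retained = {who for who, d in hops_from_target(calls, "target").items()
            if d <= MAX_HOPS}
print(sorted(retained))  # ['acquaintance', 'friend', 'target']
```

The “stranger” three hops out is dropped—exactly the kind of triage Binney says the agency declined in favor of storing everything.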

Once the communications are intercepted and stored, the data-mining begins. “You can watch everybody all the time with data-mining,” Binney says. Everything a person does becomes charted on a graph, “financial transactions or travel or anything,” he says. Thus, as data like bookstore receipts, bank statements, and commuter toll records flow in, the NSA is able to paint a more and more detailed picture of someone’s life.

The NSA also has the ability to eavesdrop on phone calls directly and in real time. According to Adrienne J. Kinne, who worked both before and after 9/11 as a voice interceptor at the NSA facility in Georgia, in the wake of the World Trade Center attacks “basically all rules were thrown out the window, and they would use any excuse to justify a waiver to spy on Americans.” Even journalists calling home from overseas were included. “A lot of time you could tell they were calling their families,” she says, “incredibly intimate, personal conversations.” Kinne found the act of eavesdropping on innocent fellow citizens personally distressing. “It’s almost like going through and finding somebody’s diary,” she says.

In secret listening rooms nationwide, NSA software examines every email, phone call, and tweet as they zip by.

But there is, of course, reason for anyone to be distressed about the practice. Once the door is open for the government to spy on US citizens, there are often great temptations to abuse that power for political purposes, as when Richard Nixon eavesdropped on his political enemies during Watergate and ordered the NSA to spy on antiwar protesters. Those and other abuses prompted Congress to enact prohibitions in the mid-1970s against domestic spying.

Before he gave up and left the NSA, Binney tried to persuade officials to create a more targeted system that could be authorized by a court. At the time, the agency had 72 hours to obtain a legal warrant, and Binney devised a method to computerize the system. “I had proposed that we automate the process of requesting a warrant and automate approval so we could manage a couple of million intercepts a day, rather than subvert the whole process.” But such a system would have required close coordination with the courts, and NSA officials weren’t interested in that, Binney says. Instead they continued to haul in data on a grand scale. Asked how many communications—”transactions,” in NSA’s lingo—the agency has intercepted since 9/11, Binney estimates the number at “between 15 and 20 trillion, the aggregate over 11 years.”

When Barack Obama took office, Binney hoped the new administration might be open to reforming the program to address his constitutional concerns. He and another former senior NSA analyst, J. Kirk Wiebe, tried to bring the idea of an automated warrant-approval system to the attention of the Department of Justice’s inspector general. They were given the brush-off. “They said, oh, OK, we can’t comment,” Binney says.

Sitting in a restaurant not far from NSA headquarters, the place where he spent nearly 40 years of his life, Binney held his thumb and forefinger close together. “We are, like, that far from a turnkey totalitarian state,” he says.

There is still one technology preventing untrammeled government access to private digital data: strong encryption. Anyone—from terrorists and weapons dealers to corporations, financial institutions, and ordinary email senders—can use it to seal their messages, plans, photos, and documents in hardened data shells. For years, one of the hardest shells has been the Advanced Encryption Standard, one of several algorithms used by much of the world to encrypt data. Available in three different strengths—128 bits, 192 bits, and 256 bits—it’s incorporated in most commercial email programs and web browsers and is considered so strong that the NSA has even approved its use for top-secret US government communications. Most experts say that a so-called brute-force computer attack on the algorithm—trying one combination after another to unlock the encryption—would likely take longer than the age of the universe. For a 128-bit cipher, the number of trial-and-error attempts would be 340 undecillion (3.4 × 10³⁸).
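The brute-force arithmetic is easy to reproduce; the snippet below simply counts the key spaces for the three AES strengths. (An undecillion is 10³⁶, so 2¹²⁸ works out to about 340 undecillion keys.)

```python
# Size of the AES key spaces quoted above: a k-bit key admits 2**k values.
for bits in (128, 192, 256):
    keys = 2**bits
    print(f"AES-{bits}: {keys:.2e} possible keys")

# 2**128 is roughly 3.4e38 -- "340 undecillion" (an undecillion is 10**36).
assert 2**128 // 10**36 == 340
```

Each step up in key length multiplies the work by 2⁶⁴, which is why the stronger variants are, for brute force, effectively out of reach even if 128-bit keys ever fall.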

Breaking into those complex mathematical shells like the AES is one of the key reasons for the construction going on in Bluffdale. That kind of cryptanalysis requires two major ingredients: super-fast computers to conduct brute-force attacks on encrypted messages and a massive number of those messages for the computers to analyze. The more messages from a given target, the more likely it is for the computers to detect telltale patterns, and Bluffdale will be able to hold a great many messages. “We questioned it one time,” says another source, a senior intelligence manager who was also involved with the planning. “Why were we building this NSA facility? And, boy, they rolled out all the old guys—the crypto guys.” According to the official, these experts told then-director of national intelligence Dennis Blair, “You’ve got to build this thing because we just don’t have the capability of doing the code-breaking.” It was a candid admission. In the long war between the code breakers and the code makers—the tens of thousands of cryptographers in the worldwide computer security industry—the code breakers were admitting defeat.

So the agency had one major ingredient—a massive data storage facility—under way. Meanwhile, across the country in Tennessee, the government was working in utmost secrecy on the other vital element: the most powerful computer the world has ever known.

The plan was launched in 2004 as a modern-day Manhattan Project. Dubbed the High Productivity Computing Systems program, its goal was to advance computer speed a thousandfold, creating a machine that could execute a quadrillion (10¹⁵) operations a second, known as a petaflop—the computer equivalent of breaking the land speed record. And as with the Manhattan Project, the venue chosen for the supercomputing program was the town of Oak Ridge in eastern Tennessee, a rural area where sharp ridges give way to low, scattered hills, and the southwestward-flowing Clinch River bends sharply to the southeast. About 25 miles from Knoxville, it is the “secret city” where uranium-235 was extracted for the first atomic bomb. A sign near the exit read: “What you see here, what you do here, what you hear here, when you leave here, let it stay here.” Today, not far from where that sign stood, Oak Ridge is home to the Department of Energy’s Oak Ridge National Laboratory, and it’s engaged in a new secret war. But this time, instead of a bomb of almost unimaginable power, the weapon is a computer of almost unimaginable speed.

In 2004, as part of the supercomputing program, the Department of Energy established its Oak Ridge Leadership Computing Facility for multiple agencies to join forces on the project. But in reality there would be two tracks, one unclassified, in which all of the scientific work would be public, and another top-secret, in which the NSA could pursue its own computer covertly. “For our purposes, they had to create a separate facility,” says a former senior NSA computer expert who worked on the project and is still associated with the agency. (He is one of three sources who described the program.) It was an expensive undertaking, but one the NSA was desperate to launch.

Known as the Multiprogram Research Facility, or Building 5300, the $41 million, five-story, 214,000-square-foot structure was built on a plot of land on the lab’s East Campus and completed in 2006. Behind the brick walls and green-tinted windows, 318 scientists, computer engineers, and other staff work in secret on the cryptanalytic applications of high-speed computing and other classified projects. The supercomputer center was named in honor of George R. Cotter, the NSA’s now-retired chief scientist and head of its information technology program. Not that you’d know it. “There’s no sign on the door,” says the ex-NSA computer expert.

At the DOE’s unclassified center at Oak Ridge, work progressed at a furious pace, although it was a one-way street when it came to cooperation with the closemouthed people in Building 5300. Nevertheless, the unclassified team had its Cray XT4 supercomputer upgraded to a warehouse-sized XT5. Named Jaguar for its speed, it clocked in at 1.75 petaflops, officially becoming the world’s fastest computer in 2009.

Meanwhile, over in Building 5300, the NSA succeeded in building an even faster supercomputer. “They made a big breakthrough,” says another former senior intelligence official, who helped oversee the program. The NSA’s machine was likely similar to the unclassified Jaguar, but it was much faster out of the gate, modified specifically for cryptanalysis and targeted against one or more specific algorithms, like the AES. In other words, they were moving from the research and development phase to actually attacking extremely difficult encryption systems. The code-breaking effort was up and running.

The breakthrough was enormous, says the former official, and soon afterward the agency pulled the shade down tight on the project, even within the intelligence community and Congress. “Only the chairman and vice chairman and the two staff directors of each intelligence committee were told about it,” he says. The reason? “They were thinking that this computing breakthrough was going to give them the ability to crack current public encryption.”

In addition to giving the NSA access to a tremendous amount of Americans’ personal data, such an advance would also open a window on a trove of foreign secrets. While today most sensitive communications use the strongest encryption, much of the older data stored by the NSA, including a great deal of what will be transferred to Bluffdale once the center is complete, is encrypted with more vulnerable ciphers. “Remember,” says the former intelligence official, “a lot of foreign government stuff we’ve never been able to break is 128 or less. Break all that and you’ll find out a lot more of what you didn’t know—stuff we’ve already stored—so there’s an enormous amount of information still in there.”

The NSA believes it’s on the verge of breaking a key encryption algorithm—opening up hoards of data.

That, he notes, is where the value of Bluffdale, and its mountains of long-stored data, will come in. What can’t be broken today may be broken tomorrow. “Then you can see what they were saying in the past,” he says. “By extrapolating the way they did business, it gives us an indication of how they may do things now.” The danger, the former official says, is that it’s not only foreign government information that is locked in weaker algorithms, it’s also a great deal of personal domestic communications, such as Americans’ email intercepted by the NSA in the past decade.

But first the supercomputer must break the encryption, and to do that, speed is everything. The faster the computer, the faster it can break codes. The Data Encryption Standard, the 56-bit predecessor to the AES, debuted in 1976 and lasted about 25 years. The AES made its first appearance in 2001 and is expected to remain strong and durable for at least a decade. But if the NSA has secretly built a computer that is considerably faster than machines in the unclassified arena, then the agency has a chance of breaking the AES in a much shorter time. And with Bluffdale in operation, the NSA will have the luxury of storing an ever-expanding archive of intercepts until that breakthrough comes along.
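A back-of-the-envelope estimate shows why raw speed is the whole game. Even granting the wildly optimistic assumption that testing one key costs a single operation, a petaflop-class machine makes no dent in AES-128:

```python
# Rough brute-force timescale, under the (very generous) assumption
# that testing one key costs a single operation.
PETAFLOP = 10**15            # operations per second
SECONDS_PER_YEAR = 31_557_600

keys = 2**128 // 2           # on average, half the key space must be searched
years = keys / (PETAFLOP * SECONDS_PER_YEAR)
print(f"{years:.1e} years")  # ~5.4e+15 years, far beyond the universe's age
```

Against numbers like these, a classified machine that is merely “considerably faster” still cannot brute-force AES head-on; any realistic break would have to come from cryptanalytic shortcuts that shrink the effective key space, with speed amplifying whatever shortcut is found.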

But despite its progress, the agency has not finished building at Oak Ridge, nor is it satisfied with breaking the petaflop barrier. Its next goal is to reach exaflop speed, one quintillion (10^18) operations a second, and eventually zettaflop (10^21) and yottaflop (10^24).

These goals have considerable support in Congress. Last November a bipartisan group of 24 senators sent a letter to President Obama urging him to approve continued funding through 2013 for the Department of Energy’s exascale computing initiative (the NSA’s budget requests are classified). They cited the necessity to keep up with and surpass China and Japan. “The race is on to develop exascale computing capabilities,” the senators noted. The reason was clear: By late 2011 the Jaguar (now with a peak speed of 2.33 petaflops) ranked third behind Japan’s “K Computer,” with an impressive 10.51 petaflops, and the Chinese Tianhe-1A system, with 2.57 petaflops.

But the real competition will take place in the classified realm. To secretly develop the new exaflop (or higher) machine by 2018, the NSA has proposed constructing two connecting buildings, totaling 260,000 square feet, near its current facility on the East Campus of Oak Ridge. Called the Multiprogram Computational Data Center, the buildings will be low and wide like giant warehouses, a design necessary for the dozens of computer cabinets that will compose an exaflop-scale machine, possibly arranged in a cluster to minimize the distance between circuits. According to a presentation delivered to DOE employees in 2009, it will be an “unassuming facility with limited view from roads,” in keeping with the NSA’s desire for secrecy. And it will have an extraordinary appetite for electricity, eventually using about 200 megawatts, enough to power 200,000 homes. The computer will also produce a gargantuan amount of heat, requiring 60,000 tons of cooling equipment, the same amount that was needed to serve both of the World Trade Center towers.

In the meantime Cray is working on the next step for the NSA, funded in part by a $250 million contract with the Defense Advanced Research Projects Agency. It’s a massively parallel supercomputer called Cascade, a prototype of which is due at the end of 2012. Its development will run largely in parallel with the unclassified effort for the DOE and other partner agencies. That project, due in 2013, will upgrade the Jaguar XT5 into an XK6, codenamed Titan, upping its speed to 10 to 20 petaflops.

Yottabytes and exaflops, septillions and undecillions—the race for computing speed and data storage goes on. In his 1941 story “The Library of Babel,” Jorge Luis Borges imagined a collection of information where the entire world’s knowledge is stored but barely a single word is understood. In Bluffdale the NSA is constructing a library on a scale that even Borges might not have contemplated. And to hear the masters of the agency tell it, it’s only a matter of time until every word is illuminated.

James Bamford is the author of The Shadow Factory: The Ultra-Secret NSA from 9/11 to the Eavesdropping on America.


March 19, 2012 – 10:54 p.m.

Pentagon Pursues Additional Base Closures

By Megan Scully, CQ Staff

Senior defense officials are warning that congressional resistance to the Pentagon’s request for more base closures will not necessarily deter them from shuttering unneeded installations across the country.

The Defense Department has argued that planned reductions in military personnel and force structure will leave the department with excess infrastructure that is too costly to maintain, particularly as defense budgets decline after a decade of historic wartime growth.

As a result, the Pentagon is expected to soon present Congress with draft legislation that would establish formal base-closure rounds in 2013 and 2015.

Defense officials have not said how many bases they would like to trim, or how much money they would aim to save. In the United States alone, the military has about 500 permanent installations, including more than 300,000 buildings and about 200,000 other physical structures, with a replacement value of more than $800 billion.

The Air Force, which was spared significant closures in the last Base Realignment and Closure (BRAC) round in 2005, has estimated that more than 20 percent of its infrastructure could be considered excess capacity.

For their part, however, many in Congress are reluctant to close jobs-producing bases that fuel the economy in districts and states around the country. Lawmakers from both chambers and both parties have said they would fight any efforts to authorize another formal BRAC round, which would include the creation of an independent commission tasked with trimming the military’s domestic infrastructure.

But if Congress refuses the Pentagon’s request for further BRAC rounds, the Defense Department has made clear that it has other options it can exercise to shed unnecessary infrastructure.

“Given the fiscal and strategic imperatives we face, if Congress does not authorize additional BRAC rounds, the department will be forced to use its existing authorities to begin to realign and close bases,” Dorothy Robyn, the deputy undersecretary of defense for installations and environment, told the House Armed Services Readiness Subcommittee on March 8.

Unilateral Action

Since the early 1990s, Congress has authorized the creation of four bipartisan base-closure commissions, which studied the Defense Department’s recommendations and received input from affected communities before issuing a final list of bases to be closed or realigned. Lawmakers then voted on whether or not to accept the commission’s recommendations as a whole, rather than on a base-by-base basis.

But the Defense Department can unilaterally close smaller installations that employ fewer than 300 civilian personnel and make other significant changes at bases without congressional consent. The Air Force, for instance, announced in its fiscal 2013 budget request its plans to close the Air Reserve Station in Pittsburgh, a move that does not require sign-off from lawmakers.

The military has the “authority to do small things under current law. They’ve done small things under law,” Senate Armed Services Chairman Carl Levin, D-Mich., who has urged the Defense Department to trim its overseas infrastructure before closing domestic bases, said March 15. “Whatever authority they have, they’ve used and, I presume, will continue to use.”

The department, meanwhile, also has some limited authority to close larger installations outside of the formal BRAC commission process.

Until the late 1970s, the Defense Department had unlimited authority to close bases. But a rarely used 1977 law (PL 95-82) requires the Pentagon to notify Congress as part of its budget request of any plans to close specific bases. Congress then has 30 legislative days or 60 calendar days (whichever is longer) to block the move. If lawmakers don’t act, the Defense Department can shutter the base.
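The "whichever is longer" rule is simple to model. A minimal sketch, with an entirely hypothetical session calendar; the function and its names are mine, not statutory text:

```python
from datetime import date, timedelta

def review_deadline(notice: date, legislative_days: list[date]) -> date:
    """Later of 30 legislative days or 60 calendar days after notification,
    per the congressional review window described above (illustrative)."""
    calendar_deadline = notice + timedelta(days=60)
    in_session = sorted(d for d in legislative_days if d >= notice)
    legislative_deadline = in_session[29]  # the 30th legislative day
    return max(calendar_deadline, legislative_deadline)

# Hypothetical calendar: Congress sits every third day after notification.
notice = date(2012, 2, 1)
session = [notice + timedelta(days=3 * i) for i in range(60)]
print(review_deadline(notice, session))
```

With a sparse session calendar the 30th legislative day falls past the 60-day mark and governs the deadline; if Congress sat daily, the 60-calendar-day window would govern instead.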

Advantages of an Orderly Process

Some base-closure experts consider the law to be little more than a speed bump for the department — albeit a time-consuming one that requires the Pentagon to present lawmakers with extensive documentation on the environmental impact of base closures, as well as other information on economic, strategic and operational consequences of each closure or realignment.

Others, however, argue that the formal BRAC commission process is in the interests of both Congress and the Pentagon.

These experts argue that the process forces defense officials to conduct a thorough review of the military’s basing needs that is then evaluated and tweaked by an apolitical commission.

“In theory, the BRAC process as it has been constituted creates a process where all installations are equal,” said Brian Lepore, a base-closure analyst at the Government Accountability Office. The commission, Lepore added, helps provide assurances that the process is as clear and fair as it can be.

David Berteau, an analyst at the Center for Strategic & International Studies who served on BRAC commissions during the 1990s, said the formal BRAC process also forces the Defense Department to consider its long-term force structure plans before making any decisions on individual bases. The commission process also allows lawmakers to plead their case to the commission and make their public arguments against closing facilities in their areas, Berteau added.

The Defense Department also argues that a formal base-closure process, rather than ad hoc closures, benefits local communities. During the last three BRAC rounds, defense officials worked with communities to help develop plans for the transition after bases closed. But if the Defense Department works outside the BRAC process, local communities would have no role in the disposition of military property.

“If the department were forced to begin the closure and realign process using existing authorities, communities would have to fend for themselves to a much greater degree,” Robyn told House lawmakers.



NSA Chief: Cyber Becoming More Perilous

At Hearing, DoD CIO Takai Also Outlines 2-Prong Cloud Defense


Eric Chabrow
March 21, 2012

Army Gen. Keith Alexander, the head of the U.S. military’s Cyber Command and National Security Agency, painted a bleak picture of mounting challenges to the Defense Department’s and the nation’s IT systems.

“In framing my comments on our progress at Cyber Command, I have to begin by noting a worrisome fact: cyberspace is becoming more dangerous,” Alexander said in testimony delivered to the House Armed Services Subcommittee on Emerging Threats and Capabilities.

At that same hearing on President Obama’s $37 billion Defense Department IT budget request, which includes $3.4 billion for IT security, DoD Chief Information Officer Teresa Takai said the department will employ a two-prong approach – securing the perimeter as well as the data – as information and services are moved to standardized cloud computing platforms. “We’re going to be able to better protect as we get more standardized,” Takai said.

Data Center Consolidation

DoD’s cloud initiative is part of the department’s consolidation of data centers, from more than 770 to about 655 in less than two years. “Core data centers will be used for information services and applications that must be available broadly across DoD, and for the department’s outward-facing applications and services required for interaction with industry and the public,” the CIO said. “These will, in fact, become the initial DoD cloud computing instantiation.”

As DoD fortifies its cloud offerings, Takai recognizes breaches will occur. “We need to be able to protect at the information level,” she said. “That is why we’re focusing very much on identity management so we know who is in the cloud. And, we’re also linking that to what information that particular individual has access. It’s really both of those that gives us assurance so that as we move to that kind of an architecture, we will be able to better protect our information.”
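Takai's two prongs, knowing who is in the cloud and what that identity may access, amount to linking authentication with per-resource entitlements. A minimal sketch of that idea; all names, tokens and resources here are hypothetical, not DoD systems:

```python
# Identity management plus entitlement check: access is granted only when
# the requester is a known identity AND that identity is entitled to the
# specific resource. Hypothetical data for illustration.

IDENTITIES = {"token-123": "analyst.jane"}    # authenticated sessions
ACCESS = {"analyst.jane": {"logistics-db"}}   # per-user entitlements

def may_access(token: str, resource: str) -> bool:
    """Require both a known identity and an entitlement to the resource."""
    user = IDENTITIES.get(token)
    return user is not None and resource in ACCESS.get(user, set())

print(may_access("token-123", "logistics-db"))   # True: known and entitled
print(may_access("token-123", "weapons-db"))     # False: known, not entitled
print(may_access("bad-token", "logistics-db"))   # False: unknown identity
```

The point of pairing the checks, as Takai describes, is that a breach of the perimeter alone yields neither an identity nor an entitlement.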

Alexander concurred: “The IT infrastructure of the future – the STIn (Security Technical Implementation) virtual cloud environment – will make it a much more defensible architecture. I think that’s the key to the future.”

Anxiety over Cyber Destruction

Addressing the cyberthreats the nation faces, Alexander characterized them as three-fold:

  • Exploitation, such as the theft of intellectual property;
  • Disruption, such as the distributed denial of service attacks that disabled government IT in Estonia and neighboring nations;
  • Destruction. “What we’re concerned about is shifting from exploitation to disruptive attacks to destructive attacks,” Alexander said. “Those attacks that could destroy equipment are on the horizon and we have to be prepared for them.”

It’s not that advancements haven’t been made in cyber protection in the year since the military stood up the Cyber Command. Alexander said organizations are better at identifying botnets, though he quickly added that this didn’t mean the computing environment is getting safer. “Now,” the four-star general said, “the more sophisticated cyber criminals are shifting toward stealthier, targeted thefts of sensitive data they can sell … targeting (organizations) with similar malware, often spread by clever phishing e-mails that hit an information security system at its weakest point – the user.”

Subcommittee Chairman Mac Thornberry, the Texas Republican who leads House cybersecurity efforts, lamented the deteriorating security in cyberspace. “Despite the successes of Cyber Command over the past year, which I do not discount, it still seems to me that the dangers to our nation in cyberspace are growing faster than our ability to protect the country,” he said.

The panel’s ranking Democrat, Jim Langevin of Rhode Island, said that despite increased awareness of cyber vulnerabilities, many in the public and Congress don’t fully recognize the potential for damage posed by a breached or disrupted network.

Social Media Can Cause Damage

Alexander said it’s not just bold attacks on critical IT infrastructure that worries him; social media and mobile devices present additional security challenges. “Real and potential adversaries can and do learn a great deal about our personnel, procedures and deployments by monitoring the use that our people make of popular social media,” he said. “As our military goes wireless, these threats to our weapons systems, communications, databases and personnel demand attention.”

And though hacktivists’ threats aren’t as perilous as those of criminal gangs and nation states – more “chaotic and perhaps exaggerated in the media,” as Alexander puts it – the actions of groups such as Anonymous and LulzSec prove troublesome. “The work of preventing those effects from disrupting DoD information systems does draw attention and resources,” he said.

IT Security Skills Shortage

Alexander suggested that the nation’s prosperity and security are threatened by a critical shortage of IT security skills and personnel. At the DoD, Alexander said efforts to build a cybersecurity workforce will require adoption of a single standard. He said the DoD is reviewing recruitment and incentive programs to build and retain the “best of the best cyber defenders,” and is working to standardize, track and manage the training needed for all cyber personnel.

“In order to achieve our goals in this area by 2014,” he said, “we must build a skilled force capable of full-spectrum cyber operations across a continuum of threats. We also need to build our workforce at Cyber Command and the service cyber components so that, in extremis, we have the capability to defend the nation in cyberspace.”



Senators: DoD can cut redundant programs

Army Times

By Rick Maze – Staff writer
Posted : Tuesday Mar 20, 2012 16:31:30 EDT

Two key Republican senators are recommending the Defense Department reduce duplicative and overlapping programs to stave off deep cuts in force structure and weapons programs.

Sens. John McCain of Arizona, ranking Republican on the Senate Armed Services Committee, and Tom Coburn of Oklahoma, one of the Senate’s fiercest deficit hawks, have provided some specific examples, drawn from a recent Government Accountability Office report on mismanagement that identified 51 areas of concern, including 16 in DoD.

In a March 19 letter to Defense Secretary Leon Panetta, the senators made the recommendations “to avoid imposing the large cuts to force structure and key investment priorities that the president proposed in the fiscal year 2013 budget request.”

This is McCain’s latest advice to the Pentagon about spending priorities. Last week, he announced he would stop approving the transfer of funds between defense accounts to pay for the launch of any new program not specifically authorized by Congress. On Monday, he and Sen. Carl Levin, D-Mich., the Senate Armed Services Committee chairman, asked Panetta to postpone any actions to change military force structure until the committee has a chance to thoroughly review the fiscal 2013 budget proposed by the Obama administration.

Among the areas for savings identified by GAO in a Feb. 28 report to Congress:

• The Veterans Affairs Department and Pentagon run similar and overlapping programs to help severely wounded combat veterans that are inefficient and difficult for the wounded and their families to follow. In one instance, GAO found five case managers working on the same life insurance issue for one person. It also found a service member with multiple amputations receiving conflicting recovery plans from military and VA care coordinators.

• All four services have their own electronic warfare programs that are developing multiple systems with the same or similar capabilities, and the services and some components are working on their own unmanned aircraft systems with overlapping capabilities but different priorities.

• Each service has its own language and cultural training program, and some services have more than one program, without Defense Department guidance to integrate the programs and training products.

• Medical research is filled with duplicative projects that could be consolidated if DoD, VA and the National Institutes of Health did a better job of sharing information.

• The services each have their own large medical command structures that could be streamlined to save as much as $460 million a year.

McCain and Coburn are asking for detailed replies from DoD about why it has not made some of the cost-saving changes recommended by GAO.


Senator Vows to Block Pentagon Base Closure Attempts

Defense News

Mar. 21, 2012 – 12:59 PM


A Pentagon base closing initiative died March 21 when a key U.S. senator announced she will block any effort to create a new base closing commission.

Sen. Claire McCaskill of Missouri, chairwoman of the Senate Armed Services Committee panel with jurisdiction over military installations, said she is willing to allow the closing of U.S. military bases overseas, but not domestic bases.

There is strong opposition to new base closures on the House Armed Services Committee, which also appears unlikely to approve the legislation necessary to create a base closing list.

Unless the Pentagon can change McCaskill’s mind, her announcement seals the fate of the proposal. It takes only a single senator to hold up legislation.

The death of the initiative comes before the Defense Department has even submitted to Congress the legislative proposal needed to make new base closings possible.

McCaskill, who heads the subcommittee on military readiness and management support, said she applauds the Defense Department’s “desire to find responsible places to achieve savings” but “there is one area where there is absolutely no room for compromise this year: BRAC.”

BRAC refers to the independent base closing and realignment commission process the Defense Department has depended on in five previous rounds to get Congress to support shutting down bases. The commission makes recommendations that have to be accepted or rejected as a package, with no ability to make modifications.

Defense Department officials have proposed two additional base-closing rounds, one in 2013 and a second in 2015, for closing bases and adjusting forces — moves they have said are vital to cutting infrastructure costs to deal with tighter defense budgets.

McCaskill’s objections are similar to those raised by the readiness subcommittee of the House Armed Services Committee, which also wants to see a complete force structure plan for overseas and domestic bases before going forward, as well as estimates on how much money could be saved by closing more bases. There are upfront costs to base closing, which gives lawmakers a reason to delay the cuts without looking like they are just trying to protect bases back home.

However, McCaskill made clear she is concerned about bases in Missouri.

“The impact BRAC has on our communities around the country, such as those surrounding my home state bases Fort Leonard Wood and Whiteman Air Force Base, is extraordinary,” said McCaskill, who is up for re-election in November. “I will not support a process that is callous or casual, or one that is rushed before we fully comprehend whether the traumatic task is clearly in the best interests of the American taxpayer and our national security. The department has a very long way to go before it proves to me that these initial criteria have been met.”



Privacy concerns arise over drones flying over U.S. skies

News Date: 24th March 2012

Source: GNA




Unmanned Aerial Vehicles (UAVs), or “drones,” have been widely deployed by the United States in wars abroad and in fighting terrorists, but when they fly in U.S. skies, the main concern is privacy.

The Conservative Action Alerts organization in the U.S. this week sent out mass e-mails warning Americans of the dangers such drones pose in U.S. skies.

“The watchful eye of the U.S. government is about to be upon you, day and night. In 90 days … beginning in June … United States law enforcement will begin flying remote-controlled Unmanned Aerial Vehicles (UAV’s), or ‘drones,’ throughout America at altitudes under 400 feet,” says the warning.

“So, while you are putting burgers on the grill … while your kids are playing outside … while you walk your dog, or go window-shopping, or sit outside at a cafe, having lunch … the eyes of the U.S. government will be upon you,” the warning continues.

The conservative organization warned that with drones set to take over the skies across America, “the next victim of our new police state could very well be you.”

For their part, law enforcement and intelligence agencies defended their positions. The advantages of unmanned aircraft are obvious. Governments or aid groups could use them to coordinate disaster relief. Police could track fleeing suspects or search for missing people. Urban planners and first responders could monitor traffic jams, fires or floods in real time.

But the potential for violations of civil liberties is just as obvious if drone surveillance is widely adopted.

On how to safeguard privacy in this new era, Bloomberg View suggested starting by requiring police to obtain warrants for any drone use that would violate reasonable expectations of privacy, except in clearly defined emergencies or to stop a crime in progress.

It also suggested that government agencies should notify the public of any continuous monitoring they plan and post a list of such programs online.

Also, personally identifiable data collected by drones should, unless it is part of a criminal investigation, be subject to fair information practice principles.

Another issue is that private owners of drones should be required to obtain FAA approval for their aircraft and a license to operate it. In doing so, they should be alerted to potential violations of private-property rights and state privacy laws.

Since Predator drones debuted over the battlefields of the Balkans in the 1990s, the United States has lost more than 70 of the aircraft, but not a single pilot. No desperate rescue attempts are necessary. The U.S. military holds that drones have changed the essential equation of modern warfare by removing any chance of physical harm for the pilots.

Inspired by the success of drones abroad, law enforcement, government intelligence agencies and even private companies want drones flying in U.S. skies to carry out specific tasks.

The U.S. Federal Aviation Administration has been tasked to open more U.S. airspace to unmanned drones. Legislation passed by the U.S. Congress in February this year requires the FAA to open up airspace to drones by Sept. 30, 2015.

But the first step comes much earlier, when police, firefighters and civilian first responder agencies can begin flying drones weighing no more than 4.4 pounds, NBC News reported.


According to provisions of the FAA bill signed by U.S. President Barack Obama in February, by May 2013 drones weighing less than 55 pounds can take to the air, although there are also technical issues that need to be dealt with, such as establishing the bandwidth for secure UAV radio communications and setting up collision avoidance systems.
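The phased schedule in the paragraphs above can be read as a small rule set. The weight limits and dates below come from the article; the function itself and its parameter names are purely an illustrative sketch, not FAA rules text:

```python
from datetime import date

def drone_flight_permitted(weight_lb: float, when: date,
                           first_responder: bool) -> bool:
    """Rough sketch of the phased FAA schedule described above."""
    if when >= date(2015, 9, 30):
        return True   # airspace opened broadly by the congressional deadline
    if when >= date(2013, 5, 1) and weight_lb < 55:
        return True   # sub-55-pound drones permitted by May 2013
    if when >= date(2012, 6, 1) and first_responder and weight_lb <= 4.4:
        return True   # first responders, aircraft up to 4.4 pounds, from June
    return False

print(drone_flight_permitted(4.0, date(2012, 7, 1), True))    # True
print(drone_flight_permitted(30.0, date(2012, 7, 1), False))  # False
```

Encoding the rules this way makes the staging visible: each deadline relaxes one of the constraints (operator type, weight, then airspace generally).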



UAV capabilities to be showcased at city conference

Sinclair symposium in April could spur development in region.

Dayton Daily News

By John Nolan, Staff Writer

Updated 11:08 PM Friday, March 23, 2012

DAYTON — Unmanned aircraft systems have the potential to be an $80 billion industry worldwide in the next decade, and the Dayton region has a chance to promote how it can support the industry with an upcoming conference.

The Ohio UAS Conference, organized by the Dayton Development Coalition for April 17-18 at Sinclair Community College, will bring together military, academic and industry officials as the Federal Aviation Administration begins a process to designate six test sites nationwide for flying the remotely piloted aircraft. Having designated airspace in the region for test-flying unmanned aircraft is critical, advocates said.

“It’s really where your (research and development) is going to happen. It’s your economic boom. That’s why there’s so much interest,” said Deb Norris, vice president for workforce development and corporate services at Sinclair, which is building its training curriculum for operators of unmanned planes.

Landing one of those sites is considered key to supporting research, development, training and manufacturing for unmanned aerial systems. That is a priority goal for the Dayton region, home to Wright-Patterson Air Force Base and supporting defense contractors, research universities and expertise in propulsion, sensors, advanced manufacturing and lightweight composite materials.

The Dayton Development Coalition and its former president, Jim Leftwich, are helping promote Dayton and the state as sites for aerospace investment. Leftwich, who also served as Ohio’s development director under Gov. John Kasich, is still serving as an adviser to Kasich.

Congress has directed the FAA to establish a pilot project within six months to create the test zones, which would become operative in 2013.

At least 25 states are expected to apply for those sites, which will allow the government to gather information about flights of the unmanned aircraft. The competition potentially includes North Dakota, New Mexico, Florida, Arizona and other states with either military bases or universities supporting unmanned aircraft operations or research, manufacturing, open spaces or coastline.

The University of New Mexico has an unmanned aircraft program and flight range. New Mexico is also home to Kirtland Air Force Base, Energy Department laboratories and open spaces.

North Dakota hosts a cluster of UAS expertise. The University of North Dakota is home to Al Palmer and other UAS research officials. The nonprofit Unmanned Applications Institute International, also based in Grand Forks, N.D., is a partner with Sinclair Community College in an agreement signed in 2011 to share research and training curriculums, pursue grants and bring in other partners as needed.

Grand Forks Air Force Base, N.D., hosts a Global Hawk unmanned surveillance aircraft mission. The North Dakota Air National Guard, which Palmer formerly commanded, is flying MQ-1 Predator unmanned aircraft in Afghanistan.

Arizona is home for the Army’s Fort Huachuca, site of a training program for pilots who remotely fly the unmanned aircraft. The U.S. Department of Homeland Security flies patrol drones along the state’s border with Mexico, and Arizona has manufacturers for unmanned planes and components.

The Dayton region has a host of supporting companies. Among them are UA Vision, which makes small unmanned aircraft at the Tech Town complex in Dayton; Woolpert Inc., a provider of aerial mapping and related services; and Beavercreek contractor Defense Research Associates Inc., which is using Army and Air Force research funding to develop camera and software systems that would allow unmanned aircraft to detect oncoming manned planes and change course to avoid collisions.

Of the 50 or so companies in the Miami Valley that develop some part of a UAV’s system, about 10 of those are in Springfield, said Tom Franzen, economic development officer for the City of Springfield.

The Springfield Air National Guard Base also is transitioning to a mission remotely piloting MQ-1 Predators, a military UAV.

Joe Zeis, chief strategic officer and executive vice president of the Dayton Development Coalition, refers to the other UAS states as partners, not rivals. Until the FAA sets standards for defining the test sites, it is too soon to regard others as competitors, he said.

Dayton advocates said it is premature to attempt to forecast how many jobs could be created in the region if it is granted an FAA test site.

Congress has ordered the FAA to fully integrate flying of the remotely piloted planes by Sept. 30, 2015, into airspace now reserved for manned aircraft, including resolving safety concerns about avoiding collisions. The planes can range from 4-pound versions suitable for launching by hand, to much larger aircraft with wide wingspans, and can hover and watch an area for hours, or a day, sending back video and other information.

Industry advocates say markets for the planes’ commercial uses — including agriculture, law enforcement, disaster response, aerial mapping and others — could grow much larger than the current military appetite for surveillance and attack drones.

Farmers could use unmanned aerial systems for aerial inspection of crops on large farms to improve efficiency in monitoring for drought damage or crop diseases, said Al Palmer, director of the Center for UAS Research, Education and Training at the University of North Dakota, which is researching agricultural uses and other potential markets.

Ducks Unlimited, the waterfowl conservation organization, has called the center to ask whether the unmanned planes could be used to count ducks, Palmer said.

Fire departments could send the planes through smoke from a chemical plant fire to sample and report back on toxic pollutants that firefighters could encounter. In a disaster zone, the unmanned planes could provide aerial reconnaissance and support radio communications in places where storms or other forces knocked out communications infrastructure, Palmer said.

The same surveillance capabilities of unmanned aerial systems that appeal to the military and civilian law enforcement have prompted warnings about how the “eyes in the sky” could spy on Americans at their homes. The American Civil Liberties Union is urging Congress and the FAA to mandate protections for the privacy of citizens.

“Domestic drones are potentially extremely powerful surveillance tools,” the ACLU said as Congress was completing work on the FAA test-site legislation. “That power, like all government power, needs to be subject to checks and balances.”

Contact this reporter at (937) 225-2242.



Public not as anti-government as you might think

By Charles S. Clark

March 16, 2012


The general public is less hostile to government spending and to raising taxes than the average tea party enthusiast, but is assigning increasingly high priority to deficit reduction and supports such measures as freezing federal salaries, a leading polling analyst said Friday.

In a study of data from multiple national polls from the past two years, Michael Dimock, associate director of research for the Pew Research Center for People and the Press, offered a counterweight to some conventional readings of the nation’s mood on the question of “how much government do people want.”

Addressing a discussion organized by the National Academy of Public Administration, Dimock said the public’s preference for “smaller government providing fewer services” has inched upward since the 2008 election, to 48 percent, compared with only 41 percent who want a “bigger government providing more services.”

But taking a longer view, he added, those favoring smaller government declined in number since the mid-1990s from a high of 68 percent; those favoring bigger government grew from a low of 23 percent.

Asked about attitudes toward government pay and benefits, Dimock said the current federal salary freeze generally polled well. During times of economic uncertainty, he added, “most people have an impression that government jobs are good jobs, and their sense of fairness tells them that the uncertainty should be applied more equally.”

“The moment we’re in can hardly be called an extreme one” in terms of distrust, but the perception is that people turned against the government, Dimock said. The reasons for the disconnect include the rhetoric that ideological political strategists use in framing the results of the tumultuous 2010 elections and the fact that turnout for Republicans that year was 14 points higher than for Democrats. If one factors in the nonvoters’ preferences on the size and priorities of government, he said, the general public is less anti-government than the tea party.

Levels of trust in government, however, dropped steadily from just below 60 percent in 2002 to under 20 percent in 2010.

Exit polls from the largely anti-incumbent 2010 election show the public evenly split on the size of government, with 48 percent saying government is doing too many things and 43 percent saying government should do more to solve problems, the Pew data show.

But average Americans, like many politicians, are ambivalent about government and value many of the programs that would be cut under most deficit-cutting scenarios. The Pew data show the programs the public would most like to cut are “aid to the world’s needy,” military defense and unemployment aid. The programs least favored for cutting are education, veterans benefits, and Social Security and Medicare.

The biggest change in the past two years in priorities is in health care, where the number who would lower government spending rose 14 percentage points, the pollster said.

Not surprisingly, party affiliation affects views on spending priorities. Republicans are less likely to favor defense cuts and more likely to advocate reductions in environmental spending, for example. Democrats are more likely to favor new spending on roads and transportation and resist cuts to education.

The federal government fares the worst of the three levels of government in the public’s view, according to Pew data. Since 2002, the unfavorable rate for the federal government has risen from about 25 percent to 57 percent. At the state level, the unfavorable rate is 42 percent and for local governments only 32 percent. The biggest beef against the federal government is that it is “wasteful and inefficient” (70 percent agree this is a major problem), while only 46 percent agree with the assertion that government “interferes too much in people’s lives.”

Focusing on elected officials as opposed to agencies and workers, some three-fourths of respondents agree the government’s major problems include officials “not being careful with the government’s money, influenced by special interest money, care only about their own political careers, unwilling to work together and compromise, [and] out of touch with regular Americans.”

On regulation, a majority of 52 percent agree “regulation usually does more harm than good,” while only 40 percent believe regulation “is necessary to protect the public interest.” The sector the largest number believes is regulated too much is small business.

As for solutions to the fiscal crisis, “there’s actually a 50-50 divide among the American people” on whether reducing the deficit is more important than spending to help the economy recover, Dimock said. Reducing the deficit is an increasingly high priority among members of the public, if not as high as jobs creation.

But a clear majority, some 62 percent, say the solution lies in a combination of increasing taxes and cutting major programs. That contrasts with the political situation in Washington, which is deadlocked on that issue, Dimock noted.

Both in polls and in more extensive budget-cutting exercises in which researchers gauge the public’s willingness to make tough trade-offs, there is a surprising willingness to include tax increases in the solution — even among some tea party activists, according to Dimock and other research specialists who belong to NAPA.

Entitlement programs are “overwhelmingly viewed as good for the country,” the data show, and clear majorities want to retain current benefits. The government, however, gets only a fair-to-poor rating by the majority (56 percent) on how the programs are run. Clear majorities agree the finances of Social Security, Medicare and Medicaid are troubled and need major overhauls.

On taxes, the public appears focused on the fairness issue. Only 14 percent seemed concerned with paying large amounts in taxes themselves, while 51 percent expressed a “feeling that some wealthy people get away with not paying their share.” (Some 28 percent were most concerned about the tax code’s complexity.) In a startling change, in a Pew poll taken in December 2011, only 38 percent felt they were paying more than their fair share of taxes compared with 55 percent in 2000, before the enactment of the George W. Bush tax cuts.

Finally, a sizable group of Americans concur they themselves are part of the problem in the budget stalemate, with 56 percent agreeing “Americans aren’t willing to pay for things they want the government to do.”


Apple to pay dividend, start stock buybacks as it breaks open $98B piggy bank

Washington Post

By Associated Press, Published: March 18 | Updated: Monday, March 19, 7:39 PM

NEW YORK — Apple is finally acknowledging that it has more money than it needs. But don’t expect it to cut prices on iPhones and iPads. Instead, the company said on Monday that it will reward its shareholders with a dividend and a stock buyback program.

Apple, the world’s most valuable publicly traded company, sits on $97.6 billion in cash and securities.

The company has stockpiled the cash through a combination of great ideas and prudence. Apple spends money, to be sure, building data centers, buying parts for its products and pursuing ambitious projects such as a new 2.8-million-square-foot headquarters that has been likened to a spaceship. It also invests in the research and development of new technology and negotiates an occasional acquisition.

But Apple simply hasn’t managed to spend its earnings faster than people are lining up to buy its iPads, iPhones and other gadgets.

The decision to return some of that money to investors is a clear signal that Apple is taking a different approach in the post-Jobs era.

Former CEO Steve Jobs resisted calls to issue dividends for years. He argued that the money was better used to give Apple maneuvering room to acquire other companies, for instance. Apple did pay a quarterly dividend between 1987 and 1995, but Jobs was not involved with the company at the time.

Jobs died in October after a long fight with cancer.

Since then, pressure had been mounting on new CEO Tim Cook. Apple’s ever-growing pile of cash was earning a paltry amount of interest and the fact that it was sitting there unused could have left the company open to charges of mismanagement and possible shareholder lawsuits.

On Monday, Cook said that, with as much cash as Apple has on hand, a dividend won’t restrain the company’s options.

“These decisions will not close any doors for us,” he told analysts and reporters on a conference call.

Indeed, Apple can afford it. The dividend, which should placate shareholders, will cost about $10 billion the first year. Apple generated $31 billion in cash in the fiscal year that ended in September and analysts expect it to add another $70 billion to $85 billion this year.

Apple said it will pay a quarterly dividend of $2.65 per share, starting in its fiscal fourth quarter, which begins July 1.

The dividend works out to $10.60 annually, or 1.8 percent of the current stock price. Although Microsoft Corp. pays 2.5 percent of its stock price in dividends and Hewlett-Packard Co. pays 2 percent, analyst Tavis McCourt at Morgan Keegan said Apple’s dividend is relatively generous for a large technology company.
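The yield figures in the article follow from simple arithmetic. A back-of-the-envelope sketch (the share price here is an assumption for illustration; Apple traded near $590 around the time of the announcement):

```python
# Back-of-the-envelope check of the dividend figures reported in the article.
# Assumption: a share price of roughly $590, used only for illustration.
quarterly_dividend = 2.65
annual_dividend = quarterly_dividend * 4          # $10.60 per share per year
share_price = 590.0                               # assumed, approximate
yield_pct = annual_dividend / share_price * 100   # dividend yield in percent

print(f"Annual dividend: ${annual_dividend:.2f}")
print(f"Dividend yield: {yield_pct:.1f}%")        # about 1.8 percent
```

At that assumed price, the yield comes out to roughly 1.8 percent, matching the figure reported above.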

Energy and phone companies often pay dividends worth more than 5 percent of their stock price.

In absolute terms, Apple will pay one of the richest dividends in the U.S. The roughly $10 billion it will spend in its first year places it just below companies such as AT&T Inc. and Verizon Communications Inc., which are among the biggest spenders because they use dividends as their main way to attract investors.

Exxon Mobil Corp., the world’s second largest company by market capitalization, pays about $9 billion in dividends annually.

The dividend opens up ownership of Apple shares to a wider range of stock mutual funds, potentially boosting the stock price in the long term. Many “value-oriented” stock funds are not allowed to buy stocks that don’t pay dividends.

Apple said the $10 billion share buyback program will begin next fiscal year, which starts Sept. 30, and run for three years.

Investors had been expecting the announcement, driving Apple’s stock up 37 percent since management hinted in January that a dividend was in the works.

Buybacks are a popular alternative to dividends, since they reduce the number of shares outstanding. That means every remaining investor owns a larger share of the company.
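The mechanics behind that claim can be shown with a small worked example; the share counts below are hypothetical, chosen only to make the arithmetic visible, not Apple’s actual figures:

```python
# Why a buyback increases each remaining holder's stake.
# All numbers are hypothetical, for illustration only.
shares_outstanding = 1_000_000_000   # assumed total shares before buyback
my_shares = 1_000                    # one investor's unchanged holding

stake_before = my_shares / shares_outstanding

# The company buys back and retires 5% of its shares.
shares_after = shares_outstanding * 0.95
stake_after = my_shares / shares_after

print(f"Stake before buyback: {stake_before:.7%}")
print(f"Stake after buyback:  {stake_after:.7%}")
```

With 5 percent of shares retired, the same holding represents a proportionally larger slice of the company, which is the effect the article describes.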

In late morning trading, Apple’s stock rose $7.93, or 1.3 percent, to $593.50. Last week, the shares hit an all-time record of $600.01. Since Steve Jobs’ death on Oct. 5, Apple’s stock is up nearly 60 percent. The company is worth $553 billion.

McCourt raised his price target on Apple’s stock to $800 on Monday, becoming the first Wall Street analyst to do so. A dozen have price targets in the $700 range. He had been expecting the dividend, he said, and the main reason for the higher price target is the company’s tremendous profit growth.

The dividend and buyback announcement comes three days after the launch of Apple’s latest iPad tablet in the U.S. and nine other countries. Cook said sales the first few days set a record, but gave no details.

Cook said the company also considered splitting its stock and continues to look at that option. Stock splits increase the number of shares while reducing their value, potentially making it easier for small investors to buy them. But Cook said “there’s very little support” for the idea that stock splits help the stock overall.

Cook suggested that the dividend could have been larger if U.S. tax laws were different.

Cook said that as Apple analyzed how much it could give out to shareholders, it looked solely at the cash it has in the U.S. Like many big exporters, Apple has much of its cash overseas — some $64 billion, specifically.

Apple is reluctant to bring back overseas profits. In addition to being taxed in their respective countries, those profits would be subject to the 35 percent U.S. corporate tax rate.

“Current tax laws provide a considerable economic disincentive to U.S. companies that might otherwise repatriate a substantial amount of foreign cash,” Chief Financial Officer Peter Oppenheimer said.

Cook said Apple looked at how much domestic cash it had, then set aside enough for planned investments and unforeseen outlays. What was left over would be given out to shareholders, he said.

That suggests that if Apple could bring back its $64 billion in overseas money, the rewards to shareholders could be larger. Corporations have been clamoring for a change in tax laws, or a repeat of a 2004 tax amnesty on repatriated earnings.


U.S. War Game Sees Perils of Israeli Strike Against Iran



Published: March 19, 2012

WASHINGTON — A classified war simulation held this month to assess the repercussions of an Israeli attack on Iran forecasts that the strike would lead to a wider regional war, which could draw in the United States and leave hundreds of Americans dead, according to American officials.

The officials said the so-called war game was not designed as a rehearsal for American military action — and they emphasized that the exercise’s results were not the only possible outcome of a real-world conflict.

But the game has raised fears among top American planners that it may be impossible to preclude American involvement in any escalating confrontation with Iran, the officials said. In the debate among policy makers over the consequences of any Israeli attack, that reaction may give stronger voice to those in the White House, Pentagon and intelligence community who have warned that a strike could prove perilous for the United States.

The results of the war game were particularly troubling to Gen. James N. Mattis, who commands all American forces in the Middle East, Persian Gulf and Southwest Asia, according to officials who either participated in the Central Command exercise or who were briefed on the results and spoke on condition of anonymity because of its classified nature. When the exercise had concluded earlier this month, according to the officials, General Mattis told aides that an Israeli first strike would be likely to have dire consequences across the region and for United States forces there.

The two-week war game, called Internal Look, played out a narrative in which the United States found it was pulled into the conflict after Iranian missiles struck a Navy warship in the Persian Gulf, killing about 200 Americans, according to officials with knowledge of the exercise. The United States then retaliated by carrying out its own strikes on Iranian nuclear facilities.

The initial Israeli attack was assessed to have set back the Iranian nuclear program by roughly a year, and the subsequent American strikes did not slow the Iranian nuclear program by more than an additional two years. However, other Pentagon planners have said that America’s arsenal of long-range bombers, refueling aircraft and precision missiles could do far more damage to the Iranian nuclear program — if President Obama were to decide on a full-scale retaliation.

The exercise was designed specifically to test internal military communications and coordination among battle staffs in the Pentagon; in Tampa, Fla., where the headquarters of the Central Command is located; and in the Persian Gulf in the aftermath of an Israeli strike. But the exercise was written to assess a pressing, potential, real-world situation.

In the end, the war game reinforced to military officials the unpredictable and uncontrollable nature of a strike by Israel, and a counterstrike by Iran, the officials said.

American and Israeli intelligence services broadly agree on the progress Iran has made to enrich uranium. But they disagree on how much time there would be to prevent Iran from building a weapon if leaders in Tehran decided to go ahead with one.

With the Israelis saying publicly that the window to prevent Iran from building a nuclear bomb is closing, American officials see an Israeli attack on Iran within the next year as a possibility. They have said privately that they believe that Israel would probably give the United States little or no warning should Israeli officials make the decision to strike Iranian nuclear sites.

Officials said that, under the chain of events in the war game, Iran believed that Israel and the United States were partners in any strike against Iranian nuclear sites and therefore considered American military forces in the Persian Gulf as complicit in the attack. Iranian jets chased Israeli warplanes after the attack, and Iranians launched missiles at an American warship in the Persian Gulf, viewed as an act of war that allowed an American retaliation.

Internal Look has long been one of Central Command’s most significant planning exercises, and is carried out about twice a year to assess how the headquarters, its staff and command posts in the region would respond to various real-world situations.

Over the years, it has been used to prepare for various wars in the Middle East. According to the defense Web site, military planners during the cold war used Internal Look to prepare for a move by the Soviet Union to seize Iranian oil fields. The American war plan at the time called for the Pentagon to march nearly six Army divisions north from the Persian Gulf to the Zagros Mountains of Iran to blunt a Soviet attack.

In December 2002, Gen. Tommy R. Franks, who was the top officer at Central Command, used Internal Look to test the readiness of his units for the coming invasion of Iraq.

Many experts have predicted that Iran would try to carefully manage the escalation after an Israeli first strike in order to avoid giving the United States a rationale for attacking with its far superior forces. Thus, it might use proxies to set off car bombs in world capitals or funnel high explosives to insurgents in Afghanistan to attack American and NATO troops.

While using surrogates might, in the end, not be enough to hide Iran’s instigation of these attacks, the government in Tehran could at least publicly deny all responsibility.

Some military specialists in the United States and in Israel who have assessed the potential ramifications of an Israeli attack believe that the last thing Iran would want is a full-scale war on its territory. Thus, they argue that Iran would not directly strike American military targets, whether warships in the Persian Gulf or bases in the region.

Their analysis, however, also includes the broad caveat that it is impossible to know the internal thinking of the senior Iranian leadership, and is informed by the awareness that even the most detailed war games cannot predict how nations and their leaders will react in the heat of conflict.

Yet these specialists continue their work, saying that any insight on how the Iranians will react to an attack will help determine whether the Israelis carry out a strike — and what the American position will be if they do.

Israeli intelligence estimates, backed by academic studies, have cast doubt on the widespread assumption that a military strike on Iranian nuclear facilities would set off a catastrophic set of events like a regional conflagration, widespread acts of terrorism and sky-high oil prices.

“A war is no picnic,” Defense Minister Ehud Barak told Israel Radio in November. But if Israel feels itself forced into action, the retaliation would be bearable, he said. “There will not be 100,000 dead or 10,000 dead or 1,000 dead. The state of Israel will not be destroyed.”


What Hath Bell Labs Wrought? The Future

Book – ‘The Idea Factory,’ by Jon Gertner



Published: March 19, 2012

In today’s world of Apple, Google and Facebook, the name may not ring any bells for most readers, but for decades — from the 1920s through the 1980s — Bell Labs, the research and development wing of AT&T, was the most innovative scientific organization in the world. As Jon Gertner argues in his riveting new book, “The Idea Factory,” it was where the future was invented.

Indeed, Bell Labs was behind many of the innovations that have come to define modern life, including the transistor (the building block of all digital products), the laser, the silicon solar cell and the computer operating system called Unix (which would serve as the basis for a host of other computer languages). Bell Labs developed the first communications satellites, the first cellular telephone systems and the first fiber-optic cable systems.

The Bell Labs scientist Claude Elwood Shannon effectively founded the field of information theory, which would revolutionize thinking about communications; other Bell Labs researchers helped push the boundaries of physics, chemistry and mathematics, while defining new industrial processes like quality control.

In “The Idea Factory,” Mr. Gertner — an editor at Fast Company magazine and a writer for The New York Times Magazine — not only gives us spirited portraits of the scientists behind Bell Labs’ phenomenal success, but he also looks at the reasons that research organization became such a fount of innovation, laying the groundwork for the networked world we now live in.

It’s clear from this volume that the visionary leadership of the researcher turned executive Mervin Kelly played a large role in Bell Labs’ sense of mission and its ability to institutionalize the process of innovation so effectively. Kelly believed that an “institute of creative technology” needed a critical mass of talented scientists — whom he housed in a single building, where physicists, chemists, mathematicians and engineers were encouraged to exchange ideas — and he gave his researchers the time to pursue their own investigations “sometimes without concrete goals, for years on end.”

That freedom, of course, was predicated on the steady stream of revenue provided (in the years before the AT&T monopoly was broken up in the early 1980s) by the monthly bills paid by telephone subscribers, which allowed Bell Labs to function “much like a national laboratory.” Unlike, say, many Silicon Valley companies today, which need to keep an eye on quarterly reports, Bell Labs in its heyday could patiently search out what Mr. Gertner calls “new and fundamental ideas,” while using its immense engineering staff to “develop and perfect those ideas” — creating new products, then making them cheaper, more efficient and more durable.

Given the evolution of the digital world we inhabit today, Kelly’s prescience is stunning in retrospect. “He had predicted grand vistas for the postwar electronics industry even before the transistor,” Mr. Gertner writes. “He had also insisted that basic scientific research could translate into astounding computer and military applications, as well as miracles within the communications systems — ‘a telephone system of the future,’ as he had said in 1951, ‘much more like the biological systems of man’s brain and nervous system.’ ”

Mr. Gertner’s portraits of Kelly and the cadre of talented scientists who worked at Bell Labs are animated by a journalistic ability to make their discoveries and inventions utterly comprehensible — indeed, thrilling — to the lay reader. And they showcase, too, his novelistic sense of character and intuitive understanding of the odd ways in which clashing or compatible personalities can combine to foster intensely creative collaborations.

Mr. Gertner deftly puts these scientists’ work in the context of what was known at the time (and what would rapidly evolve from their initial discoveries in the decades since), even as he describes in remarkably lucid terms the steps by which one discovery led — sometimes by serendipity, sometimes by dogged work — to another, as well as the process by which ideas were turned by imaginative engineers into inventions and eventually into products that could be mass-produced.

Most notably, there’s the team that would win a Nobel Prize for its work on semiconductors and the transistor: the brilliant, aggressive physicist William Shockley (later to become infamous for his unscientific views on race), who “enjoyed finding a hanging thread so he could unravel a problem with a swift, magical pull”; the soft-spoken John Bardeen, who “was content to yank away steadfastly, tirelessly, pulling on various corners of a problem until the whole thing ripped open”; and Walter Brattain, “a skeptical and talkative experimentalist” who played extrovert to Bardeen’s introvert.

Restlessness and curiosity were traits shared by many of Bell Labs’ most creative staff members. Mr. Gertner describes John Robinson Pierce, father of the communications satellite, as an “instigator” who “had too many interests (airplanes, electronics, acoustics, telephony, psychology, philosophy, computers, music, language, writing, art) to focus on any single pursuit” but possessed a knack for pushing others to do their best work.

As for Shannon, the mathematician and engineer whose information theory laid the groundwork for telecommunications and the computer industry, he burned off excess energy by riding his unicycle up and down the long hallways of Bell Labs (sometimes juggling as he rode) and building whimsical machines like a primitive chess computer and an electronic mouse that could learn to navigate a maze, demonstrating the ability of a machine to remember.

Many Bell Labs scientists, including Brattain, Kelly and the Nobel Prize-winning physicist Charles H. Townes, who helped develop the principles of the laser, grew up on farms or in small towns, which Dr. Townes argued were the perfect “training grounds for experimental physics.” Such childhoods, he contended, taught a person how to “pay attention to the natural world, to work with machinery and to know how to solve practical problems and fix things innovatively, with what is on hand.”

Mr. Gertner nimbly captures the collegial atmosphere of Bell Labs and the mood of intellectual ferment — a blending of entrepreneurial zeal, academic inquiry and passion to achieve things that initially seemed technologically impossible — that suffused its New Jersey campuses.

The very success of Bell Labs, he notes, contained the seeds of its destruction. Not only was it producing too many ideas for a single company to handle, but some of its innovations (like the transistor) also altered the technological landscape so much that its core business would be reduced to a mere part of the ever-expanding field of information and electronic technology — a field increasingly dominated by new rivals, with which a post-monopoly AT&T had difficulty competing.

In addition, as a Bell Labs researcher named Andrew Odlyzko observed, the new business environment meant that “unfettered research” was no longer a logical or necessary investment for a company, which, in Mr. Gertner’s words, “could profit merely by pursuing an incremental strategy rather than a game-changing discovery or invention.”

AT&T’s original mission — to create and maintain a system of modern communications — has largely been fulfilled. And according to Mr. Gertner, the current Bell Labs president, Jeong Kim, believes that the future of communications may be defined by an industry yet to be created: a business that does not simply deliver or search out information, but also somehow manages and organizes the vast flood of data that threatens to overwhelm our lives.

The larger idea, Mr. Gertner concludes, is that “electronic communication is a miraculous development but it is also, in excess, a dehumanizing force. It proves Kelly’s belief that even as new technology solves one problem, it creates others.”


How bad would Iran be with the bomb?

By Walter Pincus, Published: March 19 | The Washington Post


Which would be worse if sanctions and diplomacy fail: the aftermath of an Israeli or U.S. attack on Iran to set back its nuclear program, or the Tehran regime having the bomb?

Of course, one hopes the sanctions/diplomacy route succeeds. But what if it doesn’t?


If you measure the level of public discussion, hands down the worst would be having Iran’s supreme leader, Ayatollah Ali Khamenei, and/or President Mahmoud Ahmadinejad armed with nuclear weapons.

However, within the intelligence community and among its retirees, there are some experienced analysts who believe that Iran’s leaders with nuclear weapons wouldn’t be much different from what they are today, with their first concern being holding on to power, not using a weapon to wipe out Israel and thereby bring about their own destruction.

That approach has been sensibly argued by Paul Pillar, a former senior CIA intelligence analyst and a national intelligence officer for the Near East and South Asia from 2000 to 2005. He was deeply involved back then when internal doubts about Saddam Hussein’s weapons of mass destruction programs were low-keyed by CIA leaders and ignored by the George W. Bush White House.

“An Iran with a bomb would not be anywhere near as dangerous as most people assume, and a war to try to stop it from acquiring one would be less successful and far more costly than most people imagine,” Pillar writes in the current issue of Washington Monthly.

Pillar, who teaches at Georgetown University, points out that despite all the “bellicosity and political rhetoric” about the issue, the idea of an Iran with the bomb “has been subjected to precious little careful analysis.” Conventional wisdom is that Tehran’s leaders would become more dangerous to their neighbors and the United States, Pillar states.

He cites the repeated stereotyping that Iran’s rulers are “religious fanatics who value martyrdom more than life, cannot be counted on to act rationally and, therefore, cannot be deterred.” Pillar notes that the past 30 years have proved that although they promote martyrdom to defend the homeland, “they have never given any indication of wanting to become martyrs themselves.”

Pillar says that since the 1979 revolution against Shah Mohammad Reza Pahlavi, the Islamic Republic of Iran has conducted a “cautious” policy toward the world. He acknowledges targeted assassinations in the 1980s and 1990s of exiled dissidents, but avoids mentioning Tehran’s anti-Americanism, its threats to Israel and its support of Hamas and Hezbollah, groups the United States and Israel consider terrorist organizations. He also fails to mention Iran’s military aid to dissident forces in Iraq.

Of course, Americans forget that the United States and Britain overthrew the popularly elected government of Iranian Prime Minister Mohammad Mosaddegh in August 1953 — something all Iranians remember. Americans also ignore Washington’s open policy of “regime change” in Tehran, promoted most prominently during the Bush years.

There is no lack of bitterness on both sides. That may prevent Americans from weighing Pillar’s cold analysis that “Iran’s rulers are constantly balancing a very worldly set of strategic interests” and from thinking “principles of deterrence are not invalid just because the party to be deterred wears a turban and a beard.”

There are two other possible dangers associated with Iran having the bomb — it would arm terrorists, or it would feel shielded and become more generally aggressive. The Bush administration used the former to help build support for invading Iraq: Saddam Hussein would give a nuke to terrorists.

As the CIA argued in 2002 about Hussein, Pillar says Iran’s leaders have no incentive to lose control over a nuke. In Iran’s case, any use by terrorists would be traced to Tehran and bring swift retaliation. Tehran, he argues, would use nukes only in self-defense.

As for making Iran bolder in supporting terrorist groups, Pillar argues that Tehran’s main reason for obtaining the bomb is “in deterring aggression against one’s own country.”

Pillar also questions why arguments for an Israeli or U.S. attack on Iran to set back its nuclear program rely on a “best case” scenario, in which Tehran’s response would be limited, while only a “worst case” analysis is made of Iran getting the bomb. If the armed attack by Israel or the United States is analyzed under “worst case” scenarios, Pillar says, “we would be hearing about a regional conflagration involving multiple U.S. allies, sucking in U.S. forces beyond the initial assault.”

He said such an attack also “would be an immediate political gift to Iranian hard-liners.”

An attack on Iran’s nuclear facilities of course would disrupt oil markets and raise gas prices. Look at what just the threat of such an attack is doing.

“War or a world with an Iranian bomb are not the only alternatives,” Pillar says. Talks are planned; diplomacy plus sanctions are still in play.

Even if Iran gets a bomb, “Israel would retain overwhelming military superiority with its own nuclear weapons — which international think tanks estimate to number at least 100 and possibly 200,” Pillar says. With its military assets, Israel “would continue to outclass by far anything Iran will have,” he concludes.



GOP budget plan includes pay freeze extension, workforce reduction

By Amanda Palleschi

March 20, 2012

Congressional Republicans proposed in their fiscal 2013 budget plan extending the federal pay freeze through 2015, cutting the size of the federal workforce by 10 percent and increasing employee contributions to retirement plans.

The plan, unveiled Tuesday by House Budget Committee Chairman Paul Ryan, R-Wis., claims that reforming federal retirements and cutting the size of the workforce would save taxpayers approximately $368 billion over 10 years.

“The reforms called for in this budget aim to slow the federal government’s unsustainable growth and reflect the growing frustration of workers across the country at the privileged rules enjoyed by government employees,” the proposal states. The plan describes the 10 percent reduction to the federal workforce as a “sensible attrition policy” to reduce bureaucracy without resorting to layoffs. It does not detail how Republicans would change federal employees’ retirement contributions, but “asks federal employees to make a more equitable contribution to their retirement plans.”

The Republican budget request is a response to President Obama’s budget proposal, which would end the two-year federal pay freeze by giving civilian workers a 0.5 percent pay hike in 2013. Under payroll tax cut legislation signed in February, employees with less than five years of federal service and hired after Dec. 31, 2012, will have to pay 2.3 percent more toward their pensions, making their total defined benefit plan contributions 3.1 percent of their pay.

According to a statement from the National Active and Retired Federal Employees Association, the $368 billion in cuts to federal compensation would cost employees more than half of what the payroll tax extension deal and the current federal pay freeze cost them combined.

Although the plan acknowledges the “important work” of the federal workforce, it cites figures that suggest public employee compensation outpaces that in the private sector and calls federal workers’ compensation “immune from the effects of the recession.”

The American Federation of Government Employees disagrees: “Federal employees already have had their pay frozen for two consecutive years, an unprecedented action that will save the government $60 billion over 10 years,” AFGE President John Gage said in a statement Tuesday. “It is fundamentally wrong for federal employees to be required, yet again, to serve as the automated teller machine for the nation. Enough is enough.”

National Treasury Employees Union President Colleen Kelley took issue with both the substance and rhetoric in the Republican proposal in her statement Tuesday.

The Republican-proposed reforms could make federal jobs less attractive and a smaller workforce could lead to “a substantial increase in the use of unaccountable private contractors and a much higher cost of providing services,” Kelley said.

“It is almost as though the authors of this budget don’t know, don’t understand or don’t care about the key role federal employees play in helping keep our nation safe, ensuring that our food and medicines are safe and effective, that our air and water are safe, and performing so many other services that people not only expect and want, but need as well,” she said.

The Ryan budget also came in for tough criticism at a Tuesday panel discussion at the Center for American Progress Action Fund. Gene Sperling, director of the White House National Economic Council, and Rep. Chris Van Hollen, D-Md., ranking member of the House Budget Committee, assailed the plan as “skewed toward the wealthy at the expense of the middle class” and society’s most vulnerable.

Asked about the proposed 10 percent cuts in the federal workforce, Van Hollen said, “We’ve consistently seen a Republican assault on the public servants who do the nation’s business every day, from the CIA professionals who found Osama bin Laden to the National Institutes of Health people who investigate cures for disease, to the Food and Drug Administration people who protect food safety.” He said Ryan’s latest plan would “take a hatchet across the board,” even though the size of the federal workforce as a percentage of the population has been reduced “and has become more efficient.”  The Republican strategy’s  “simple-sounding solutions,” Van Hollen added, would include cuts that would apply to Pentagon procurement specialists and auditors, and reducing them would be “a sure-fire way to get more-expensive contracts and more fraud and abuse.”

Sperling said he was also “disappointed” that Ryan’s plan proposes relieving the Defense Department from the Budget Control Act’s threat of sequestration just months after negotiations produced what both parties had agreed was an “offensive” threat of “mutually assured destruction.” Such a move, he said, would reverse for ideological reasons the one area of bipartisan consensus that is in the legislation and thus “actually create an incentive not to compromise.”

Charles S. Clark contributed to this story.


NSA head: Defense Department weathered ‘year of the hacker’


By Josh Smith, National Journal 03/20/2012

While 2011 could be considered the “year of the hacker,” government efforts prevented cyberattacks from having any major effect on Defense Department systems, Gen. Keith Alexander, head of U.S. Cyber Command, planned to tell a House subcommittee on Tuesday.

Still, cyberthreats remain a serious concern, Alexander said in testimony prepared for the House Armed Services Subcommittee on Emerging Threats and Capabilities.

“We believe it is only a matter of time before someone employs capabilities that could cause significant disruption to civilian or government networks and to our critical infrastructure here in the United States,” said Alexander, who oversees the National Security Agency.

With parts of the defense budget on the chopping block, officials have largely been successful in preventing cuts to cybersecurity.



New iPad: Consumer Reports says tablet runs hot

Washington Post

By Hayley Tsukayama

The new iPad reaches temperatures of around 116 degrees — about 13 degrees warmer than the iPad 2 — according to heat tests conducted by Consumer Reports and released Tuesday.

The magazine found that the heat tended to be most concentrated on the lower left-hand corner of the tablet.

In its first look at Apple’s newest iPad released on Friday, Consumer Reports said that the tablet was “shaping up to be the best tablet we’ve ever tested, whether from Apple or any other manufacturer.” The short review went on to say that the display, graphics power and camera were all worthwhile upgrades to the tablet. “Apple has taken an excellent product and improved it significantly, if not dramatically.”

But chatter on Apple’s support forums and on tech blogs prompted Consumer Reports to release a statement saying that it would pay specific attention to the heat issue as it continued to review the iPad. The magazine famously withheld its “recommended” rating from the iPhone 4 because of the “Antennagate” problem — the weak signal strength that users suffered when they held the iPhone in a certain way. The magazine has yet to say whether it will recommend the new iPad.

As users tried out the updated tablet over the weekend, several reports began to crop up that Apple’s hot new product was running a fever. Users have reported that the new iPad is noticeably warm to the touch after it’s been used for a while, unlike its predecessors. Others have noted that the heat seems to be concentrated in a “warm corner,” to the left of the home button. In some cases, users said it was so hot that it was uncomfortable to hold.

Consumer Reports reviewer Donna Tapellini said that in tests even when the tablet was “at its hottest, it felt very warm but not especially uncomfortable” when held for a short period of time. Tapellini said that the iPad seems to be a bit hotter when it’s plugged in.

Apple spokeswoman Natalie Harrison said that anyone who is concerned about the heat from their iPad should let the company know.

“The new iPad delivers a stunning Retina display, A5X chip, supports 4G LTE plus 10 hours of battery life all while operating well within our thermal specifications,” Harrison said. “If customers have any concerns, they should contact AppleCare.”

The health effects of high electronics use have been an area of concern in the past. A 2004 study suggested that men who take “laptop” literally could be damaging their fertility, since heat has a negative effect on sperm production. In 2010, another study linked laptop use to “toasted skin syndrome,” a mottled-skin pattern that appears on the thighs of users who keep their computers on their legs for too long. Cellphone radiation has been a perennial concern. The city of San Francisco even passed an ordinance — ultimately removed — that required retailers to post cellphone radiation levels. In May 2011, the World Health Organization said that cellphones are “possibly carcinogenic.”



White House might finally go wireless


By Bob Brewin 03/22/2012


President Obama and the White House staff can tap into some of the most sophisticated communications systems in the world, but not the wireless networks found in many U.S. households — a technology gap the Defense Information Systems Agency wants to fix.

DISA, in a notice to industry posted on the Federal Business Opportunities website on Wednesday, said the White House Communications Agency, which it manages, is looking for a White House campus Wi-Fi network. The network would serve 60 buildings, along with a half mile of three roads next to the White House and several outdoor areas.

The detailed sources sought notice said the lack of a Wi-Fi network has affected the daily operations of White House personnel who have “embraced the use of a mobile virtual architecture.”

The staffs of the president, vice president, National Security Council and Secret Service “operate in a mobile environment away from more traditional land line/fixed cable infrastructure,” the sources sought notice stated. “In order to fulfill this mission WHCA must provide wireless connectivity in fixed, mobile and temporary environments.”

DISA said the White House campus network should support access to Defense Department unclassified networks and the commercial Internet, with both services transmitted from the same access points using separate service set identifiers, which broadcast the name of each network.

The planned Wi-Fi environment should allow White House staffers to move from one access point to another without losing a signal, DISA said.

The agency specified the latest Wi-Fi standard, 802.11n, which has a raw data rate of 600 megabits a second and backward compatibility with the older 802.11 a/b/g standards.

The 802.11 a and g standards provide maximum data throughput of 54 megabits per second and the 802.11b standard transmits data at a rate of 11 megabits per second. The White House Wi-Fi network needs to operate in the unlicensed 5-GHz frequency band that access points and clients on the 802.11a standard use as well as the 2.4-GHz frequency band for 802.11 b/g devices, DISA said.
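Those raw data rates put the generational gap in perspective. As a rough sketch (the rates are the standards' theoretical maximums quoted above; real-world throughput is typically well under half that, and the 100 MB file size is an arbitrary example, not anything from the DISA notice):

```python
# Best-case time to move a 100 MB file at each 802.11 standard's
# raw data rate. Actual throughput is considerably lower.
RAW_MBPS = {"802.11b": 11, "802.11a": 54, "802.11g": 54, "802.11n": 600}

def transfer_seconds(file_mb, raw_mbps):
    """Seconds to transfer file_mb megabytes at raw_mbps megabits per second."""
    return file_mb * 8 / raw_mbps  # 8 bits per byte

for std, rate in sorted(RAW_MBPS.items()):
    print(f"{std}: {transfer_seconds(100, rate):.1f} s")
```

Even as a theoretical ceiling, the jump from 54 to 600 megabits per second cuts the transfer time by more than a factor of ten, which is why the notice specifies 802.11n with backward compatibility rather than the older standards alone.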

Howard Robertson, federal sales manager for Wi-Fi access point vendor Tropos Networks, said his company has supplied this kind of campus network to corporations. He estimated it would take about 135 access points to meet the White House coverage requirements — two access points per building for a total of 120, plus another 15 for outdoor coverage.


Most 2011 cyberattacks were avoidable, Verizon says

Despite all the hype about sophisticated attack methods, 97% could have been stopped using fundamental precautions

Jaikumar Vijayan

March 22, 2012 (Computerworld)

Despite rising concern that cyberattacks are becoming increasingly sophisticated, hackers used relatively simple methods in 97% of data breaches in 2011, according to a report compiled by Verizon.

The annual Verizon report on data breaches, released Thursday, also found that in a vast majority of attacks (80%), hackers hit victims of opportunity rather than companies they sought out.

The findings suggest that while companies are spending increasing sums of money on sophisticated new security controls, they are also continuing to overlook fundamental security precautions.

The conclusions in the Verizon report are based on investigations into more than 850 data breaches. The report was compiled with the help of the U.S. Secret Service and law enforcement agencies in the United Kingdom, The Netherlands, Ireland and Australia, Verizon said.

Verizon said it found that attacks by so-called “hacktivist” groups such as Anonymous for the first time compromised more records — more than 100 million — than attacks by hackers specifically seeking financial or personal data.

Data breach victims and security vendors generally tend to describe attacks as highly sophisticated and involving a great deal of expertise on the part of hackers.

The Verizon report though shows a far more mundane reality.

Most of the breaches didn’t require hackers to possess special skills or resources, or to do much customization work. In fact, Verizon said that 96% of the attacks “were not highly difficult” for the hackers.

“Additionally, 97% were avoidable, without the need for organizations to resort to difficult or expensive countermeasures,” the report said.

Very often, the companies breached had no firewalls, had ports open to the Internet or used default or easily guessable passwords, said Marc Spitler, a Verizon security analyst.

The study found that cybercriminals did not have to work any harder to break into a large organization than into a small one.

Attackers in 2011 generally didn’t need new sophisticated tools to break into most organizations, Spitler said.

“We have seen nothing new. Some of the old standbys are continuing to work very well for the people going after information,” he said. “Not enough has been done to raise the bar and to force them to spend” significant sums on new tools and exploits.

The most sophistication found by the researchers was in the methods used by attackers to steal data after breaking in to systems, he said.

Attackers typically have installed malware on a victim company’s network to escalate privileges, set up backdoors, enable remote control and sniff out sensitive data. Many take steps to remain hidden on the network for a long time and then wipe their tracks when they are done.

Such tasks require moderate to advanced skills and extensive resources on the part of the attackers, according to Spitler. “That is one area where we have raised the bar,” he said.

Most of the targeted attacks last year were directed at large companies in the finance and insurance industries, according to Verizon.

Hackers, often part of organized groups, used large-scale automated methods to find vulnerable businesses to exploit.

In such cases, more than 85% of victim companies had fewer than 1,000 employees and were mostly in the retail, hospitality and food services industries.

The findings once again highlight the need for companies to pay attention to security basics, Spitler said.

“It is about going back to basic security principles. A lot of the same recommendations we have used in past years, we have recommended this year,” he said.

Jaikumar Vijayan covers data security and privacy issues, financial services security and e-voting for Computerworld. Follow Jaikumar on Twitter at @jaivijayan, or subscribe to Jaikumar’s RSS feed.


Congress approves bill requiring feds to post financial statements online

Federal Times

By SEAN REILLY | Last Updated: March 22, 2012


Congress on Thursday gave final approval to legislation requiring the financial disclosure reports of some 28,000 federal officials to be posted on the Internet by August.

The Senate voted 96-3 to send the Stop Trading on Congressional Knowledge (STOCK) Act to President Obama for his signature. The measure passed the House last month.

While the bill’s main purpose is to ensure that lawmakers and their staffs don’t use non-public insider knowledge when dealing in stocks and other investments, it also mandates posting the disclosure statements — known as Office of Government Ethics (OGE) Form 278s — on a public website. Some 28,000 filers, including presidential appointees, Senior Executive Service members, and general and flag officers, would fall under the law, according to a recent analysis by the ethics office. Although the annual statements are already public, they are typically available only in paper form in response to a request.

Once Obama signs the bill, agencies must put the disclosure statements online on their own sites, the analysis said. Eventually, the bill requires the ethics office to create a searchable database on its site.

The ethics office last week posted online the disclosure reports for some 900 presidential appointees and nominees requiring Senate confirmation, as well as presidential candidates. But the analysis says the cost of an electronic filing system to meet the bill’s requirements would exceed the ethics office’s entire annual budget of about $14 million.




LTE option poses data dilemma for iPad, smartphone users

Costs of using fast cellular networks push users to Wi-Fi

Matt Hamblen

March 23, 2012 (Computerworld)

The new iPad’s LTE option, which allows access to fast 4G networks, has shocked some customers who found they can eat up an entire month’s worth of data watching just a couple hours of streaming video.

For a long time, analysts and even carriers have urged customers to download videos and other large files over Wi-Fi to avoid the high price of using a cellular connection.

But that hasn’t stopped owners of the new iPad and some recent LTE Android-based smartphones like the Samsung Galaxy Nexus from leveling renewed criticism at carriers like AT&T and Verizon Wireless for imposing high data fees.

In December, Computerworld illustrated the problem by downloading a single 128-minute HD movie over LTE to a Galaxy Nexus. The 3.7GB movie file took less than two hours to download.

The new iPad adds a whole new level of temptation, since it features a high-resolution 9.7-in. “Retina” display with 3.1 million pixels, making it a great way to view everything from movies to online games to televised sports. A fast LTE network offers a consistent connection for streaming video and faster download capabilities than 3G networks.

Data price plans for tablet users start at $30 a month for 3GB of data on AT&T or 2GB on Verizon. Using that pricing model, a single HD movie download would cost $50 over Verizon (the $30 plan for 2GB, with two $10-per-GB overage charges) or $40 on AT&T.
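The article's figures follow from simple overage arithmetic. A minimal sketch, assuming a $30 base plan and $10-per-GB overage billed in whole-gigabyte increments (which is what the "two $10-per-GB overage charges" for a 1.7GB overrun implies):

```python
import math

def download_cost(file_gb, plan_gb, base=30.0, overage_per_gb=10.0):
    """Monthly cost if a single download of file_gb lands on a fresh plan_gb plan.

    Overage is billed per whole gigabyte started, hence math.ceil.
    """
    overage_gb = max(0.0, file_gb - plan_gb)
    return base + math.ceil(overage_gb) * overage_per_gb

print(download_cost(3.7, plan_gb=2))  # Verizon's 2GB plan -> 50.0
print(download_cost(3.7, plan_gb=3))  # AT&T's 3GB plan -> 40.0
```

The 3.7GB movie exceeds Verizon's 2GB allotment by 1.7GB, triggering two $10 charges, while on AT&T's 3GB plan the 0.7GB overrun triggers only one.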

As a result, some customers are asking: Why have a great new smartphone or tablet running over LTE if you can only watch video for such a short amount of time?

“People aren’t going to stand around forever and just deal with limitations on data usage, and it’s not incumbent on the user to monitor his own bandwidth consumption,” wrote Erik Fecher in a comment to a Computerworld story. “It’s completely the responsibility of the service provider to accommodate it.”

The Wall Street Journal this week also found new iPad customers who quickly zipped through their respective data plans, with the issue generating similar comments.

Neither Verizon nor AT&T would comment on data usage costs. However, Verizon did offer a number of “tips” via email for consumers seeking “to manage their data allowance on…any mobile device.”

The Verizon tips, not surprisingly, include using “Wi-Fi to help extend the life of your data plan” at home or elsewhere. “Checking email is not a huge use of data, but streaming video is, so you may want to use Wi-Fi when streaming video,” Verizon said.

The carrier also has a data calculator, and urges users to download a data widget for Verizon LTE tablets running Android. The widget is currently not available for the iPad.

The wireless industry clearly sees the need to protect its networks.

“Carriers need to keep the data usage rates fairly high to avoid being overwhelmed by the users of these content-crazy devices,” said Jack Gold, an analyst for J. Gold Associates. “So I wouldn’t expect carriers to be making data more attractive anytime soon.”

Rob Enderle, an analyst at Enderle Group, said the “core of the problem is under-capacity of networks…. If you are a carrier basically selling more [bandwidth] than you have, you aren’t really motivated to reduce prices, because that would drive up demand and you already don’t have enough.”

Enderle, Gold and other analysts said many consumers already avoid buying LTE and 3G tablets to keep their costs low, resorting to Wi-Fi-only devices.

Just one in 10 tablets sold uses a cellular connection, analyst Chetan Sharma reported this week. Many tablet owners already have a smartphone and don’t want to pay for both services, he noted.

The controversy has raised a number of possible scenarios for carriers. One approach would be to set up family and company group plans in which 3G or 4G wireless data use for a given month is shared across a group of people and their various devices.

“I do expect to see data sharing plans [from carriers] by the end of the year or early next year,” Gold said.

Another approach would wrap the data costs of an online app, movie or sporting event into the overall cost of the app. Carriers could meter this data by using technology similar to 800 calling services where the vendor pays for the data. On-demand wireless video services are already emerging that include the cost of data.

Currently there’s not a mechanism to allow an app provider to include the cost of data usage when selling an app, Enderle said. But Amazon’s early e-reader devices — not the Kindle Fire — do allow users to download a book with the wireless download cost bundled with the book. “Eventually, some apps will have connectivity as part of the package,” he predicted.

A third approach is emerging: cellular-to-Wi-Fi roaming technology, which would make it seamless for a tablet user on 3G or LTE to roam onto a Wi-Fi hotspot.

Carriers aren’t expected to allow their customers to roam to Wi-Fi from cellular totally free, but it isn’t clear how much they would charge, analysts said.

The Hotspot 2.0 technology initiative has begun a second round of testing smartphones, tablets and laptops with embedded software that handles secure and seamless cellular-to-Wi-Fi handoffs, said Niels Jonker, CTO of Boingo Wireless. Boingo runs hotspots in many airports and sits on the board of the Wireless Broadband Alliance, which is promoting the Hotspot 2.0 initiative.

Jonker said software updates for existing phones and software embedded in new phones will be available in the early part of 2013. “The time to mass adoption is pretty close, not five years off,” he said.

The handoff from cellular to Wi-Fi with Hotspot 2.0 would work on smartphones, tablets and laptops and “would be fully automatic and part of a customer’s plan,” he said.

“The cellular carriers are very much into this, and are motivated,” he added. “They understand full well that there’s no way they can service a crowded place like a stadium full of people taking and sending photos at once in any other way [than with Wi-Fi] and even LTE won’t scale to do that for them.”

Jonker said it is ironic that the large carriers describe LTE as offering so much more overall network capacity, yet continue to testify to the Federal Communications Commission about their need for more spectrum. While Cisco has estimated there will be a 27-fold increase in wireless data growth in the next two years, LTE is expected to add only a five-fold increase over the prior wireless network capacity.
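The gap between those two multipliers is the crux of Jonker's argument. A back-of-the-envelope sketch, using only the 27x and 5x figures from the article:

```python
# If demand grows 27-fold while LTE lifts cellular capacity only 5-fold,
# the remainder must be absorbed by something else, such as Wi-Fi offload.
demand_growth = 27.0    # Cisco's projected wireless data growth
capacity_growth = 5.0   # LTE's added capacity over prior networks

shortfall_factor = demand_growth / capacity_growth
print(f"Demand would outstrip LTE capacity by {shortfall_factor:.1f}x")
```

Even granting LTE its full projected gain, demand would still outrun cellular capacity by roughly a factor of five, which is the case for offloading to Wi-Fi.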

“We need a bigger solution such as Wi-Fi offloading, and there’s not a carrier not on board with that,” Jonker said.

As to whether Wi-Fi offloading could potentially cut into carrier revenues, Jonker said that the biggest economic consideration for carriers should be to “keep their users happy. And they can’t build out their [cellular] networks cheaply enough to do that.”

Matt Hamblen covers mobile and wireless, smartphones and other handhelds, and wireless networking for Computerworld. Follow Matt on Twitter at @matthamblen or subscribe to Matt’s RSS feed.



Historical Data on Price of Gasoline & Oil

[Charts: USA Average Retail Price 2004-2012; USA Average Retail Price 2011-2012; Dayton OH Average Retail Price 2004-2012; Dayton OH Average Retail Price 2011-2012]
