
July 13 2013




Snowden affair clouds U.S. attempts to press China to curb cyber theft


Mon, Jul 8 2013

By Paul Eckert


WASHINGTON (Reuters) – Revelations by former U.S. spy agency contractor Edward Snowden will make it harder for the United States to confront China at talks this week over the alleged cyber theft of trade secrets worth hundreds of billions of dollars each year.

Snowden’s disclosures of American electronic surveillance around the world give China an argument to counter U.S. complaints that it steals private intellectual property (IP) from U.S. companies and research centers.

Cyber security is at the center of high-level meetings between the two countries in Washington that will show whether a positive tone struck by President Barack Obama and new Chinese President Xi Jinping at a summit last month can translate into cooperation on difficult issues.

Top U.S. officials from Obama down have long tried to convince China to recognize a clear line between the kind of cyber espionage by spy agencies revealed by Snowden and the stealing of technology.

“This Snowden thing has muddied the waters in a terrible way,” said James McGregor, author of a book on China’s authoritarian capitalism and industrial policy.

“China would rather have the waters muddy, because they can say ‘You do it. We do it. What’s the big deal?’ and the cyber theft against companies will go on and on,” he said by telephone from China, where he is senior counselor for APCO Worldwide, a U.S. business consultancy.

Treasury Secretary Jack Lew said last week that U.S. officials will press China at the talks on cyber theft, a problem he described as “just different from other kinds of issues in the cyber area.”

Many countries spy on each other, but U.S. officials say China is unique in the amount of state-sponsored IP theft it carries out as it tries to catch up with the United States in economic power and technological prowess.

Last week the U.S. Department of Justice charged Chinese wind turbine maker Sinovel Wind Group Co and two of its employees with stealing software source code from U.S.-based AMSC in an alleged theft worth $800 million.

The U.S. Chamber of Commerce hopes “to see a clear indication that China recognizes theft of trade secrets, whether by cyber or other means, is stealing property and will bring the full force of its laws to curb this,” said Jeremie Waterman, the group’s senior director for Greater China.

Beijing regularly parries complaints about Chinese hacking into the computers of U.S. businesses by saying that China is itself a major victim of cyber espionage. Chinese officials have dismissed as unconvincing recent U.S. official and private-sector reports attributing large-scale hacking of American networks to China.

China’s official Xinhua news agency last month said the Snowden case showed the United States was “the biggest villain in our age” and a hypocrite for complaining about Chinese cyber attacks.


China’s stance appears to be bolstered by Snowden’s revelations of widespread surveillance by the National Security Agency and his assertion that the agency hacked into critical network infrastructure at universities in China and in Hong Kong.

Snowden first fled to Hong Kong before his leaks to newspapers became public last month, and has subsequently gone to Moscow. He is believed to be holed up in the transit area of the city’s Sheremetyevo International Airport and has been trying to find a country that would give him sanctuary.



Now in their fifth year, the annual U.S.-Chinese talks, known as the Strategic and Economic Dialogue, will cover topics from U.S. concerns about North Korea’s nuclear weapons and expanding U.S.-China military ties to climate change and access to Chinese financial markets.

China’s exchange-rate policy is on the agenda, although it has receded as an issue with the gradual strengthening of the yuan and a reduction of huge current account imbalances.

This year Secretary of State John Kerry and Lew host Chinese State Councilor Yang Jiechi and Vice Premier Wang Yang for the first S&ED session since China’s once-a-decade leadership change in March when Xi took over.

The meetings follow Obama’s summit last month with Xi in California, where the two men developed what aides called a productive relationship. Nevertheless, Obama demanded Chinese action to halt what he called “out of bounds” cyber spying.

Civilian and military officials from the two countries discussed international law and practices in cyberspace at low-level talks on Monday. Cyber security is due to come up at other meetings throughout the week that will also likely address U.S. accusations that Beijing gained access electronically to Pentagon weapons designs.

IP theft costs U.S. businesses $320 billion a year, equal to the annual worth of U.S. exports to Asia, authors of a recent report say.

A bipartisan group of high-ranking former U.S. officials known as the Commission on the Theft of American Intellectual Property said in a May report that China accounts for between 50 percent and 80 percent of IP theft suffered by U.S. firms.

Cyber theft of industrial designs, business strategies and trade secrets is only a portion of IP pilfering.

IP theft more commonly involves “planted employees, bribed employees, employees who were appealed to on the basis of nationalism and all the traditional means of espionage, often accompanied by cyber,” said Richard Ellings, president of the National Bureau of Asian Research think tank, who co-wrote the report.

The U.S. District Court in Manhattan charged three New York University researchers in May with conspiring to take bribes from Chinese medical and research outfits for details about NYU research into magnetic resonance imaging technology.

Arrests by U.S. Immigration and Customs Enforcement and the Homeland Security Department for IP infringements rose 159 percent and indictments increased 264 percent from 2009 to 2013, according to a report released in June by the U.S. Intellectual Property Enforcement Coordinator.

The Commission on the Theft of American Intellectual Property called for tough penalties including banking sanctions, bans on imports and blacklisting in U.S. financial markets.



Special Report: Cyber Priorities

Snowden Incident Returns Spotlight to Employee Danger


Defense News

Jul. 9, 2013 – 06:00AM



WASHINGTON — Edward Snowden, the leaker currently stuck in Russia who disclosed a wide range of secrets about US government surveillance and spying, has changed the conversation about cybersecurity. Not because of the documents he released, but as a reminder of the vulnerability organizations have to the threat of insiders with access to large swathes of information and system components.

It’s a lesson that was the talk of the cyber community following the WikiLeaks disclosures allegedly carried out by Bradley Manning, but one that faded as experts began to focus on the growing threat of foreign governments, particularly China. It is back in vogue because of the volume and sensitivity of the information Snowden has made public.

Some of the fallout from the Manning case, such as the banning of thumb drives and other external media from sensitive systems, has been walked back in some instances in the name of practicality. One of the problems, as is the case with any security issue, is that you can’t make a network truly safe from an insider.

“It’s akin almost to insider attacks in Afghanistan,” Army Gen. Martin Dempsey, chairman of the US Joint Chiefs of Staff, said during a late June speech. “Well, the answer is that you can’t prevent it. You can mitigate the risk, and what I’d like you to take away from this conversation about the incident with Snowden is you can’t stop someone from breaking the law 100 percent of the time. You just can’t stop that from happening.”

Dempsey did, however, suggest steps to reduce the threat of insiders to Defense Department networks, including cutting the number of people in positions like Snowden’s.

“I think systems administrators is the right place to begin to clean this up because they have such ubiquitous access, and that’s how he ended up doing what he did,” he said. “We really need to take advantage of thin client and cloud technology, to dramatically reduce the number of systems administrators that we have managing programs, which will make it both more effective and safer.”

That approach carries risk because fewer individuals will have access concentrated in their hands, said Jeff Moulton, director of information operations at Georgia Tech Research Institute.

“What they’ve done now is rather than mitigating the threat, they’ve increased the likelihood of a catastrophic impact from a threat,” he said. “It’s not going to help. It introduces other problems, like the broader access of the cloud.”

One idea suggested by several cyber experts, including Moulton, is to adopt nuclear launch security as a guide. When it comes to the use of nuclear weapons, two separate individuals have to provide authentication before a weapon can be used. Not only does this prevent accidents, but it guarantees that a second person will be monitoring the activity of the first.

In the cyber realm, this could be achieved by requiring two people to provide their security credentials before either could access certain kinds of documents or segments of the network control system.
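A minimal sketch of that two-person check, with invented users and credentials, might look like this:

```python
# Hypothetical two-person rule: a sensitive resource is released only when two
# DIFFERENT authorized users both present valid credentials, echoing the
# dual-key controls used for nuclear launch authorization.

AUTHORIZED = {"alice": "tok-a", "bob": "tok-b", "carol": "tok-c"}  # invented credential store

def two_person_access(cred1, cred2):
    """Grant access only if both credentials are valid and belong to distinct users."""
    (user1, token1), (user2, token2) = cred1, cred2
    if user1 == user2:
        return False  # one person cannot vouch for themselves
    return AUTHORIZED.get(user1) == token1 and AUTHORIZED.get(user2) == token2

assert not two_person_access(("alice", "tok-a"), ("alice", "tok-a"))  # lone admin refused
assert two_person_access(("alice", "tok-a"), ("bob", "tok-b"))        # two distinct users admitted
```

In a real deployment the gate would sit in front of the document store or network control plane rather than in application code, but the logic is the same: no single credential is ever sufficient.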

“Is it time consuming? Perhaps,” Moulton said. “But what’s more time consuming, doing this or armchair quarterbacking?”

Still, there will always be a residual threat from insiders, which is why deterrence is key, said Ian Wallace, a visiting fellow with the Brookings Institution and a former official with the British Ministry of Defence.

“The insider threat will always exist, and it will be next to impossible to stop it completely,” Wallace said. “But there are also plenty of ways in which that can be deterred. Not the least of those is the traditional deterrent of getting caught and prosecuted, something which is even more likely with the emergence of companies doing big data analysis of behavior on their own systems.”

Wallace cautioned that all of this attention on the insider threat may be misguided. Statistically, insider attacks are exceedingly rare, even if the data that is lost or the risk to systems from a determined insider is significant.

“All of the evidence that I have heard from the best cybersecurity firms suggests that the main threat is still the remote threat, for three compelling reasons: the risk of being caught is much less, it is much more scalable, and at present it is still, sadly, relatively easy for a sophisticated and determined intruder to get into all but the best protected systems,” Wallace said.

In the hunt for solutions to the insider threat, one of the big questions is how to detect intent from an employee ahead of a problem. In much the same way that concerns have surfaced about what radicalized the Boston bombing suspects and whether it could have been detected earlier, experts are studying how to discover the intentions of insider threats sooner.

That can take the form of such mundane facts as the speed at which an employee types. Changes in the rate of typing can indicate mood, a tip that further inquiry might be needed.
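As a toy illustration of that idea (the baseline numbers and threshold are invented), a monitor could flag sessions whose typing rate drifts well outside a user's own historical range:

```python
from statistics import mean, stdev

def is_anomalous(baseline_wpm, current_wpm, threshold=3.0):
    """Flag a session whose typing rate sits more than `threshold` standard
    deviations away from the user's own historical average."""
    mu, sigma = mean(baseline_wpm), stdev(baseline_wpm)
    return abs(current_wpm - mu) > threshold * sigma

history = [62, 65, 61, 64, 63, 66, 62]  # invented per-user baseline, words per minute
assert not is_anomalous(history, 63)    # typical session: no flag
assert is_anomalous(history, 30)        # sharp slowdown: worth a closer look
```

A flag like this would only prompt further inquiry, not an accusation.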

But to gain that type of data, a certain degree of invasiveness is required, and some superficial profiling of behavior is employed.

That creates all kinds of legal and ethical questions but may be a necessity for large organizations with many people to monitor, Moulton said.

“You can’t monitor everybody all the time,” he said. “Look at what the casinos do. They profile, but that’s a really difficult word. Are we prepared to profile?”

Dempsey emphasized that some actions would be taken to improve the system, but he described a certain degree of risk acceptance.

“You can certainly increase the scrutiny in terms of their background investigations, you can reduce the number of them you get, there are different degrees of oversight in place,” he said. “But at some point, if somebody is going to break the law and commit an act of treason or espionage (I don’t know what he’ll eventually be charged with), they’re going to be able to do that.”



DOD building its own secure 4G wireless network

By Kathleen Hickey

Jul 03, 2013


The Defense Department expects to have its own secure 4G wireless network up and running by the middle of next year, hosting a variety of iPhones, iPads and Android devices.

The network is part of DOD’s four-year, $23 billion investment in cybersecurity, which also calls for hiring an additional 4,000 people for its cyber workforce, establishing common standards and improving coordination in investing and managing cyber resources, Gen. Martin Dempsey, chairman of the U.S. Joint Chiefs of Staff, said in a recent speech given at the Brookings Institution.

Dempsey said he had a secure mobile phone that “would make both Batman and James Bond jealous.”

Dempsey also spoke about creating a federal app store using off-the-shelf technology to “allow any DOD user to write and share phone and tablet apps.” On June 28, the Defense Information Systems Agency announced it awarded Digital Management, Inc. a $16 million contract to build the DOD’s first enterprisewide mobile application store and mobile device management system.

The secure 4G network is part of the DOD’s Joint Information Environment initiative to consolidate its 15,000 networks into a cloud environment.

“The new Joint Information Environment will deepen collaboration across the services and mission areas. It will also be significantly more secure, helping ensure the integrity of our battle systems in the face of disruption,” said Dempsey.

A few news outlets, such as TechInvestorNews, speculated whether the network was a ploy by DOD to exclude itself from the National Security Agency’s surveillance program, since its calls would not go through Verizon or other commercial carriers from which NSA collects metadata.

But the network could also just be a sign of DOD recognizing the growing importance of mobile computing. The military has long had its own non-classified and classified IP networks — NIPRnet and SIPRnet. As it uses more smart phones and tablets, that approach to security is extending to mobile.

Since Dempsey was appointed chairman in 2011, critical infrastructure attacks have increased 17-fold, he said at Brookings, although he did not specify the exact number of attacks, nor how many occurred prior to his taking office.

“Cyber has escalated from an issue of moderate concern to one of the most serious threats to our national security,” he said. And in addition to military systems, securing civilian infrastructure and businesses, such as those in the banking, chemical, electrical, water and transport sectors, is vitally important.

“Although we have made significant progress embracing cyber within the military, our nation’s effort to protect civilian critical infrastructure is lagging,” Dempsey said. “Too few companies have invested adequately in cybersecurity.”

“One of the most important ways we can strengthen cybersecurity across the private sector is by sharing threat information. Right now, threat information primarily runs in one direction — from the government to operators of critical infrastructure. Very little information flows back to the government,” he said. “This must change. We can’t stop an attack we can’t see.”


Commentary: Can Driverless Cars Save the Postal Service?

By Samra Kasim and Matt Caccavale

July 5, 2013

Ding! That sound could soon be the USPS app alerting you to an imminent delivery, after which a driverless Postal Service vehicle arrives at your door and a robotic arm delivers your package.

While this may sound like science fiction, driverless vehicles will be coming to streets near you sooner than you may think. Sixteen states already have introduced driverless vehicle legislation and California, Nevada, Florida, and the District of Columbia have enacted laws allowing driverless vehicles on their roads. Sergey Brin, co-founder of Google and a driverless vehicle advocate, forecasts fully autonomous vehicles will be available for sale in five years.

Driverless vehicles have the potential to transform many enterprises focused on transporting goods. The Postal Service’s fleet of 215,000 vehicles traveled over 1.3 billion miles in 2012, roughly equivalent to circumnavigating the globe 172 times every business day. Driverless vehicles could reduce operating costs through increased safety, fuel efficiency, and new business models. After posting a quarterly loss of $1.9 billion in May, it’s time for USPS to explore reinvention.

Think about what a day in the life of a USPS driverless vehicle might look like:

12:18 a.m. The latest software package with updated mapping information and the day’s optimized delivery route is downloaded directly from fleet headquarters.

12:30 a.m. The vehicle begins delivery on its suburban route — the pre-determined optimal time for mail delivery on that particular day.

5:00 a.m. A local bakery’s two-hour reservation through USPS’s CloudCar program begins and the vehicle delivers bread to grocers around town. Since the bakery owner no longer has to maintain his own fleet of delivery trucks, he can hire two more bakers and double production.  

7:22 a.m. The vehicle stops at a full service gas station, refuels and reports a maintenance diagnostic assessment to fleet headquarters, allowing USPS to forecast maintenance requirements and plan accordingly.

11:13 a.m. After completing initial deliveries, the car is identified as available. Just then, a business executive pulls up the USPS mobile app on her phone, checks in at her current location and orders a rush delivery of a time-sensitive document.

3:15 p.m. While en route, the car’s sensors detect a large pothole, triggering an automatic report to the local transportation department with geotagged images of the hazard.

4:18 p.m. A businessman suddenly remembers that today is his anniversary. He places an order at a local florist, who has an urgent delivery contract with USPS’s new dynamic pricing system. The vehicle stops at the florist and is then routed to the spouse’s residence.

7:14 p.m. After completing its custom delivery orders and returning to the USPS regional warehouse, the vehicle sends its daily diagnostic report to fleet headquarters, and begins the next round of deliveries.

While this is only a thought experiment, the potential for new operating models and cost savings is very real.

Removing the driver from a vehicle enables it to be used around-the-clock. Routes could be designed around optimal traffic patterns and delivery needs. Driverless vehicles also could be used as a shared service with other businesses and government agencies leasing time when the vehicles are available, similar to the Uber Taxi model. With its significant vehicle fleet and 42,000 ZIP code reach, the Postal Service is well positioned to pilot new service models. It could, for instance, coordinate with auto manufacturers and the State of California to test the readiness of its highways for driverless cars.

Driverless vehicles also have the potential to reduce vehicle operating costs. In 2012, Google reported that after driving 300,000 miles, its driverless cars had not been involved in any accidents. Computer control mitigates human error, such as fatigue or distraction, leading to greater safety. Vehicle accidents and wear-and-tear create significant operating costs for large enterprises like USPS. In FY 2011 alone, USPS had over 20,000 motor vehicle accidents. According to OSHA, the average vehicle crash costs an employer $16,500, and that figure skyrockets to $74,000 when an employee has an on-the-job crash resulting in injury. With fewer vehicle-related accidents, USPS could see substantial cost savings.
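Plugging the OSHA figures above into a back-of-the-envelope estimate shows the scale involved; the injury share below is an invented assumption, since the article does not break the 20,000 accidents down:

```python
accidents = 20_000       # USPS motor vehicle accidents, FY 2011 (from the article)
cost_basic = 16_500      # OSHA average cost of a vehicle crash to an employer
cost_injury = 74_000     # OSHA average cost when the crash injures an employee
injury_share = 0.10      # hypothetical fraction of crashes involving injury

total = accidents * ((1 - injury_share) * cost_basic + injury_share * cost_injury)
print(f"estimated annual crash cost: ${total / 1e6:.0f}M")  # about $445M under these assumptions
```

Even a modest percentage reduction in accidents would translate into tens of millions of dollars a year under these assumptions.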

As gas prices continue to climb, fuel is another major cost for large fleet operators. The Postal Service spent nearly $500 million on fuel in 2011 and another $614 million on maintenance. With an average vehicle age of 16 years, fuel and maintenance costs will continue to climb. A Columbia University study found that “cars simply managing their own speed would increase efficiency by an appreciable 43 percent.” Further, the study estimated that once more driverless vehicles are on the road and able to platoon with each other, efficiency gains may jump to 273 percent.

Federal agencies have long promoted innovative technologies, from GPS to the Internet. As the largest purchaser of goods and services and operator of the largest vehicle fleet in the world, the federal government and USPS have the potential to usher in the driverless car revolution.


Sources: DoD Considers 3 Options for JIEDDO

Defense News

Jul. 6, 2013 – 06:00AM


WASHINGTON — Senior US defense officials are preparing to determine the future of a powerful, high-profile Pentagon organization that has spent nearly a decade developing equipment, tactics and training to defeat roadside bombs.

Last month, House lawmakers included a provision in their version of the 2014 defense authorization bill that requires the Defense Department to provide a report on the future of the Joint Improvised Explosive Device Defeat Organization (JIEDDO).

At a time when the Pentagon is facing hundreds of billions of dollars in spending cuts over the next decade, senior military leadership is said to be considering three options for restructuring JIEDDO: eliminate the organization; break up its duties among the military services through a process called disaggregation; or restructure JIEDDO into a smaller office within the Office of the Secretary of Defense (OSD).

In March 2011, then-Defense Secretary Robert Gates called for the elimination of the JIEDDO director billet, a position held by four different three-star generals since 2008. The elimination would be “based upon deployment of forces and IED threat,” Gates wrote in a memo at the time.

But supporters of JIEDDO said the counter-IED mission must be preserved through the Quadrennial Defense Review, which lays out future US military strategy and is due to Congress early next year. These supporters point to recent intelligence assessments that say terrorist networks will continue to use IEDs against the United States and its allies.

“We have to realize that the IED is part of our operational environment now,” said retired Army Command Sgt. Maj. Todd Burnett, a former senior enlisted adviser to JIEDDO.

A May Center for Naval Analyses assessment of the “post-Afghanistan IED threat” found that the IED threat will likely persist in the coming years.

With that in mind, JIEDDO supporters argue that the third option — creating a smaller office within OSD — would be best.

“DoD needs a small, scalable, agile, OSD-level organization with special authorities, ramp-up ability and flexible funding to implement and synchronize … enduring counter-IED capabilities,” a defense official said.

Since its birth in 2006, JIEDDO has spent about $20 billion, according to budget documents. Spending peaked near $4 billion in 2008, around the time of the surge in Iraq. Since then, spending has declined to about $2 billion. A scaled-down counter-IED organization would likely cost about one-fourth of that, a defense official said.

Officials close to JIEDDO said the office has already cut costs, and they point to the cancellation this year of a number of underperforming programs.

These cancellations have allowed the office to reinvest more than $289 million in training and to purchase reconnaissance robots and bomb-detection equipment. The JIEDDO office is expected to cut 22 percent of its staff by September, a reduction expected to save $163 million.

The majority of the money spent by JIEDDO has gone toward what it calls defeating the device, or purchasing systems and equipment to detect or protect soldiers from IEDs. This includes purchases of robots, electronic jammers, vehicles and even aerostats.

The equipment includes both US and foreign-made systems, such as more than 800 British-built Self-Protection Adaptive Roller Kits, giant rollers that can be mounted on vehicles to detect roadside bombs.

The rest of the funding has gone toward intelligence used to go after IED networks and training equipment.


The Options on the Table

In January, the Joint Requirements Oversight Council, a panel that vets military requirements, said the Pentagon must maintain counter-IED capabilities, including the ability to identify threat networks that employ or facilitate IEDs, detect bombs and components, prevent or neutralize bombs, mitigate explosive device effects, distribute bomb-related data across the community of interest and train personnel in counter-IED capabilities.

Since then, three options have emerged as likely courses of action, sources say.

The first — eliminating JIEDDO and its mission — is not likely, a defense official said. The two more likely courses of action are scaling down the existing organization or delegating the training and equipping mission to the services through disaggregation.

If the disaggregation option is chosen, many of JIEDDO’s components could be split among the services, with acquisition authority most likely going to the Army, the official said.

JIEDDO reports to OSD and has special acquisition authority, allowing decisions and purchases to move more quickly.

Through disaggregation, each of the services would likely be responsible for its own training, which supporters of JIEDDO said means different methods and equipment might be used.

Also unclear is how the intelligence apparatus within the organization would be restructured.

The other option is consolidating JIEDDO into a smaller OSD-level organization. An organization under this framework would be best equipped to rapidly procure counter-IED equipment, officials said. The special acquisition authority used by JIEDDO could be applied to this organization, allowing it to field equipment more quickly.

JIEDDO’s goal is to field what it calls capabilities in four to 24 months. After that time frame, the initiatives typically become official programs of record or are terminated.

A review of 132 deployed initiatives showed that 93 — with a total price tag of $5.9 billion — proved “operationally effective.” Another 18, costing $900 million, were “operationally effective with some limitations in capability.” The remaining 21 — totaling $400 million — were “not operationally proven” or lacked evaluation information.

A key aspect of JIEDDO likely to be retained in a consolidated organization is the Counter-IED Operations/Intelligence Center (COIC). The center provides operational intelligence and analysis on threat networks to commanders in the field by fusing more than six dozen data sources.

The COIC also regularly interacts with more than two dozen US government intelligence agencies and international partners, including Canada, the UK, Australia and NATO.


An International Problem

IEDs are seen as a threat globally, not just in places like Iraq and Afghanistan. Since January 2011, more than 17,000 IED “events” have occurred in 123 countries, according to David Small, a JIEDDO spokesman. Outside Afghanistan, there are an average of 700 IED events each month.

Between December 2012 and May, Iraq experienced 3,352 incidents, the most of any country other than Afghanistan. Colombia experienced 1,005 during that period, with Pakistan third at 883. Syria, which has been in the midst of a civil war, has experienced 382 IED incidents.

In May, JIEDDO signed an agreement with Pakistan to minimize the IED threat. The arrangement allows sharing of information, including tactics, findings from IED incidents, lessons learned, information about IED financiers and information about the flow of IED materials.

Joe Gould contributed to this report.



Activity-Based Intelligence Uses Metadata to Map Adversary Networks

Defense News

Jul. 8, 2013 – 02:59PM

By Gabriel Miller     


Few outside the intelligence community had heard of activity-based intelligence until December, when the National Geospatial Intelligence Agency awarded BAE Systems $60 million to develop products based on this newish methodology. But ABI, which focuses not on specific targets but on events, movements and transactions in a given area, is rapidly emerging as a powerful tool for understanding adversary networks and solving quandaries presented by asymmetrical warfare and big data.

Indeed, ABI is the type of intelligence tool that could be applied to the vast wash of metadata and internet transactions gathered by the NSA programs that were disclosed in June by a whistle-blower.

In May, the U.S. Geospatial Intelligence Foundation’s Activity-Based Intelligence Working Group hosted a top-secret forum on ABI that drew representatives from the “big five” U.S. intelligence agencies.

At the SPIE 2013 Defense, Security + Sensing Symposium on May 1, NGA Director Letitia Long said the agency is using ABI to “identify patterns, trends, networks and relationships hidden within large data collections from multiple sources: full-motion video, multispectral imagery, infrared, radar, foundation data, as well as SIGINT, HUMINT and MASINT information.”

The technique appears to have emerged when special operators in Iraq and Afghanistan reached back to NGA analysts for help plugging gaps in tactical intelligence with information from national-level agencies. These analysts began compiling information from other intelligence disciplines — everything from signals intelligence and human intelligence to open sources and political reporting — and geotagging it all. The resulting database could be queried with new information and used to connect locations and establish a network.
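The geotag-and-query workflow described above can be caricatured in a few lines; the sources, coordinates and notes below are entirely invented:

```python
import math

# Toy geotagged report store: each record carries its intelligence discipline
# and a location, regardless of which agency produced it.
reports = [
    {"source": "SIGINT", "lat": 34.52, "lon": 69.18, "note": "phone active"},
    {"source": "HUMINT", "lat": 34.53, "lon": 69.17, "note": "meeting observed"},
    {"source": "OSINT",  "lat": 33.90, "lon": 67.10, "note": "forum post"},
]

def nearby(lat, lon, radius_km=5.0):
    """Return reports within radius_km of a point (flat-earth approximation,
    adequate at city scale)."""
    def dist_km(r):
        dx = (r["lon"] - lon) * 111.32 * math.cos(math.radians(lat))
        dy = (r["lat"] - lat) * 110.57
        return math.hypot(dx, dy)
    return [r for r in reports if dist_km(r) <= radius_km]

# Querying around one event surfaces the co-located SIGINT and HUMINT reports,
# suggesting a link worth investigating between the two entities.
assert [r["source"] for r in nearby(34.52, 69.18)] == ["SIGINT", "HUMINT"]
```

A real system would layer temporal indexing and metadata tagging on top of this spatial lookup, but the core move is the same: co-location across disciplines connects otherwise separate reports into a network.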

This experience led to a series of seminal white papers published in 2010 and 2011 by the Office of the Undersecretary of Defense for Intelligence. The papers call ABI “a discipline of intelligence where the analysis and subsequent collection is focused on the activity and transactions associated with an entity, population, or area of interest.”

This focus on interactions is the fundamental difference between ABI and previous efforts to integrate different types of intelligence, which were often confined to a single agency and aimed at a specific target.

“When we are target-based, we focus on collecting the target and, too often, we are biased toward what we know and not looking for the unknown,” NGA’s Dave Gauthier said last year at GEOINT 2012. Gauthier, who handles strategic capabilities in the agency’s Office of Special Programs, called ABI “a rich new data source for observing the world and the connectedness between objects and entities in the world.”

ABI attempts to meet two challenges with traditional intelligence-gathering. First, there are no clear signatures for and no doctrine governing the activities of nonstate actors and insurgents who have emerged as the most important threats to U.S. national security. Second, the volume of big data has become “staggering,” in Gauthier’s words. Take, for example, the recent bombing in Boston: There was a massive amount of surveillance imagery available, but analysts initially had no idea whom they were looking for, and moreover, the suspects turned out to look little different from thousands of other spectators on hand.


“ABI came out of the realization that the scheduled, targeted, one-thing-at-a-time, stove-piped analysis and collection paradigm was not relevant to non-nation-state and emergent threats,” said Patrick Biltgen, a senior engineer in the intelligence and security sector at BAE Systems. “We are breaking this one-thing-after-another paradigm because information is flowing … all the time and we don’t know what to do with it because if you’ve stopped to try and collect it, you’ve missed everything else that’s coming.”



Though the USD(I) white papers call ABI a new discipline, many prefer to think of it more as a methodology with several components.

The first is the constant collection of data on activities in a given area, then storing it in a database for later metadata searches. NGA Director Letitia Long recently said the agency is working to create a “model that allows us to ‘georeference’ all of the data we collect persistently — over a long period of time,” one that allows “analysts to identify and evaluate data down to the smallest available object or entity.”

The second is the concept of “sequence neutrality,” also called “integration before analysis.”

“We collect stuff without knowing whether it’s going to be relevant or not. We may find the answer before we know the question,” said Gregory Treverton, who directs the Rand Center for Global Risk and Security. “It’s also not so driven by collection; the collection is just going to be there.”

The third is data neutrality — the idea that open-source information may be just as valuable as HUMINT or classified intelligence.

“Humans, unlike other entities, are inherently self-documenting. Simply being born or going to school, being employed, or traveling creates a vast amount of potentially useful data about an individual,” the white papers say. This tendency has exploded on the Internet, “where individuals and groups willingly provide volumes of data about themselves in real time — Twitter and social network forums like Facebook and LinkedIn are only a few examples of the massive amounts of unclassified data that is routinely indexed and discoverable.”

Finally, there is knowledge management, which covers everything from the technical architecture that makes integrated intelligence and information-sharing possible to the metadata tagging that allows analysts to discover data that may be important, but not linked spatially or temporally.
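The first component described above — persistently georeferenced, metadata-searchable storage — can be sketched in miniature. The Python below is a hypothetical illustration only, not any agency's actual system; the record types, field names, and coordinates are all invented:

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical sketch: every record, whatever its source, is tagged with
# location and time so it can be searched later by metadata alone.

@dataclass
class Record:
    source: str       # e.g. "imagery", "signals", "open-source"
    lat: float
    lon: float
    timestamp: float  # seconds since some epoch (invented units)

@dataclass
class GeoStore:
    records: List[Record] = field(default_factory=list)

    def add(self, record: Record) -> None:
        self.records.append(record)

    def query(self, lat_min, lat_max, lon_min, lon_max, t_min, t_max):
        """Return all records inside a bounding box and time window."""
        return [r for r in self.records
                if lat_min <= r.lat <= lat_max
                and lon_min <= r.lon <= lon_max
                and t_min <= r.timestamp <= t_max]

store = GeoStore()
store.add(Record("imagery", 34.52, 69.18, 1000.0))
store.add(Record("open-source", 34.53, 69.19, 1005.0))
store.add(Record("imagery", 40.71, -74.00, 1002.0))

# "What do we hold, from any source, for this area and time window?"
hits = store.query(34.0, 35.0, 69.0, 70.0, 990.0, 1010.0)
print(len(hits))  # 2
```

A real system would index billions of records with spatial data structures rather than a linear scan, but the principle — collect first, georeference everything, search by metadata later — is the same.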



ABI products take the form of customizable Web-based interfaces that allow analysts to locate associations among data sets using metadata.

“You could call them Web services, apps, widgets, but they help analysts sift through large volumes of data,” said BAE Systems’ Biltgen.

These do not compete with giant systems like the armed services’ Distributed Common Ground Systems, end-to-end databases that connect thousands of users with intelligence information. Rather, they are generally designed to plug into DCGS, then help smaller working groups deal with specific problems.

“Really, what we’re doing is working with the metadata — the dots and the indexes and extracted ‘ABI things’ — to get those on the screen, whereas the large systems really manage streams of imagery for exploration,” Biltgen said. “We go, ‘Let’s take clip marks and the tags that come from exploited video streams and look at all of them at the same time without ever having to touch a frame of video.’ “

He said the goal is to “precondition the data and make it easier for the analyst to correlate them, apply their cultural awareness and knowledge to them, and really put the thought muscle on the data after it’s been well conditioned.”

So what does ABI actually produce? One common format is the activity layer plot. An analyst might, for example, place all available intelligence about an improvised explosive device blast atop information about a kidnapping in the same area, then lay in data about the local bus line, the fruit market at the corner, or the local timber-smuggling operation. Once displayed, the information may overlap or intersect in interesting ways.
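The overlay-and-intersect step behind an activity layer plot can be illustrated with a toy Python sketch. This is not any real tool; the events, coordinates, and distance threshold are all invented for illustration:

```python
from math import hypot

# Three independent "layers" over the same area, each a list of
# (name, lat, lon) events. All values are fabricated.
ied_layer = [("IED blast", 34.520, 69.180)]
kidnap_layer = [("kidnapping", 34.522, 69.182)]
infra_layer = [("fruit market", 34.900, 69.500), ("bus stop", 34.521, 69.181)]

def nearby(a, b, threshold=0.01):
    """Crude planar distance check between two (name, lat, lon) events."""
    return hypot(a[1] - b[1], a[2] - b[2]) <= threshold

# Flag every cross-layer pair of events that falls close together in
# space -- the "interesting intersections" an analyst would inspect.
overlaps = []
layers = [ied_layer, kidnap_layer, infra_layer]
for i, layer_a in enumerate(layers):
    for layer_b in layers[i + 1:]:
        for a in layer_a:
            for b in layer_b:
                if nearby(a, b):
                    overlaps.append((a[0], b[0]))

print(overlaps)
```

Here the IED blast, the kidnapping, and the bus stop cluster together while the fruit market does not — the kind of spatial coincidence the plot is meant to surface for a human to interpret.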

To date, ABI has primarily been used in the kinds of operations that have defined Iraq and Afghanistan: manhunting and uncovering insurgent networks. But because ABI is more a methodology than a discipline, and because the products that enable ABI are customizable, the intelligence community sees ABI applied to a broad range of problems.

“The immediate question is, can we expand it beyond counterterrorism and manhunting and the fight against terror?” Treverton said.

He suggested applications such as maritime domain awareness, in which signatures exist for Chinese frigates but not junks.

ABI can theoretically be brought to bear on any problem that might be aided by a “pattern of life” analysis, a prominent phrase in the white papers. In finance, for example, ABI might identify patterns left by a particular kind of criminal.

“You could use this in the insurance industry to try and understand the patterns of life of individuals that steal things from you and make false claims. We do some of that work today,” Biltgen said.

While ABI can help anticipate patterns, advocates don’t claim it can predict future behavior.

“I wouldn’t call it predictive,” Treverton said. “I wouldn’t call anything predictive. That’s asking way too much.”

Still, it may help officials anticipate threats by building a deep understanding of the networks that give rise to specific incidents.



Two things could hinder ABI — one technical, one cultural.

It sounds relatively uncomplicated to develop a visual network, say, by tracing all of the tire tracks captured by wide-area motion video in a given area over a period of time. Origins and destinations become nodes, and hundreds or even thousands of tire tracks describe a network from which analysts can extract meaning. But the devil is in the details. For example, it is difficult to define a “vehicle stop” in an algorithm, much less assign meaning to it. Does a “stop” last five seconds or one minute?

“It sounds easy, until you touch the data. You realize that every proposition in that value chain has hidden complexity,” said Gary Condon, an intelligence expert at MIT’s Lincoln Lab, at GEOINT 2012.
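The "vehicle stop" problem can be made concrete with a minimal, hypothetical sketch. The speed and duration thresholds below are exactly the arbitrary judgment calls the passage describes — change them and the answer changes:

```python
def detect_stops(track, speed_threshold=0.5, min_duration=30.0):
    """track: time-ordered list of (t_seconds, x_meters, y_meters) points.
    A 'stop' is a maximal run of consecutive segments slower than
    speed_threshold (m/s) that lasts at least min_duration seconds.
    Both thresholds are invented -- that is the point."""
    stops, run_start = [], None
    for (t0, x0, y0), (t1, x1, y1) in zip(track, track[1:]):
        dt = t1 - t0
        speed = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5 / dt
        if speed < speed_threshold:
            if run_start is None:
                run_start = t0  # a slow run begins
        else:
            if run_start is not None and t0 - run_start >= min_duration:
                stops.append((run_start, t0))
            run_start = None
    if run_start is not None and track[-1][0] - run_start >= min_duration:
        stops.append((run_start, track[-1][0]))
    return stops

# Fabricated track: the vehicle drives, idles for ~50 s, then drives on.
track = [(0, 0, 0), (10, 100, 0), (20, 101, 0), (40, 101, 0),
         (60, 102, 0), (70, 200, 0)]
print(detect_stops(track))  # [(10, 60)]
```

With `min_duration=30.0` the idle is a stop; raise it to 60 seconds and the same data contains no stop at all — which is why assigning meaning to a "stop" is harder than it sounds.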

The second set of issues is cultural. Even in the post-9/11 era, legal boundaries and security clearances can prevent the kind of data-sharing that makes ABI work. The quantity of publicly available information swells by the day, but the intelligence community still often prizes classified over open-source information. And just as complex: Some of that open-source intelligence raises privacy concerns when U.S. persons are involved.

That’s been at the heart of the outcry over the NSA’s Prism program and phone-record collection.

Still, top-level intelligence officials see ABI as a valuable new tool. Several senior officials from the Office of the Director of National Intelligence remarked on its growing importance at the U.S. Geospatial Intelligence Foundation forum in early May.
“The defense and intelligence worlds have undergone, and are still undergoing, a radical transformation since the events of 9/11. The Department of Defense and the Director of National Intelligence have made information sharing and efficiency priorities,” an ODNI spokesman said. “This will increase collaboration and coordination, which will have a multiplying effect on approaches such as ABI.”



Analysis: Policies and Opportunities That Will Shape Cybersecurity Spending

Special to Homeland Security Today

By: Stephanie Sullivan, immixGroup Inc.

07/08/2013 (11:16am)

Editor’s Note: Homeland Security Today has partnered with immixGroup Inc. to bring you exclusive market insight and analysis.

In this installment, Stephanie Sullivan, Market Intelligence Consultant, offers a look at the major White House and Congressional efforts impacting cybersecurity programs throughout the federal government, as well as some of the main contracting opportunities on the cyber horizon.


As cyber threats continue to dominate the headlines, it is important for the innovators in the government security market to understand how the legislative and executive branches are tackling cybersecurity and the potential ramifications of these efforts for industry.  

FY14 Legislation Impacts on Cyber

Several legislative and executive directives could impact the commercial-off-the-shelf (COTS) vendor community in FY14; they aim to encourage the adoption of cybersecurity best practices on a voluntary basis. The underlying motivation of these directives is to spur industry and government collaboration on information sharing and defending networks.

The framework called for by the Cybersecurity Executive Order proposes to enable intelligence gathering on cyber attacks and cyber threats, as well as to address network security gaps in critical components of U.S. infrastructure, including banking, utility, and transportation networks.

NIST, in collaboration with GSA, DOD, and DHS, released a Request for Information (RFI) last February to gather feedback from industry and relevant stakeholders on the development of the framework, and has been holding a series of workshops to identify priority elements the framework must address.

An initial draft of the framework was publicly released on July 1, with revisions expected following the third Cybersecurity Framework Workshop, held July 10-12 in San Diego; the draft will then be expanded and refined ahead of a fourth workshop anticipated in September. Additional milestones include a preliminary version due in October and a final version expected in February 2014.

Keep an eye on this: participating in stakeholder engagements and familiarizing yourself with the draft guidelines will be critical for all COTS vendors, because you need to understand how your products and solutions can enhance the framework and meet these ‘voluntary’ but critical security needs. After all, the end goal of these working groups is to eventually bake cybersecurity standards into federal acquisitions to ensure cyber protection.

  • Presidential Policy Directive 21 (PPD-21) on Critical Infrastructure Security and Resilience replaces and updates 2003 Homeland Security Presidential Directive 7 (HSPD-7), and was also issued on February 12, 2013, as a complement to the Cybersecurity Executive Order. PPD-21 defines what critical infrastructure is and directs the federal government to strengthen the security and resilience of its own critical infrastructure, as outlined in the directive’s three strategic goals. It also designates sector-specific agencies (SSAs) for critical infrastructure segments, and mandates information sharing and cooperation among the SSAs, state and local organizations, and international partners.

The new policy establishes “national critical infrastructure centers” in the physical and cyber domains to promote information sharing and collaboration, and orders the State Department to work with DHS on international interdependencies, multinational ownership, and growing concerns about the global economy. However, some observers speculate that not enough has changed from the former directive to be truly noteworthy.

  • The Cyber Intelligence Sharing and Protection Act (CISPA) is a bill designed to encourage voluntary information sharing between private companies and the government about incoming cyber threats. In an ideal scenario, a private company like Amazon or Google would identify unusual network activity suggesting a cyber attack and alert the government; conversely, if the government detected a threat to a private business network, it would share its findings.

The bill was originally introduced in Congress last year, but privacy concerns proved a major roadblock and it never reached the Senate floor. The bill could meet the same fate this year, even though the House of Representatives passed it on April 18, 2013. The NSA PRISM disclosures have halted any movement on cybersecurity legislation until at least September, if not later, due to increased scrutiny of private information sharing.

  • The FY14 National Defense Authorization Act (NDAA) also contains cybersecurity provisions. One of note calls for mandatory reporting by defense contractors when there has been a successful cyber penetration. Additionally, the NDAA calls for improved monitoring and alert technologies to detect and identify cybersecurity threats from both external sources and insider threats, and contains a provision aimed at addressing longstanding concerns over elements of the Pentagon’s supply chain. The NDAA hints that statutory requirements to address this problem may be down the road; in the meantime, DOD is encouraged to cooperate with industry.

FY14 Federal IT Sales Opportunities in Cyber

The federal government plans to spend about $13 billion on cybersecurity in FY14. This reflects the fact that cybersecurity continues to be a strategic concern for federal agencies. Just as important, cybersecurity will benefit from bipartisan reluctance to curb spending in this high-profile area. Fiscal constraints do exist, however, and agencies will have to be circumspect in how they earmark money. The following are a small selection of programs with significant cybersecurity requirements and large allocations for new starts. It is important to understand which programs have funding and map your solutions to these programs.

FY14 Opportunities: Civilian

Funded cybersecurity opportunities within the civilian arena can be found in almost every Executive Branch agency. Below are the top three civilian programs by Development, Modernization and Enhancement (DME) funding – money used to buy new products.

  • Department of Homeland Security (DHS) National Protection and Programs Directorate (NPPD) – The Continuous Diagnostics and Mitigation (CDM) program is the agency’s largest cybersecurity investment, dedicated to continuous monitoring, diagnosis, and mitigation activities to strengthen the security posture across the federal .gov domain. This investment will assist DHS in overseeing the procurement, operations and maintenance of sensors and dashboards deployed to federal agencies.
    • FY14 DME IT spend for CDM is $121.4 million
  • Department of Commerce (United States Patent and Trademark Office (USPTO)) – The Network and Security Infrastructure (NSI) investment covers the IT operations and services provided to the USPTO and external customers by the Office of the CIO. Enhancements and upgrades of this IT infrastructure will include firewall enhancements, antivirus software, network security, data protection, and compliance.
    • FY14 DME IT spend for NSI is $89.5 million
  • DHS (NPPD) – The National Cyber Security Division, through its National Cybersecurity Protection System (NCPS), operationally known as ‘Einstein,’ protects federal civilian departments’ and agencies’ IT infrastructure from cyber threats. Potential FY14 requirements for this program could include intrusion prevention, intrusion detection, and advanced cyber analytics.
    • FY14 DME IT spend for NCPS is $72 million

FY14 Opportunities: Defense

Generally speaking, cybersecurity opportunities within the Department of Defense can be found within major network and infrastructure programs. Below are the top three defense programs by Development, Modernization and Enhancement (DME) funding – money used to buy new products.

  • Warfighter Information Network-Tactical (WIN-T): A high-speed, high-capacity tactical communications network serving as the Army’s cornerstone tactical communications system through 2027, developed as a secure network for video, data, and imagery linking mobile warfighters in the field with the Global Information Grid. Potential FY14 procurements include firewall enhancements, intrusion protection and detection, continuous monitoring, and encryption.
    • FY14 DME IT spend for WIN-T is $815.4 million
  • Next Generation Enterprise Network (NGEN): An enterprise network that will replace the largest intranet in the world, the Navy Marine Corps Intranet, providing secure, net-centric data and services to Navy and Marine Corps personnel. NGEN forms the foundation for the Department of the Navy’s future Naval Network Environment. HP was recently awarded the contract, potentially worth up to $3.5 billion. The entire gamut of information assurance requirements is at play here, specifically because of the heavy reliance on cloud technology that NGEN will require.
    • FY14 DME IT spend for NGEN is $195.05 million
  • Consolidated Afloat Networks Enterprise Services (CANES):  Consolidates the Navy’s multiple afloat networks into one network. CANES replaces these existing networks with new infrastructure for applications, systems, and services and will improve interoperability along the way. The RFP is currently out with an award expected this winter.
    • FY14 DME IT spend for CANES is $195.1 million


About immixGroup Inc.

Founded in 1997, immixGroup® is a fast-growing company and a recognized leader in the public sector technology marketplace. immixGroup delivers a unique combination of services for software and hardware manufacturers, their channel partners, and government agencies at the federal, state, and local levels. immixGroup is headquartered in McLean, Virginia, close to Washington, DC and near the epicenter of the government IT community.



Darpa Refocuses Hypersonics Research On Tactical Missions

By Graham Warwick

Source: Aviation Week & Space Technology

July 08, 2013


For the Pentagon’s advanced research agency, blazing a trail in hypersonics has proved problematic. Now a decade-long program to demonstrate technology for prompt global strike is being wound down, with some hard lessons learned but no flight-test successes.

In its place, the U.S. Defense Advanced Research Projects Agency (Darpa) plans to switch its focus to shorter, tactical ranges and launch a hypersonics “initiative” to include flight demonstrations of an air-breathing cruise missile and unpowered boost-glide weapon. If approved, the demos could be conducted jointly with the U.S. Air Force, which is eager to follow the success of its X-51A scramjet demonstrator with a high-speed strike weapon program.

Darpa’s original plan for its Integrated Hypersonics (IH) project was to begin with a third attempt to fly the Lockheed Martin Skunk Works-designed HTV-2 unmanned hypersonic glider, after the first two launches in 2010 and 2011 failed just minutes into their Mach 20 flights across the Pacific. This was to be followed by a more capable Hypersonic X-plane that would have pushed performance even further.

The original plan drew sharp criticism from Boeing executives, who viewed the proposed program as a thinly veiled excuse to fund a third flight of Lockheed’s dart-like HTV-2, which they consider unflyable. In laying out its revised program plan, Darpa makes no mention of any political lobbying against the HTV-2, but acknowledges a third flight would not make best use of its resources for hypersonic research.

Instead, as the Pentagon refocuses on China as a threat, Darpa is looking to work with the Air Force to demonstrate hypersonic weapons able to penetrate integrated air defenses and survive to strike targets swiftly, from a safe distance. Air-breathing and boost-glide weapons present challenges different to each other and to HTV-2, but the agency believes the lessons learned so far will prove valuable.

Key take-aways from HTV-2, says Darpa program manager Peter Erbland, include that the U.S. “has got kind of lean” in hypersonics competency as investment has declined from the heady days of the X-30 National Aero-Space Plane, and that “we have to be careful assuming our existing design paradigms are adequate” when developing a new class of hypersonic vehicles.

The HTV-2 sprung some surprises on its two failed flights, first with aerodynamics then with hot structures. Working out what happened “required us to mine all the competency in hypersonics that we have,” he says, and took a team assembled from government, the services, NASA, the Missile Defense Agency, industry and academia.

Erbland says the decision not to fly a third HTV-2 was influenced by “the substantial knowledge gained from the first two flights in the areas of greatest technical risk: the first flight in aerodynamics and flight performance; the second in the high-temperature load-bearing aeroshell.” Another factor was the technical value of a third flight relative to its cost. A third was the value of investing resources in HTV-2 versus other hypersonic demonstrations. “We’ve learned a lot; what is the value of other flights?” he asks.

While the Air Force Research Laboratory had two successes in four flights of the Mach 5, scramjet-powered Boeing X-51A, Darpa’s two HTV-2 flops followed three failures of the Mach 6, ramjet-powered Boeing HyFly missile demonstrator. But as is often the case in engineering, more is learned from failure than from success, and investigation of the HTV-2 incidents will result in more robust hypersonic design tools that increase the likelihood of future success, Erbland argues.

To ensure all lessons are absorbed, work on the HTV-2 will continue to early next summer “to capture technology lessons from the second flight, and improve design tools and methods for high-temperature composite aeroshells,” he says. Information from the post-flight investigation will be combined with additional ground testing to improve the models used to design load-bearing thermal structures—”how they heat up, the material properties, their uncertainties and variables, and how we use modeling and simulation to predict thermal stresses and responses.”

HTV-2 was intended to glide an extended distance at hypersonic speed—roughly 3,000 nm. in 20 min.—and required a slender vehicle with high lift-to-drag (L/D) ratio and a carbon-carbon structure to fly for a prolonged time at high temperatures. While Flight 1 in April 2010 failed when adverse yaw exceeded the vehicle’s control power, Flight 2 in August 2011 failed when the aeroshell began to degrade, causing aerodynamic upsets that ultimately triggered the flight-termination system.

“From the first flight it was clear our extrapolation of aero design methods was not adequate to predict behavior in flight,” says Erbland. “From the first to the second flights we redid the ground testing, and rebaselined the aero using new tools. On the second flight, the changes were completely effective, even in very adverse flight conditions.” But the modifications set up the HTV-2 for failure on the second flight.

“Changes to the trajectory made it a more severe aero-thermal environment than the first flight,” he says. “We have been able to reconstruct how it failed from the limited instrumentation, and the most probable cause is degradation of the structure. Thermal stresses led to failure.” While the vehicle retained its structural integrity, temperature gradients over small areas led to local material failures that caused the upsets.

“From the second flight, we learned a lesson on how to design refractory composites, to improve our understanding of how to model hot structures under thermal load,” says Erbland. “We learned a critical lesson about variability and uncertainty in material properties. That is why we are taking time to fund the remediation of our models to account for material and aero-thermal variability.”

HTV-2 is all that remains of the once-ambitious Falcon program (for Force Application and Launch from the Continental U.S.), started in 2003 with the goal of demonstrating technology for prompt global strike. Falcon had two elements, a hypersonic cruise vehicle (HCV) and a small launch vehicle (SLV) needed to boost the cruiser into a hypersonic glide. The SLV effort helped fund Space Exploration Technologies’ Falcon 1 booster, but the HCV went through several changes.

The original HTV-1 hypersonic test vehicle was abandoned in 2006 when the sharp-edged carbon-carbon aeroshell proved impossible to manufacture. Darpa and Lockheed proceeded with the easier-to-produce HTV-2, but then departed from the original unpowered HCV concept to propose an HTV-3X testbed, with turbojet/scramjet combined-cycle propulsion. Congress refused to fund the vehicle, dubbed Blackswift, and it was cancelled in 2008, leaving two HTV-2s as the remnants of Falcon.

Now Darpa is seeking to reinvent its hypersonics focus by moving away from the global- to the tactical-range mission. But while an air-breathing weapon can draw directly on the X-51, boost-glide over a 600-nm range is a different vehicle to the HTV-2. “To get the performance we need to look at high L/D with robust controllability. Thermal management is a different problem to HTV-2. We need robust energy management. And affordability.”

Boost-glide challenges include packaging a weapon for air and surface launch. “The mass and volume constraints are different. We had a very high fineness ratio for global strike; we will have to be very innovative to get high L/D without a high fineness ratio,” says Erbland. On the other hand, “trajectory insertion velocities are lower, and the booster problem could be more tractable. The problem with global range is that orbital launch systems with the energy needed are not designed to put a vehicle on an ideal start of glide, so we have to make them fly in ways they don’t want to,” he says.

But Darpa believes its HTV-2 experience will prove useful. “It provided critical technical knowledge to enable us to design a future boost-glide vehicle capable of prompt global strike. We made huge progress in understanding what we need to do in ground-test and flight-test to design the aerodynamics and hot structure,” Erbland says. “These are lessons we would not have learned without flight test, because of the limitations with ground test. We know going forward how to use modeling and simulation and ground test to give us more confidence that we can design a successful system.”


The State Of Broadband

Only by keeping pace with the latest in regulations, competition, and technology will companies rise above low-capacity, high-priced telecom networks.

By Jonathan Feldman, InformationWeek

July 10, 2013



We all remember the bad old days of having to load data into removable media in order to send it off to the data center. After all, it would have taken days to transmit the necessary data over slow telecom links.


Problem is, the bad old days aren’t over. Instead of shipping tapes to data centers, organizations now regularly ship entire hard drives to cloud providers. Despite tremendous advances in line speeds, it still can take a week or more to transmit very large data sets, even if your line speed is 10 Mbps. Of course, companies don’t regularly need to transfer terabytes of data over the internet, but the current level of sneakernet that’s necessary for the transfer of even a few hundred gigabytes seems a bit high for the 21st century.
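The arithmetic behind that claim is easy to check with a quick back-of-the-envelope sketch (idealized: no protocol overhead, and the link runs at its full rated speed):

```python
# How long does a large data set take over a given line speed?
def transfer_days(size_bytes: float, line_rate_bps: float) -> float:
    bits = size_bytes * 8            # bytes -> bits
    seconds = bits / line_rate_bps   # idealized: full line rate, no overhead
    return seconds / 86_400          # seconds per day

one_tb = 1e12     # 1 terabyte (decimal)
ten_mbps = 10e6   # 10 megabits per second

print(round(transfer_days(one_tb, ten_mbps), 1))  # 9.3 (days)
```

A single terabyte at a steady 10 Mbps takes more than nine days; real-world overhead and shared links only stretch that out, which is why drives still travel by courier.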

The state of broadband matters to your organization. There’s been considerable consumer interest over the past several years, culminating in an FCC plan announced earlier this year to expand broadband coverage and speeds and promote competition. IT organizations can benefit by staying in touch with those regulatory issues, as well as taking advantage of new technology trends, such as wireless broadband, and partnering with alternative providers and municipal networks that buck the status quo. There are clearly risks in doing so, but taking no action almost guarantees that enterprise IT, with pockets of presence in rural and other nonurban areas, will continue to be held back by low-capacity, high-expense networks.

There are many reasons why the state of consumer broadband should matter to enterprise customers.


Problem With The Status Quo

In June, National Cable and Telecommunications Association CEO Kyle McSlarrow called America’s broadband deployment over the last 10 years “an unparalleled success story,” alluding to the rise of cable IP networks and faster and more extensive broadband in the consumer market. He’s right by some measures. Among the G7 countries, even though the U.S. is only No. 5 in broadband penetration (see chart on previous page), it’s been making headway. But when you look at average broadband prices worldwide, the U.S. doesn’t compare favorably–service in the United Kingdom, Sweden, France, Japan, Korea, Germany, and many other industrialized countries is cheaper, on average. And when you look at broadband subscribers per 100 inhabitants, the U.S. is ranked No. 22, slightly above the Organisation for Economic Co-operation and Development average but below the Scandinavian countries, Korea, Canada, France, the U.K., and others.

As with many things, where you stand depends upon where you sit. Tony Patti, CIO for S. Walter Packaging, a century-old manufacturing company in Philadelphia, says that even in the SOHO market, significant bandwidth is for sale relatively cheaply (see chart, below). “People always want more for less, but we’re at a remarkable and revolutionary time in the history of the convergence of computing and communications,” Patti says. But the two key questions are these: Are you in the provider’s service area; and if you are, does the actual speed match the advertised speed? In major markets, the answer is: probably. But talk to someone in smaller cities and rural America, and a different story emerges.

Kris Hoce, CEO of Pardee Hospital, a 200-bed facility in Hendersonville, N.C., says the hospital’s telecom lines are “stretched” today, and when the management team looks at tomorrow’s challenges, including telemedicine and telemetry, he gets even more concerned.

Until a second competitor, Morris Broadband, entered the market a year ago, the incumbent provider was Pardee’s only option. “You’ll take whatever capacity they give you, do it on their time schedule, and you’ll pay through the nose for it,” Hoce says. Since Morris Broadband’s entry, Pardee has realized a 10% to 15% reduction in telecom costs, though it can’t always get sufficient bandwidth, he says.


National Broadband Plan

The FCC’s 376-page National Broadband Plan, while a testament to the ability of federal bureaucracy to fill large amounts of paper, stands to benefit enterprise IT over the next few years in several areas, if the agency follows through.

First, the FCC says that it will be publishing market information on broadband pricing and competition. Will this be as useful as PriceWatch and eBay are in determining what you should pay? We’re not sure. But transparency itself should help: A market where all players know what everybody’s charging tends to be one where prices dip as low as possible.

Second, the FCC says it will make additional wireless spectrum available, and it will update its rules for backhaul spectrum. President Obama has thrown his weight behind this movement, directing the National Telecommunications and Information Administration–the folks behind the broadband stimulus–to help the FCC with a plan to make 500 MHz of spectrum available by the fourth quarter of this year.

It’s unclear what the licensing procedures will be, and for which portion of the additional spectrum. Our bet: some mix of unlicensed spectrum (like 2.4 GHz, a nightmare for IT departments that want to avoid interference), some fully licensed (like 800 MHz, whose paperwork can take months or years to get processed), and some “lightly licensed” (like the 3,650-MHz band that was allocated for WiMax in 2005, which requires two or more licensees in the same region to cooperate). When additional spectrum comes online, it should revitalize the market and create product innovations, which should make broadband wireless a bit less difficult for enterprises to deploy.

The FCC also plans to improve rights-of-way procedures. Power and other companies that own poles either have undocumented or onerous agreements for anyone wanting to attach to a pole or bridge. Streamlining and standardizing this process would be welcome news to telecom market entrants and user organizations that want to bypass the telecom providers. The unanswered question is, how will the FCC “encourage” rights-of-way owners to improve these procedures?

The National Broadband Plan also stipulates longer-term (within the next decade) goals, including that 100 million consumers have affordable access to actual download speeds of 100 Mbps and upload speeds of 50 Mbps–more than 10 times faster than what most U.S. consumers can now get. More interesting to enterprise IT, the plan outlines a goal of affordable access to 1-Gbps links for “anchor institutions”–hospitals, community centers, schools, and so on. As these institutions get affordable links, other large institutions, like big companies, will also get affordable high-speed links.

The FCC doesn’t always have the authority to say how these goals will be accomplished. But in the “implementation” chapter of the National Broadband Plan, it suggests who (including the FCC) should pursue them. For example, it recommends that the executive branch create a “broadband strategy council” consisting of advisers from the White House and its Office of Management and Budget, NTIA, FCC, and other agencies. The FCC also has committed to publishing an evaluation of its progress as part of its annual 706 report, named after section 706 of the Telecommunications Act of 1996. You can track 706 reports on the FCC’s website.


Emerging Competition

Simplifying and streamlining the status quo won’t be as quick as we want it to be, but the situation isn’t bleak.

True, many of the wireline highways are owned by the same folks that own the off-ramps and have a big interest in resisting competition (the likes of AT&T, Verizon, and Qwest from the telco sector and Comcast, Time-Warner, and Cablevision from cable TV). But competition is in fact emerging.


Players like Morris Broadband serve relatively small and rural areas, catering to customers the larger players simply won’t touch. CenturyLink, a larger player, serves rural customers in 33 states. PAETEC competes in 84 of the top 100 areas, known as “metropolitan service areas,” which are anything but rural. Then there are municipal broadband projects such as LUS Fiber, a fiber-to-the-home network started by the utility in Lafayette, La., that offers business services (10-Mbps symmetric) starting at $65 a month.

It’s hard to get information out of the incumbents–we tried, but folks like Verizon said that they don’t see how consumer broadband is related to serving enterprise customers. But the conventional wisdom is that they won’t serve an area unless they can get 25 potential customers per mile. Smaller players will look at areas with five or 10 potential customers per mile. Bottom line: Whenever competitors enter a market, prices fall. In a striking irony, the incumbents opposed to broadband regulation have lobbied local and state authorities to prevent broadband buildouts by municipal entities.

In addition to the wireline broadband alternatives, consider that the airwaves are wide open. Wireless ISPs like Clear and mobile phone and 3G data providers like T-Mobile and Verizon Wireless are interesting, but your bandwidth and reliability may vary when attempting to use their business-class SOHO service. That said, back in the day of the bag phone, nobody would rely on a cell phone for anything that was hugely important, but that didn’t keep IT organizations from playing with them in noncritical areas.

We’re also interested by the services offered by the likes of Texas-based ERF Wireless, which is completely focused on serving businesses, mainly banking and oil companies. ERF’s model: Customers invest in their own wireless infrastructure to backhaul to ERF’s network and then pay an ongoing port fee to access a secured backbone. CEO Dean Cubley says ERF’s banking customers pay about half of what they were paying to incumbent providers and have about a three-year payback on their capital investment.

Jacobson of North Carolina not-for-profit NCREN says the group’s successful BTOP round 1 application (awarded $28.2 million) came from efforts by the state’s office of economic recovery. It’s going to trickle up to the hospitals, too. “All the medical schools in the state are on NCREN today,” he says, and “the nonprofit hospitals will be eligible to interconnect to us as well.”


Welcome Back To Sneakerville

Some caution is necessary. There will be no shortage of poorly conceived broadband initiatives. Savvy IT organizations will stay close to operations, leaving the speculation to investors and economic development types.

Moving beyond sneakernet will require more than just fatter pipes. “Civil engineers discovered some time ago that building more lanes on highways does not really relieve traffic problems,” says Mark Butler, director of product marketing with Internet services company Internap. “Relief comes when you use the available capacity in a more efficient manner.”

So as you keep track of the legislation and other craziness coming out of Washington, keep pace with technical realities, lest you invest in higher-speed lines only to find that your use case isn’t quite as you had planned. George Bonser, a network operator with mobile messaging provider Seven, cites cases of companies that install high-speed lines and then discover they can’t get anywhere near their theoretical limit because of the software in use. It’s a complicated matter that deserves your attention in the same way that keeping track of broadband competition, accessibility, and fairness does.





NIST seeks input on cybersecurity framework

Upcoming Cybersecurity Framework workshop this week aims to gather private-sector feedback on practices that can reduce the risk of cyber attacks


Cynthia Brumfield, CSO

July 09, 2013

Starting tomorrow, July 10th, in San Diego, the National Institute of Standards and Technology (NIST) will host the third, and perhaps most important, in a series of workshops aimed at developing a voluntary comprehensive cybersecurity framework that will apply across sixteen critical infrastructure sectors.

Mandated by an Executive Order (EO) issued by President Obama on February 12, 2013, the NIST-developed framework represents the first time the federal government has sought to prescribe a wide-ranging approach to protecting critical cyber assets, a tough task that Department of Homeland Security (DHS) Secretary Janet Napolitano has characterized as an “experiment.” The framework must be completed in preliminary form by October and finalized by February 2014.

During the San Diego workshop, NIST will for the first time delve into details of the emerging framework, which is based on two earlier workshops as well as formal comments NIST received in response to a public notice. To speed things along ahead of the workshop, NIST has issued three reference materials — a draft outline of what the framework might look like, a draft framework “core” that focuses on key organizational functions and a draft compendium that features existing references, guidelines, standards and practices.

Based on the recommendations of industry commenters, NIST has placed a large emphasis in the draft framework on reaching the very senior levels of management, including CEOs and boards of directors. Top “officials are best positioned to define and express accountability and responsibility, and to combine threat and vulnerability information with the potential impact to business needs and operational capabilities,” NIST states in the draft outline.

This focus on top executives has, not surprisingly, been praised by industry participants.

“Cybersecurity is not just a technological problem,” said Jack Whitsitt, principal analyst at the energy industry cybersecurity consortium EnergySec. “This is a business management, business maturity problem. People build what you tell them to build; people build what you fund them to build. Unless we do a better job at the business side of cybersecurity, the problems won’t go away.”

Many cybersecurity experts say that reaching that top level of management is one of the biggest challenges to ensuring adequate cybersecurity protection of critical assets. CEOs, they say, typically engage in “cybersecurity theater,” implementing hollow programs that only pay lip service to the issues.

“The reality is that most of the CEO’s are relying on their trade organizations to ‘fix the problem’ for them,” one top cybersecurity consultant said. “And the trade organizations are one of the loudest voices in the echo chamber convincing themselves that this is all just a bunch of low-probability hype and a stepping stone to more regulation.”


Another challenge, at least so far as a federal framework is concerned, is the division of responsibilities among government agencies as spelled out in the EO and accompanying Presidential Policy Directive (PPD). For example, DHS has been assigned a number of tasks under the EO that seem to relate to the framework, such as defining what constitutes critical infrastructure.

Some asset owners have suggested that there are too many moving parts in the overall cybersecurity landscape and have noted rising tensions between NIST, an arm of the Commerce Department, and DHS.


“NIST and DHS aren’t doing a good job in deciding how this is going to work,” one expert noted.


But one senior government official overseeing the process said that many cybersecurity efforts in the EO and PPD just aren’t relevant to how the framework gets developed.


“The framework is supposed to work for the widest range of industries,” the official said, and therefore it doesn’t matter how critical infrastructure gets defined, for example.


“DHS is making the decision that has no bearing on this framework,” he said, adding that it is likely that the list of critical infrastructure assets won’t be made public anyway.


Yet another challenge is the degree to which the framework process is being shaped by technology vendors and consultants, who far outnumber asset owners in the workshop meetings held to date. Although NIST wants to bake cybersecurity into vendor-supplied technology, thereby ensuring that even small organizations that lack the resources to pay cybersecurity specialists are guaranteed basic protection, some asset owners balk at being force-fed technology that may better fit vendor agendas than their own safety. One telecom cybersecurity specialist said he wished that NIST would separate asset owners from vendors and consultants in the workshop sessions.


Despite these challenges, most participants in the process believe that NIST is on track and that the draft framework materials released for the July workshop meet expectations. However, the real action will take place at the workshop, as NIST goes into greater detail on where it’s headed with the framework.


With only about three months left to meet the October deadline, most of the key players are taking a wait-and-see attitude, hoping to gain a better sense of the situation after the workshop in San Diego. As one telecom industry representative said, “We have to see whether this whole process has the result we’re looking for, which is to improve our cybersecurity posture, and not some feel-good government exercise.”


Cynthia Brumfield, President of DCT Associates, is a veteran communications industry and technology analyst. She is currently leading a variety of research, analysis, consulting and publishing initiatives, with a particular focus on cybersecurity issues in the energy and telecom arenas.



North Dakota company specializes in aerial crop imagery


by Press • 9 July 2013

By: Jonathan Knutson


GRAND FORKS, N.D. — When David Dvorak launched Field of View in 2010, he foresaw a bright future for aerial crop imagery. Today, after working with farmers, agronomists and even a South American plantation manager, he’s more optimistic than ever.

“A few years ago, there was some behind-the-scenes interest in this,” says Dvorak, CEO of Grand Forks, N.D.-based Field of View.

Now, “I’m quietly confident there’s this perfect storm brewing where the precision agriculture market really takes off and the civil UAS (unmanned aircraft system) market takes off. They’re both on a trajectory to make that happen about the same time,” he says.

Field of View’s mission is to “bridge the gap between unmanned aircraft and precision agriculture,” according to the company’s website.

Its flagship product, GeoSnap, is an add-on device for multispectral cameras mounted on either manned or unmanned aircraft. Such cameras capture images in the red, green and near-infrared bands, allowing users to visualize plant stress better than they can with most other camera systems, Dvorak says.

GeoSnap takes images captured by the multispectral camera and maps them with real-world coordinates, a process known as georeferencing. That allows users to know the aerial images’ exact location on the ground.

“It’s a very complex process. We developed a product that hopefully makes the process easier,” Dvorak says.
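To illustrate the idea behind georeferencing, here is a minimal, hypothetical sketch, not GeoSnap’s actual algorithm (which, as Dvorak notes, is far more complex and must account for aircraft attitude, lens distortion and terrain). It maps a pixel in a straight-down aerial image to approximate ground coordinates using simple pinhole-camera geometry; all function and parameter names are illustrative.

```python
import math

def georeference(px, py, img_w, img_h, cam_lat, cam_lon, alt_m,
                 focal_mm, sensor_w_mm, sensor_h_mm):
    """Return approximate (lat, lon) for pixel (px, py), assuming a
    nadir-pointing camera over flat terrain."""
    # Ground footprint of the whole image (pinhole-camera geometry)
    ground_w = alt_m * sensor_w_mm / focal_mm
    ground_h = alt_m * sensor_h_mm / focal_mm
    # Pixel's offset from the image center, in meters on the ground
    dx = (px / img_w - 0.5) * ground_w   # east-west offset
    dy = (0.5 - py / img_h) * ground_h   # north-south offset (py grows downward)
    # Convert meter offsets to degrees (small-distance approximation)
    dlat = dy / 111_320.0
    dlon = dx / (111_320.0 * math.cos(math.radians(cam_lat)))
    return cam_lat + dlat, cam_lon + dlon
```

The center pixel maps back to the camera’s own GPS position, and pixels away from the center shift north/south and east/west proportionally; a production system would refine this with the aircraft’s roll, pitch and yaw.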

GeoSnap costs about $5,000 per unit, with the multispectral cameras costing about $4,000 each.

Field of View only recently began selling the add-on devices. So far, the company has sold a half-dozen, including one to NASA.

Dvorak thinks NASA will use the GeoSnap to learn more about vegetative cover on Earth, though he isn’t sure of specifics.

GeoSnap generally has drawn more interest overseas because other countries have fewer restrictions on air space, he says.



Hagel warns senators of 2014 budget dangers

By Amber Corrin

Jul 10, 2013

In a July 10 letter to lawmakers on the Senate Armed Services Committee, Defense Secretary Chuck Hagel warned of potentially dire threats to national security if Congress fails to reverse steep budget cuts for the 2014 fiscal year.

Hagel advised lawmakers that a potential $52 billion budget cut for fiscal 2014, which would be mandated under sequester spending caps imposed by the 2011 Budget Control Act, would continue to erode military readiness and weaken national defenses.

“I strongly oppose cuts of that magnitude because, if they remain in place for FY 2014 and beyond, the size, readiness and technological superiority of our military will be reduced, placing at much greater risk the country’s ability to meet our current national security commitments,” Hagel wrote to Sens. Carl Levin and James Inhofe, the committee’s chairman and ranking member, respectively. “This outcome is unacceptable as it would limit the country’s options in the event of a major new national security contingency.”

The secretary warned that “draconian actions” would be necessary to meet the budget-cut requirements. His comments stem from findings in the Strategic Choices and Management Review he directed earlier this year.

Such moves could include ongoing hiring freezes and layoffs as Defense Department officials seek to avert a second year of furloughs. Cutbacks in training and readiness could continue, and investments in areas such as research and development would also decline. DOD’s sustained efforts in acquisition reform additionally would take a hit, he said.

“The department hopes to avoid a second year of furloughs of civilian personnel, but DOD will have to consider involuntary reductions in force to reduce civilian personnel costs,” Hagel wrote. “The resulting slowdown in modernization would reduce our long-term, critically important and historic technological superiority and undermine our better buying power initiatives.”

Hagel called on members of Congress to cooperate with the Pentagon, the White House and each other to help mitigate what he deemed to be serious adverse consequences. He urged congressional support for controversial measures proposed by President Barack Obama in his 2014 budget, including slowed growth in military pay raises, increased TRICARE fees and the retirement or cancelation of lower-priority weapons programs.

Hagel also asked Congress to eliminate restrictions on military drawdown timelines and firing practices to reduce poor-performing civilian personnel, and reiterated his push for another round of the Base Realignment and Closure Act.

Training and modernization remain poised to take the biggest hits in the 10 percent across-the-board cuts. Cutbacks in training programs already in place under this year’s sequestration would have to continue or be accelerated, putting troops and citizens at greater risk, Hagel wrote. New programs would be hard-hit as well.

“DOD would be forced to sharply reduce funding for procurement, [research, development, testing and evaluation] and military construction. Indeed, cuts of 15 percent to 20 percent might well be necessary,” Hagel said. “Marked cuts in investment funding, especially if they continue for several years, would slow future technology improvements and may erode the technological superiority enjoyed by U.S. forces.”

He also warned that cuts would spill over into private industry as purchases and acquisition plans would be interrupted and costs increased.

“Defense industry jobs would be lost and, as prime contractors pull back and work to protect their internal work forces, small businesses may experience disproportionately large job losses,” Hagel wrote.


Sequestration Would Force Civilian Personnel Cuts in 2014, Hagel Says

By Eric Katz

July 11, 2013

The Defense Department is considering civilian reductions in force in fiscal 2014 to match reduced budget levels required by sequestration.

In a letter to the Senate Armed Services Committee, Defense Secretary Chuck Hagel said that while he is “fully committed” to enacting President Obama’s budget, he is also preparing a “contingency plan” in case sequestration remains in effect.

“DoD is hoping to avoid furloughs of civilian personnel in fiscal year 2014,” Hagel wrote, “but the department might have to consider mandatory reductions in force.”

Hagel added the RIFs do not offer much in the way of immediate savings, but would help the department reach future budget caps. The Pentagon would have to slash $52 billion from its budget next year if Congress fails to strike a deal to end sequestration.

“While painful,” Hagel wrote, “RIFs would permit DoD to make targeted cuts in civilian personnel levels rather than the more across-the-board cuts associated with furloughs.”

Military personnel would fare better, as their funding cuts would be “disproportionately small” due to separation costs. If Congress moves forward with its plan to raise military pay 1.8 percent — rather than the 1 percent Obama called for — implementing sequester cuts would be even more difficult, Hagel said.

The Defense Department could severely trim military personnel, but it would require halting accessions, ending permanent-change-of-station moves, stopping discretionary bonuses and freezing promotions. As the Pentagon has repeatedly emphasized, continued cuts would also negatively affect maintenance, modernization and readiness.

“In sum,” Hagel said, “the abrupt, deep cuts caused by the [2011 Budget Control Act] caps in FY 2014 will force DoD to make non-strategic changes. If the cuts continue, the department will have to make sharp cuts with far reaching consequences, including limiting combat power, reducing readiness and undermining the national security interests of the United States.” 


What I learned from researching almost every single smart watch that has been rumored or announced


By Christopher Mims

July 11, 2013

Smart watches! I sure hope you like them, because literally everyone is developing one. And yet, given the vanishingly small proportion of watches that are “smart,” clearly, something is holding them back. Here are the trends that jumped out when I undertook a more or less comprehensive catalog of the forthcoming wrist-top wearables.

Smart watches are going to be big. As in physically large.

I hope you have man hands, because the average smart watch is going to have a 1.5″ display and look like one of those oversize G-shock watches that are favored by IT support guys and gym coaches. Some smart watches are actually just smartphones with a wrist band, and therefore truly gigantic.

Insufficient battery life is killing the smart watch dream.

This chart is old, but it illustrates a trend that continues to this day. (I asked the man who created it for an update, and he says none exists.) The bottom line: Moore’s law does not apply to batteries. That is, every year, we get more processing power per watt of electricity we put into a microprocessor, but battery technology is not proceeding at the same pace.

That’s a problem for a device that needs to be connected to a smartphone (via bluetooth) and/or a cell phone network. Those radios will kill your battery. (Incidentally, turning them off is the single best way to preserve the battery life of your smartphone.) And the color, back-lit, 1.5″ LCD display mentioned above? It’s not doing your smart watch battery any favors, either.

The result of all this is smart watches with only three to four days of battery life, and even that is likely to shrink significantly as developers find new ways to make smart watches useful (and therefore force them to use their radios and update their displays more often).

Some manufacturers are talking about adding things like inductive (i.e. wireless) charging to their smart watches. That will add bulk, but dropping your watch on a charging pad every night might be way less annoying than remembering to plug it in alongside your smartphone.

Smart watches are going to come with a variety of intriguing display technologies not seen elsewhere.

Nothing  begets creativity like constraints, and given the battery issues outlined above, some makers of smart watches are turning, or have already resorted to, display technologies that require less power than traditional LCD displays.

Qualcomm’s rumored smart watch, for example, supposedly uses Mirasol, a kind of reflective, full-color display that requires no power unless it’s being updated. (Mirasol displays color by refracting light like a butterfly’s wings, rather than emitting actual red, green and blue light, like an LCD.) The Pebble smart watch uses an e-paper display like that found in the Kindle and many other e-readers. And the Agent smart watch, which just raised a million dollars on Kickstarter, uses a black and white “memory LCD” produced by Sharp, which unveiled the technology in 2008 and has been trying to find a suitably mass-market use ever since.

All of the non-LCD displays represent a compromise of some kind when compared with the bright, extra-sharp LCD displays we’ve become accustomed to on our smartphones. This will make smart watches less a “second screen” than a place to push updates like Facebook alerts and text messages. If that sounds less useful than, say, a little smartphone, well, that’s one more reason smart watches have yet to take off.

Smart watches could be the next netbooks—in other words, a huge flop.

Samsung, Apple, Google, Microsoft, LG, Qualcomm, Sony—they’re all pouring money into smart watches, but so far every indication is that the devices they’re working on are at best their take on the existing smart watch concept, which frankly isn’t all that compelling. But every consumer electronics manufacturer is looking for the next iPhone or tablet, anything to stop the red ink in their PC divisions.

Or smart watches could allow for the kind of unobtrusive, always-on computing that is the promise of Google Glass.



The same constraints that are forcing smart watch designers to get creative with their displays are also forcing them to come up with something better for these things to do than save you the three seconds it takes to get your phone out and read a text message. For example, the wrist is a logical place to put the kind of RFID chips that allow “digital wallets”—just touch your watch to the payment pad, and you’re done. Or maybe your watch helps you not to forget your keys, wallet and anything else that’s critical, as you run out the door. Or even, maybe your smart watch makes it less likely you’ll be shot with your own gun. The possibilities are endless, and that’s probably what keeps backers coming back to smart watch projects on Kickstarter. Whether or not the mega-corporations rolling them out will find ways to answer these needs with their mass market products remains to be seen.

Demand for laptops is so weak that analysts have declared all of 2013 a “write-off”


By Christopher Mims @mims

July 10, 2013

Demand for laptop computers is so weak in the first half of 2013 that the analysts at IHS iSuppli have declared it virtually impossible that the overall market for laptop and desktop PCs will grow in 2013 over 2012. It’s the same death-of-the-PC story we’ve heard before, only now the infection has spread to laptops as well. The numbers:

  • 6.9% drop in laptop shipments between the first and second quarters of 2013. That’s nearly twice the 3.7% drop seen in 2002 after the dot-com bust.
  • Compare that to a 41.7% increase in laptop shipments from Q1 to Q2 of 2010. Typically, the second quarter of the year sees a sharp uptick in purchases of notebook computers, a bounce-back after soft demand in the beginning of the year.
  • 2013 will be the second year in a row in which PC shipments shrank overall. Unless a miracle happens in the second half of 2013, the PC industry is going to have to face the fact that its decade of expansion, from 2001 to 2011, is over.

The culprit in all of this? “Media tablets,” says iSuppli. And those are only becoming more versatile at the high end, more affordable at the low end, and more popular overall. Given those trends, could 2014 be the third year in a row that PC sales decline? It would be unprecedented, but manufacturers can’t rule it out.



Report: Use of coal to generate power rises

Miami Herald

Tribune Washington Bureau

Posted on Wed, Jul. 10, 2013

Power plants in the United States are burning coal more often to generate electricity, reversing the growing use of natural gas and threatening to increase domestic emissions of greenhouse gases after a period of decline, according to a federal report.

Coal’s share of total domestic power generation in the first four months of 2013 averaged 39.5 percent, compared with 35.4 percent during the same period last year, according to the Energy Information Administration, the analytical branch of the Energy Department.

By contrast, natural gas generation averaged about 25.8 percent this year, compared with 29.5 percent a year earlier, the agency said in its most recent “Short-Term Energy Outlook.”

With coal prices dropping and gas prices rising, the agency said it expected the use of coal to remain on the upswing, accounting for 40.1 percent of electricity generation through 2014. Natural gas would fuel about 27.3 percent.

Power plants are the single largest source of greenhouse gases that drive climate change. The growing use of coal is occurring against the backdrop of President Barack Obama’s announcement of a sweeping plan to reduce greenhouse gases, including curtailing emissions from power plants. His initiative has already sparked opposition from the coal industry, congressional Republicans and coal-state politicians.

Opponents say new regulations are unnecessary in part because utilities have relied more on natural gas, which emits less heat-trapping carbon dioxide than coal does. But the new data indicate that power plants will readily return to coal if the price of natural gas gets too high.

“Markets on their own may go in your direction for a period of time, but to ensure that we get reductions in greenhouse gas emissions in a significant, sustained way, you’re going to need government intervention,” said James Bradbury of the World Resources Institute, a Washington think tank.

The energy administration estimated that carbon dioxide emissions from fossil fuels would rise by 2.4 percent in 2013 and 0.6 percent in 2014, after falling about 3.9 percent in 2012.

“The increase in emissions over the forecast period primarily reflects the projected increase in coal use for electricity generation, especially in 2013 as it rebounds from the 2012 decline,” the report said.

In a speech last month, Obama directed the Environmental Protection Agency to propose rules by June 2014 to cut greenhouse gas emissions from power plants. A rule for new power plants is expected by September.

Coal-fired generation accounted for about 50 percent of the electricity produced in the U.S. about a decade ago. But a natural gas boom driven by hydraulic fracturing has pushed down prices, making natural gas more competitive with coal. By April of last year, coal and natural gas each produced about one-third of the country’s power.

Lower demand for coal drove down its average price, said Elias Johnson, a coal industry expert for the agency. At the same time, the price of natural gas ticked upward, buoyed by demand and somewhat reduced production.

Utilities, many of which have natural gas and coal plants, will probably toggle between the two fuels in the near term, burning the cheaper one more often.

“What is the least expensive form of generation gets dispatched first: renewables, hydro, then maybe nuclear and then coal or natural gas,” said Karen Obenshain of the Edison Electric Institute, a utility trade group in Washington.

Coal is not expected to grab a 50 percent share of power generation again because new regulations curtailing mercury emissions from power plants will probably shutter many small, older coal plants, said Mark McCullough of American Electric Power, one of the country’s largest coal-fired utilities. Even with such closures, the U.S. will probably fail to sharply reduce greenhouse gas emissions by 2020, a goal set by Obama in 2009, without a comprehensive effort to address carbon dioxide pollution.

Said Bradbury, “Electricity markets are very dynamic, and while there’s been a lot of press about the success story of the benefits of natural gas, it’s important to realize that that’s temporary and it depends on gas prices staying really low, and we’re starting to see there are these thresholds where utilities will switch back to higher-carbon fuel, like coal.”


Does Wearable Tech Have A Place In The Enterprise?

Posted by Dan Swinhoe

on July 04 2013

This week saw the first Pebble smartwatches selling online. Sony, Acer, Google, Apple, Foxconn and Samsung are all working on their own versions. The era of wearable tech is within sight.

According to Juniper Research, almost 70 million smart wearable devices will be sold in 2017, and the market will be worth more than $1.5 billion by 2014. ST Liew, president of Acer’s smartphone group, told Pocket-Lint, “We are looking at wearable, I think every consumer company should be looking at wearable.” While that might be true, should enterprises be doing the same?

Right now wearable tech is mostly for sporty types: heart rate monitors, fancy pedometers, HUDs for skiers and the like. But soon the market will be flooded with a tidal wave of smartwatches and Google Glass. And while this will no doubt affect how companies collect user data, develop apps and interact with consumers, will we be seeing workers around the office or datacenter wearing them?

Rose-Tinted Google glass?

Smartwatches probably won’t be essential to any enterprise mobility program, merely a notification tool with additional security pains to account for. But despite being banned in many places before it’s even released, Google Glass is getting plenty of people excited.

So far most of it has been on the consumer side of things. Some doubt whether it could ever be used for the enterprise, while others think it’s the best thing since sliced bread (or the Cloud at least). Chris Hazelton of 451 Research told Computerworld it would be the next step in Mobility & BYOD trends, which would eventually help drive its acceptance.

Fiberlink has jumped on board early, offering its MaaS360 platform to IT admins through the device, and says that since most EMM and MDM platforms already support Android, much of the hard work is already done. Meanwhile Dito, a company that provides services for Google applications, has promised enterprise apps for Glass (a.k.a. Glassware) by late 2013 or early 2014. The company’s co-founder, Dan McNelis, explained at the E2 conference that one of its clients was looking at building information modeling (BIM) applications, which could help construction workers on site check schematics and verify that everything is in the right place and at the right angle.

Along with construction, Glass has been cited as a hands-free tool for utility workers while dealing with high voltage, or as a potential HUD for pilots, and possibly even real-time polling.

Though facial recognition might be banned, the core concept of the early Glassware app MedRef – bringing up a person’s medical records instantly – highlights the potential wearable gear has within the healthcare industry. Whether it’s tracking nurses with RTLS (Real-Time Location Systems), enabling better diagnosis and delivery methods, or even streaming live from the operating table, hospitals could be wearable tech’s first home outside the sports ground.

It’s not just glasses and watches that are entering the enterprise. A smart bracelet has been developed for workers at risk of being kidnapped, sending pre-set warnings to social media and other workers in the area, while Motorola has developed heavy-duty, customizable engineering gear tailored to industrial needs. Chinese company Geak has developed a new smart ring, which has great potential as a security and authentication tool. I can see far more of a market for specially tailored wearable tech arising once the bluster over Glass and smartwatches has died down.
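To illustrate why a wearable like a smart ring could make a useful authentication token, here is a minimal sketch of a time-based one-time password (TOTP, as standardized in RFC 6238) using only the Python standard library. This is purely illustrative and is not based on Geak’s actual product:

```python
import hmac, hashlib, struct, time

def totp(secret, timestep=30, digits=6, now=None):
    """Minimal RFC 6238 time-based one-time password (HOTP over a time counter)."""
    counter = int((time.time() if now is None else now) // timestep)
    msg = struct.pack(">Q", counter)                      # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest() # HMAC-SHA1 of the counter
    offset = digest[-1] & 0x0F                            # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# A ring and a server sharing `secret` would compute the same short-lived code,
# so tapping the ring could stand in for typing a password.
print(totp(b"shared-secret"))
```

The appeal for a wearable is that the shared secret never leaves the device; only the six-digit, 30-second code is transmitted.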

So does wearable tech have a place in business, or is it just another consumer procrastination device? I think some of it does, especially if it’s been custom-made for the purpose. But I doubt we’ll be seeing an office full of smart this and wearable that. The future success of the likes of Google Glass or any number of future smartwatches will depend entirely on the quality of the hardware & apps provided, and the imagination of those using them.

 I also agree with Hazelton’s view that BYOWD (Bring-Your-Own-Wearable-Device) will be an important factor.


Quinoa should be taking over the world. This is why it isn’t.

Washington Post

By Lydia DePillis, Updated: July 11, 2013


In the Andean highlands of Bolivia and Peru, the broom-like, purple-flowered goosefoot plant is spreading over the barren hillsides–further and further every spring. When it’s dried, threshed, and processed through special machines, the plant yields a golden stream of seeds called quinoa, a protein-rich foodstuff that’s been a staple of poor communities here for millennia. Now, quinoa exports have brought cash raining down on the dry land, which farmers have converted into new clothes, richer diets, and shiny vehicles.

But at the moment, the Andeans aren’t supplying enough of the ancient grain. A few thousand miles north, at a downtown Washington D.C. outlet of the fast-casual Freshii chain one recent evening, a sign delivered unpleasant news: “As a result of issues beyond Freshii’s control, Quinoa is not available.” Strong worldwide demand, the sign explained, had led to a shortage. A Freshii spokeswoman said that prices had suddenly spiked, and the company gave franchises the choice to either eat the cost or pull the ingredient while they renegotiated their contract.

Quinoa is a low-calorie, gluten-free, high-protein grain that tastes great. Its popularity has exploded in the last several years, particularly among affluent, health-conscious Americans. But the kinks that kept the grain out of Freshii that day are emblematic of the hurdles it will face to becoming a truly widespread global commodity and a major part of Americans’ diet. It shows the crucial role of global agribusiness, big-ticket infrastructure investment, and trade in bringing us the things we eat, whether we like it or not.

In short, it’s hard to keep something on the menu if you might not be able to afford it the next day. And the American agricultural economy makes it hard for a new product to reach the kind of steady prices and day-in-day-out supply that it takes to make it big.


A grain whose time has come

Quinoa went extinct in the United States long before upscale lunch places started putting it in side salads. Agronomists have found evidence of its cultivation in the Mississippi Valley dating back to the first millennium AD, but it faded away after farmers opted for higher-yielding corn, squash, and bean crops.


Enthusiasts started growing quinoa again in the 1980s, mostly in the mountains of Colorado. It’s not easy, though–sometimes it takes several seasons to get any harvest, since seeds can crack, get overtaken by weeds, or die off because of excessive heat or cold. In 2012, the U.S. accounted for a negligible amount of the 200 million pounds produced worldwide, with more than 90 percent coming from Bolivia and Peru.

Demand started to ramp up in 2007, when Customs data show that the U.S. imported 7.3 million pounds of quinoa. Costco, Trader Joe’s, and Whole Foods began carrying the seed soon after, and the U.S. bought 57.6 million pounds in 2012, with 2013 imports projected at 68 million pounds. And yet, prices are skyrocketing; they tripled between 2006 and 2011, and now hover between $4.50 and $8 per pound on the shelf.
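For a rough sense of how fast that growth is, the figures quoted above (7.3 million pounds imported in 2007 rising to 57.6 million in 2012, and prices roughly tripling between 2006 and 2011) imply the following compound annual rates. The numbers are the article’s; the arithmetic is just illustrative:

```python
def cagr(start, end, years):
    """Compound annual growth rate between two values over a span of years."""
    return (end / start) ** (1 / years) - 1

# U.S. quinoa imports: 7.3M lbs (2007) -> 57.6M lbs (2012)
import_growth = cagr(7.3, 57.6, 2012 - 2007)

# Retail prices roughly tripled between 2006 and 2011
price_growth = cagr(1.0, 3.0, 2011 - 2006)

print(f"imports grew ~{import_growth:.0%} per year")  # ~51% per year
print(f"prices grew ~{price_growth:.0%} per year")    # ~25% per year
```

In other words, import volume has been growing about twice as fast as price, year over year.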

What’s driving the increase? Part of it is that Peru itself, already the world’s biggest consumer of quinoa, patriotically started including the stuff in school lunch subsidies and maternal welfare programs. Then there’s the United Nations, which declared 2013 the International Year of Quinoa, partly in order to raise awareness of the crop beyond its traditional roots.

But it’s also about the demographics of the end-user in developed countries–the kind of people who don’t think twice about paying five bucks for a little box of something with such good-for-you buzz. A few blocks away from Freshii in Washington D.C. is the Protein Bar, a four-year-old Chicago-based chain that uses between 75 and 100 pounds of quinoa per week in its stores for salads and bowls that run from $6 to $10 each (Their slogan: “We do healthy…healthier”).

So far, the company has decided to absorb the higher prices, which still aren’t as much of a cost factor as beef and chicken. It will even pay a little extra to ship the good stuff from South America, rather than the grainier variety that Canada has developed.

“As much as I don’t like it–you never want to pay more for your raw materials–it’s central to our menu,” says CEO Matt Matros. “I’m pretty positive that as the world catches on to what a great product it is, the supply will go up and the price will come back down. It’ll come down to the best product for us. If we find that the American quinoa is as fluffy, then we’ll definitely make the switch.”

Cracking the quinoa code

The Andean smallholders are trying to keep up with the demand. They’ve put more and more land into quinoa in recent years; Bolivia had 400 square miles under cultivation last year, up from 240 in 2009. The arid, cool land that quinoa needs was plentiful, since little else could grow there. And thus far, that trait has made it difficult to grow elsewhere.

But that doesn’t mean the rest of the world isn’t trying. A Peruvian university has developed a variety that will grow in coastal climates. There are also promising breeding programs in Argentina, Ecuador, Denmark, Chile, and Pakistan. Washington State University has been developing varieties for cultivation in the Pacific Northwest, and in August will hold a quinoa symposium bringing together researchers from all over to talk about their work.


“To me, the imagination is the limit, and a whole lot of effort,” says Rick Jellen, chair of the plant and wildlife sciences department at Brigham Young University. “Quinoa is a plant that produces a tremendous amount of seed. So you have potential, with intensive selection, to identify variants that have unusual characteristics.”

The South American quinoa industry, and the importers who care about it, are worried about the coming worldwide explosion of their native crop. A bubble of media coverage earlier this year focused on how strong demand is making it difficult for Bolivians to afford to eat what they grow, but that same demand has boosted incomes from about $35 per family per month to about $220, raising standards of living dramatically. Now, the worry is maintaining a steady income level when production takes off around the world.

Sergio Nunez de Arco, a native Bolivian who in 2004 helped found an import company called Andean Naturals in California, likes to show the small-scale farmers he buys from pictures of quinoa trucks in Canada to prove that the rest of the world is gaining on them, and that they need to invest in better equipment. Meanwhile, he’s trying to develop awareness about the importance of quinoa to reducing poverty, so that they can charge a fair trade price when the quinoa glut comes.

“The market has this natural tendency to commoditize things. There’s no longer a face, a place, it’s just quinoa,” de Arco says. “We’re at this inflection point where we want people to know where their quinoa is coming from, and the consumer actually is willing to pay them a little more so they do put their kids through school.”

He’s even helping a couple of Bolivian farmers who don’t speak English very well fly to that Washington State University conference, so they’ll at least be represented.

“It kind of hurts that the guys who’ve been doing this for 4,000 years aren’t even present,” de Arco says. “‘You guys are awesome, but your stuff is antiquated, so move over, a new age of quinoa is coming.'”

Why isn’t the U.S. growing more of it?

So far, though, the mystery is why the new age of quinoa is taking so long to arrive.

Americans have been aware of the crop for decades, and used to produce 37 percent of the world supply, according to former Colorado state agronomist Duane Johnson. It never took off, partly because of pressure from advocates of indigenous farmers–in the 1990s, Colorado State University researchers received a patent on a quinoa variety, but dropped it after Bolivian producers protested it would destroy their livelihoods.

You don’t need a patent to grow a crop, of course. But the switching cost is extremely high, says Cynthia Harriman of the Whole Grains Council. “Can you get a loan from your bank, when the loan officer knows nothing about quinoa? Will he or she say, ‘stick to soybeans or corn?'” It even requires different kinds of transportation equipment. “If you grow quinoa up in the high Rockies, where are the rail cars that can haul away your crop? Or the roads suitable for large trucks?”

All that infrastructure costs money, and the only farmers with lots of money are in industrial agribusiness. But U.S. industry has shown little interest in developing the ancient grain. Kellogg uses quinoa in one granola bar, and PepsiCo’s Quaker Oats owns a quinoa brand, but the biggest grain processors–Cargill and Archer Daniels Midland–say they’ve got no plans to start sourcing it. Monsanto, the world’s largest seed producer, has nothing either.

Instead, their research and development dollars are focused entirely on developing newer, more pest-resistant forms of corn, soybeans, wheat, sugar, and other staples. All of those crops have their own corporate lobbying associations, government subsidy programs, and academic departments devoted to maintaining production and consumption. Against that, a few researchers and independent farmers trying to increase quinoa supply don’t have much of a chance.

“This is something where it would truly have to come from the demand side–no one wants to get into this and get stuck with all this excess inventory,” says Marc Bellemare, an agricultural economist at Duke University. And how do you determine how much demand is enough, or whether a fad has staying power? “We still haven’t fully unbundled what the decision bundle is. It’s like shining a flashlight in a big dark room.”

That’s why it’s hard for any new crop to make the transition from niche to mainstream. Products, maybe: Soy milk is ubiquitous now, after years as a marginal hippie thing, but it comes from a plant that U.S. farmers have grown for decades. An entirely new species is something else altogether. “I wouldn’t even go so far as to say that’s a non-staple that went big-time,” Bellemare says.

For that reason, quinoa prices are likely to remain volatile for a long while yet. Brigham Young’s Rick Jellen says the lack of research funding for quinoa–relative to the other large crop programs–means that even if they come up with a more versatile strain, it won’t have the resilience to survive an infestation.

“Once that production moves down to a more benign environment, you’re going to get three or four years of very good production,” he predicts. “And then you’re going to hit a wall, you’re going to have a pest come in, and it’s going to wreak havoc on the crop. I think we’re going to see big fluctuations in quinoa prices until someone with money has the vision and is willing to take the risk to invest to really start a long-term breeding program for the crop.”

Which means that if you’re looking forward to a quinoa lunch in downtown D.C., be prepared for a disappointment.


Defcon founder’s message to feds fair to some, hypocritical to others

Dis-invitation is interesting because last year Defcon opened with General Keith Alexander, director of the National Security Agency

Jaikumar Vijayan

July 12, 2013 (Computerworld)

Defcon founder Jeff Moss’ request to government agencies asking them not to attend next month’s annual Defcon hacker conference has evoked a mixed response from the security community.

Many see it as little more than a symbolic gesture meant to convey the hacker community’s discomfort over recent revelations of government surveillance activities by fugitive document-leaker Edward Snowden.

Others, though, see it as a somewhat hypocritical move by an organization that has long prided itself on giving all members of the security community a platform to exchange ideas and share information freely.

Two researchers from the network security-consulting firm Secure Ideas on Thursday announced that they would not present at Defcon as scheduled, to protest Moss’ actions.

Moss launched Defcon 21 years ago and has overseen its growth into one of the industry’s largest hacker conferences. On Wednesday, he published a blog post in which he asked government agencies to “call a time-out” from the conference.

“For over two decades Defcon has been an open nexus of hacker culture, a place where seasoned pros, hackers, academics, and feds can meet, share ideas and party on neutral territory. Our community operates in the spirit of openness, verified trust, and mutual respect,” he wrote.

“When it comes to sharing and socializing with feds, recent revelations have made many in the community uncomfortable about this relationship,” he said in asking them not to attend Defcon this year.

The dis-invitation is interesting because it was only last year that Defcon had opened with a keynote from General Keith Alexander, director of the National Security Agency, the entity at the center of the surveillance controversy.

“Jeff Moss’s post was a statement, not an order, but it was an important one,” said Michael Sutton, a vice president of security research with Zscaler.

Moss is well respected within both the black hat and white hat communities and has strong government connections in his role as an advisor to the U.S. Department of Homeland Security (DHS), Sutton noted.

“His statement illustrates the deep disappointment of the Defcon community, who feel that they were blatantly lied to in light of the PRISM scandal,” he said, referring to Alexander’s denials last year when asked at the conference if the NSA was spying on U.S. citizens.

“Jeff is standing up for the community by saying ‘you disrespected us in our own house — we’d prefer you not visit this year’,” Sutton said.

For many at Defcon, Edward Snowden’s recent revelations of widespread NSA surveillance activities are likely to have only reinforced their suspicion of all things government, said Richard Stiennon, principal at IT-Harvest.

With Defcon, there’s always been a bit of the “young generation versus the Man,” Stiennon noted. In recent years, NSA and other three-letter government agencies have been recruiting from Defcon ranks, leading to a gradual thawing in relations between the two communities, he said. Even so, members of the Defcon community have only shown a “wary willingness” to interact with government types at best.


That willingness likely has been tested by the Snowden affair, Stiennon noted. “A group of security professionals who are aligned to doing things and creating things that are protective of security and privacy are going to find themselves at odds with the NSA. So it may be best for both sides to cool off a bit,” Stiennon noted.

Lawrence Pingree, an analyst at Gartner, cautioned against making too much of Moss’ statement. From a publicity standpoint, it makes a certain amount of sense to ask federal agencies not to attend Defcon, considering the sentiments that have been aroused by Snowden’s revelations, he said.

In reality, it is unlikely that Moss will want to, or will even be able to stop government security types from attending the event if they really want to, he said.

In the end Moss is just sending a gentle reminder to the government that they are likely to be less than welcome among those at Defcon considering recent revelations about PRISM, said Robert Hansen, a white hat hacker and director of product management at WhiteHat Security.

“I don’t believe that anyone who works directly with the staff at Defcon really hates feds,” Hansen said. “What they hate is that the free and open Internet has been taken from them in some sense and that theft is embodied in some sense by the people who are tasked with fulfilling often secret laws.”

“The only issue I see with Jeff’s announcement is that a lot of the most important, die-hard, freedom advocates work in or work directly with the military industrial complex, and it’s unfair to paint them as the enemy of hackers,” Hansen noted. “Though Jeff has never said that directly, and I don’t believe he feels that way, I worry that people less familiar with the situation would mis-represent his words.”

Others, though, see Moss’ stance as needlessly politicizing the annual hacker fest.

In a blog post, James Jardine and Kevin Johnson, two researchers from Secure Ideas, announced they would not present at Defcon this year, citing Moss’ statement about not wanting the government at the show as the reason.

“The basis of our decision is that we feel strongly that Defcon has always presented a neutral ground that encouraged open communication among the community, despite the industry background and diversity of motives to attend,” the blog noted. “We believe the exclusion of the ‘fed’ this year does the exact opposite at a critical time.”

Ira Winkler, president of the Information Systems Security Association and a Computerworld columnist, said that Moss was being unfair in asking the federal government not to attend Defcon.

Much of Defcon’s popularity has stemmed from the effort put into making it a completely neutral venue for the information security community. By asking the government to stay away, Defcon has lost some of that neutrality, he said.


The surveillance activities revealed by Snowden, and that Moss alluded to in his statement, have been found completely legitimate and vetted by all three branches of government, Winkler argued. So rather than try to exclude government agencies, it would have been better to use Defcon as an opportunity to get more answers about the surveillance practices, he said.

“It would be better to have a legitimate discussion on the issue,” Winkler said. “Why is it legal? Why is it constitutional? Stopping a group of people from attending goes against the spirit of what Defcon is supposed to be.”

Defcon has always thrived on presenting controversial security topics and has gone out of its way to make it possible for people to do so, Winkler noted.

“Why is the government being singled out when no other group has been singled out and prevented from speaking?” he said.

July 6 2013




Joint Chiefs Chair: Fewer Admins Needed

Military Network Consolidation Should Help Mitigate Insider Threat

By Eric Chabrow, June 29, 2013.



A side benefit of the Department of Defense’s continuing consolidation of some 15,000 U.S. military networks will be the need for fewer systems administrators; that should make IT less vulnerable to insider threats, the chairman of the Joint Chiefs of Staff says.

Army Gen. Martin Dempsey’s remarks come as federal authorities hunt for Edward Snowden, the former systems administrator at the Defense Department’s National Security Agency who leaked classified information about NSA intelligence collection programs.

“I think systems administrators is the right place to begin to clean this up, though, because they have such ubiquitous access, and that’s how he ended up doing what he did,” Dempsey said in a June 27 speech at the Brookings Institution, a Washington think tank. “We’ve got to take a much harder look at this as we become more reliant on cyber-activity.”

Air Force Lt. Col. Damien Pickart, a Pentagon spokesman, says the military has thousands of systems administrators, but he couldn’t provide a precise number.

Dempsey, in his presentation, compared the Snowden leak to the insider attacks on American troops by allied Afghan soldiers. “You can’t prevent; you can mitigate the risk,” he said.

“You can’t stop someone from breaking the law 100 percent of the time. You can certainly increase the scrutiny in terms of their background investigation. You can reduce the number of them. You can put different degrees of oversight in place.”


Building Its Own Secure, 4G Network

Dempsey said the DoD’s previously announced network consolidation effort, known as the Joint Information Environment, would increase security and help ensure the integrity of battle systems in the face of disruption. The new environment, based on a secure cloud, will include a 4G wireless network that will provide network access for certified smartphones and tablets. “In fact, I have a secure mobile phone with me here today,” the general said, holding up a smartphone. “The phone would make both Batman and James Bond jealous.”


Mobile devices connected to military networks must meet stringent DoD guidelines [see DoD's Influence on Smart Phone Security].

Dempsey touched on a wide range of cybersecurity concerns during his hour-long presentation.

Rules of Cyber-Engagement: The chairman said the military has developed a draft of a playbook that describes how the United States should respond to a cyber-attack on the nation’s critical infrastructure by taking specific steps. Those include:

1. Gather information on the malicious code and the systems under attack. “Our first instinct will be to pull up the drawbridge and prevent the attack, that is to say, block or defend.”

2. Launch an active defense if the attack cannot be repulsed. Dempsey characterized that response as being a proportional effort to disable the attacking botnet.

3. If that fails, consult with other “higher-level” authorities in the government to determine what to do next. Any massive retaliation would require decisions by civilian leaders, he said.
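The three-step escalation Dempsey describes can be sketched as a simple decision flow. This is purely an illustration of the reported playbook’s logic; the function and status names are invented, and the real draft playbook is not public:

```python
from enum import Enum

class Response(Enum):
    BLOCKED = "attack blocked or defended"
    DISABLED = "attacking botnet disabled via active defense"
    ESCALATED = "escalated to higher-level civilian authorities"

def respond_to_attack(can_block: bool, active_defense_works: bool) -> Response:
    # Step 1: "pull up the drawbridge" -- try to block or defend first.
    if can_block:
        return Response.BLOCKED
    # Step 2: proportional active defense against the attacking botnet.
    if active_defense_works:
        return Response.DISABLED
    # Step 3: consult higher authorities; any massive retaliation
    # is a decision for civilian leaders.
    return Response.ESCALATED

print(respond_to_attack(can_block=False, active_defense_works=True).value)
```

The point of the ordering is that each step is only reached if the previous, less escalatory one fails.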

Cyber-Attack Response: Dempsey said a cyber-attack on the United States could, conceivably, be met with a conventional military response. “There is an assumption out there … that a cyber-attack that had destructive effects would be met by a cyber-response that had destructive effects. That’s not necessarily the case. I think that what [President Obama] would insist upon, actually, is that he had the options and the freedom of movement to decide what kind of response we would employ.”

Negotiating with the Chinese: Dempsey dismissed the idea that Snowden’s disclosures of the cyber-intelligence collection programs weaken America’s moral standing in cybersecurity negotiations with the Chinese. He said all nations, including the United States and China, conduct espionage in a variety of domains, including cyber, but China has developed a particular niche of stealing intellectual property.

“Their view is there are no rules in cyber, there are no laws that they are breaking, there are no standards in behavior. So, we have asked them to meet with us … in order to establish some rules of the road so that we don’t have these friction points in our relationship.”

Future of Cyber Command: The chairman envisions a day when the U.S. Cyber Command, a sub-unified command under the U.S. Strategic Command, becomes its own command. He said the current structure works, for now, but added that passage of cyberthreat information sharing legislation before Congress could change that. “If we get the kind of information sharing we need, that could be a catalyst for changing the organization, because the span and scope of responsibility will change.”

Hack-Back Opposition: The general said he opposes private companies launching their own counter-attacks against cyber-assailants – so-called hack-back attacks [see Questioning the Legality of Hack-Back]. “We don’t want private cyber-organizations conducting operations that could be perceived as hostile acts. And, if they’re perceived as hostile acts, it could lead us into conflict.”



U.S. Looks to Blunt Corporate Espionage by Chinese Firms


BEIJING—The U.S. could be signaling stepped-up prosecution of Chinese companies accused of stealing trade secrets as it filed criminal charges against one of China’s largest wind-turbine manufacturers and two of its executives, experts said.

“Maybe five years ago, it was sexier to chase drug cases than trade-secret cases,” said Benjamin Bai, a partner at Allen & Overy in Shanghai. However, “the political climate is brewing the perfect storm in the U.S. for prosecutions to increase.”

A recent law strengthening the U.S. Economic Espionage Act will likely encourage more prosecutions, said Mr. Bai, who has represented U.S. clients on intellectual-property issues.

In December U.S. President Barack Obama signed into law an amendment that allows prosecutors to seek charges against those who steal the trade secrets of not only products but also of services. The amendment was passed after a former Goldman Sachs Group Inc. computer programmer had his conviction overturned when a U.S. court ruled that the software he stole was used only internally.

The latest case, filed Thursday, involved products. U.S. prosecutors accused Sinovel Wind Group Ltd. of stealing source code for software used to control wind turbines from American Superconductor Corp., a Massachusetts-based engineering company, and then shipping four turbines equipped with the code to customers in the U.S.

“This case is indicative that American companies and the U.S. government are fed up, and can and should pursue all available legal remedies, including criminal sanctions, to put an end to trade-secret theft,” said James Zimmerman, managing partner of law firm Sheppard Mullin Richter & Hampton LLP in Beijing and a former chairman of the American Chamber of Commerce in China. “Chinese companies need to heed the warning that U.S. industry is determined to protect its core technology.”

A spokeswoman for Sinovel said Friday that the company was still studying the charges and didn’t have a formal response to the allegations. U.S. prosecutors also indicted an employee of AMSC, as the Massachusetts firm is known, and two Sinovel executives. They couldn’t be reached for comment.

A spokeswoman for China’s Ministry of Foreign Affairs said she wasn’t familiar with details of the Sinovel case. “China has been enhancing the laws for intellectual-property protection and will continue to improve laws and regulations for intellectual property and copyright,” she said.

The issue has risen in prominence in recent months as companies reported that they had been hacked by groups that appeared to have connections with the Chinese government. The Obama administration has been pressing the issue of commercial hacking with Chinese officials.


China has repeatedly said it is a victim of cyberattacks, and points to the allegations of former U.S. National Security Agency contractor Edward Snowden as proof that the U.S. hacks into Chinese computers.

In an annual survey, members of the American Chamber of Commerce in China said they were becoming increasingly concerned about intellectual-property theft in China. Thirty-four percent of respondents said intellectual-property infringements caused “material damage” to their China operations last year, up from 22% a year earlier and 9% in 2010, the survey said. One in four said they experienced a breach or theft of data or trade secrets from their China operations, the chamber said.

A U.S. commission on intellectual-property theft, headed by former Director of National Intelligence Dennis Blair and former U.S. Ambassador Jon Huntsman, issued a report in May accusing China of being responsible for as much as 80% of the intellectual-property theft against U.S. companies. The commission made several recommendations, including increasing resources for the Justice Department and the Federal Bureau of Investigation to investigate and prosecute cases of trade-secret theft.

Louis Schwartz, president of China Strategies, a consulting firm focusing on Chinese trade and investment in renewable energy, said U.S. prosecutions against China for intellectual-property theft are still rare and that he was surprised at how aggressive AMSC was in pursuing its claims against Sinovel.

“My advice for clients is to see if you can get enough compensation upfront so if you lose intellectual-property rights in China, at least you have some compensation,” he said. “The lure of the China market is so great that people tend to look the other way.”

Xiang Wang, Asia managing partner for the law firm Orrick, Herrington & Sutcliffe LLP, said the number of criminal cases his firm has handled involving corporate espionage charges leveled at Chinese companies by foreign rivals has increased by 50% over the past two years. “The number of criminal cases will certainly increase,” he said.

Although companies pay fines in civil disputes over the theft of trade secrets, the penalties often aren’t effective deterrents, said Mr. Wang, who is based in Beijing. However, someone who steals trade secrets may think twice about going to prison, he said.

A version of this article appeared July 1, 2013, on page B9 in the U.S. edition of The Wall Street Journal, with the headline: U.S. Acts Tough on Trade Secrets.


How Edward Snowden Could Derail the Global Economy

By DAVID FRANCIS, The Fiscal Times July 2, 2013

NSA leaker Edward Snowden is at it again. This time, he’s leaking to the Germans.

Over the weekend Der Spiegel magazine published a report indicating that the United States has been spying on its European allies, including Germany, France and Italy. Documents provided by Snowden indicate that NSA is collecting data on European communications and planted bugs in EU offices in New York and Washington to detect rifts in the troubled monetary alliance.

Reactions from European leaders were swift and harsh. This is especially true in Germany, where the protection of private correspondence is written into its Constitution.

President Obama shrugged off the report, saying all nations collect intelligence. But this argument isn’t likely to fly with German Chancellor Angela Merkel, a former East German who grew up living under the intrusive eye of the Stasi and Soviet surveillance. She immediately condemned the United States, while a parliament member said Snowden should be rewarded for this information with asylum in Germany.

“The monitoring of friends — this is unacceptable. It can’t be tolerated. We’re no longer in the Cold War,” Merkel said through spokesman Steffen Seibert.

But the more troubling response, both for the United States and the European Union, came from French President François Hollande. He said that talks on a bilateral U.S.-EU trade deal should be put on hold until questions about the spying were answered.

“We can only have negotiations, transactions, in all areas once we have obtained these guarantees for France, but that goes for the whole European Union, and I would say for all partners of the United States,” he said of the talks set to begin next week. For good measure, French minister of foreign trade Nicole Bricq added, “We must absolutely re-establish confidence… it will be difficult to conduct these extremely important negotiations.”

Things could get worse. In a letter asking Ecuador for asylum, Snowden – who is still stuck at the Moscow airport – said he would release more documents that he deemed to be in the public interest. Russian President Vladimir Putin has also opened the possibility of Snowden remaining there.

There’s a lot at stake. The United States wants its economic recovery to accelerate, while Europe is desperate for economic growth (the deal is expected to add $157 billion to the EU economy and $133 billion to the U.S. economy).

Now, low-level European diplomats could leverage NSA’s spying to win concessions as negotiations over the deal get underway. Expect France, which has called for provisions to fund French movies and art in the deal, to be especially aggressive with anti-NSA rhetoric.

One can argue about whether Snowden’s revelations have made America less safe. But it is indisputable that he has eroded trust between partners and their citizens, and that trust is essential to building international trade agreements, which depend on popular support.

Without these pacts, international trade dries up and hundreds of billions of dollars are removed from the global economy. Put simply, these agreements eliminate barriers to doing business. For instance, a 2010 Congressional Research Service report found that exports to countries that are part of the Trans-Pacific Partnership Agreement totaled $747 billion in 2008.

The mistrust also has political implications. Take Germany. Much was known about PRISM when Obama visited Berlin two weeks ago, but according to the German media, Merkel asked few questions about it. Now that the German public is outraged by Snowden’s latest disclosure and has turned on Obama, Merkel will try to harness that outrage to win re-election in the fall, causing a further strain on German-American relations.

“This could slow down [the EU-U.S. deal] considerably,” Joerg Wolf, editor of the Berlin-based open think tank, told The Fiscal Times. “European citizens will mistrust the U.S. even more, which then would make it more difficult for EU governments to cooperate with the US in the future.”


The U.S.-EU trade pact is not the only one at risk. The pending trade deal between the United States and Ecuador has fallen apart because of Snowden. Last week, Ecuador – a country thought to be considering offering Snowden asylum – withdrew from talks, saying it feared “blackmail” if it refused to offer up the fugitive.

Senator Chuck Schumer (D-NY) refused to back down, warning Ecuador, “Your economy will pay a very big price. We should end all foreign aid, repeal trade agreements worth billions of dollars.”

Ecuador risks losing billions in exports to the United States if the deal falls apart. And while the dissolution of the deal hurts the Ecuadorian side more, the end of a two-decade-old trade deal represents a diplomatic setback for both sides.

Wolf said there is potential for the same thing to happen with the U.S.-EU deal.

“Whatever U.S. public diplomacy achieved in Germany in the last four years [since George W. Bush's presidency ended], it’s gone,” Wolf said. “Any positive impact President Obama’s trip to Berlin two weeks ago might have had, it’s gone.”


Snowden’s Real Job: Government Hacker

Mathew J. Schwartz    | July 01, 2013 11:42 AM


How did the apparently low-level IT-administrator-turned-whistleblower Edward Snowden, 30, manage to gain access to details of numerous top secret National Security Agency (NSA) surveillance programs?

Simple: He wasn’t actually an IT or system administrator. Intelligence officials had repeatedly suggested this while also noting that the agency employs numerous contractors to help maintain its systems. Or as Gen. Keith Alexander, the director of the NSA, told ABC News last month about post-Snowden changes at the agency: “We’re now putting in place actions that would give us the ability to track our system administrators, what they’re doing and what they’re taking, a two-man rule, we’ve changed the passwords.”

In fact, Snowden himself stated in a video that his most recent job title wasn’t that of system administrator. “My name is Ed Snowden, I’m 29 years old. I worked for Booz Allen Hamilton as an infrastructure analyst for NSA in Hawaii,” he told Guardian journalist Glenn Greenwald in a video recorded in Hong Kong and broadcast after he’d asked to be identified as the source of the leaks involving Prism and other surveillance programs. Prior to that job, Snowden said, “I’ve been a systems engineer, systems administrator, senior adviser for the Central Intelligence Agency, solutions consultant, and a telecommunications information system officer.”

Many commentators read “infrastructure analyst” as NSA-speak for a system administrator role, and many news reports of Snowden’s leaks actually labeled him as being a sysadmin. But according to intelligence officials, the infrastructure analyst role refers to a position tasked with finding new ways to hack into foreign networks, to keep the NSA abreast of the signals intelligence it’s charged with gathering.

Why hasn’t Snowden’s real role been highlighted to date? Principally because government officials haven’t wanted to highlight the online espionage equivalent of breaking and entering into other countries’ networks and telecommunications equipment, according to a Sunday report in The New York Times that cited no sources.

That revelation finally explains how 30-year-old Snowden came to possess official documents relating to some of the country’s most sensitive surveillance programs, including intercepts of online audio, emails and video (Prism), traffic analysis of cell phone calls (Mainway), Internet metadata collection (Marina), and telephone content interception (Nucleon), not to mention secret court orders authorizing the surveillance programs.

Snowden said he took a job in March as a contractor at Booz Allen Hamilton — reportedly taking a pay cut — to gain access to the documents that he’s since leaked.

“My position with Booz Allen Hamilton granted me access to lists of machines all over the world the NSA hacked,” Snowden told The South China Morning Post prior to leaving Hong Kong for Moscow last week. “That is why I accepted that position about three months ago.”

Snowden now remains in limbo — his U.S. passport has been revoked — in the transit area of Moscow’s Sheremetyevo airport.

Regardless, expect the leaks to keep on coming. Greenwald has said that Snowden leaked thousands of documents, of which at least dozens are newsworthy.

Most recently, the Guardian on Sunday released documents that it said detailed NSA operations against 38 embassies and missions, which were labeled as targets. That target list included not just Middle Eastern countries, but also U.S. allies such as France, Greece, Italy, Mexico, Japan and South Korea. Detailed interception methods included bugs planted in fax machines used to transmit official cables between the European Union embassy in Washington and EU headquarters in Brussels, as well as antenna intercepts and taps in networking cables.


Why the US doesn’t use cyber-weapons to attack its enemies more often

By Kevin J. Delaney    @kevinjdelaney    June 30, 2013


The US government doesn’t like to talk about it, but it has developed an arsenal of cyber-weapons that can be used to attack adversaries. Why doesn’t it deploy computer worms and other technologies for disrupting enemies’ networks and computer-controlled physical infrastructure more often in conflicts around the world?

Mike McConnell, vice chairman at government contractor Booz Allen Hamilton and former head of the National Security Agency and US director of National Intelligence, says the US has the best capabilities in the world for cyber-attacking and “can do some pretty significant damage if we choose to.”

But the government hesitates because it’s wary of making itself an even bigger target for cyber-attacks against the US, according to McConnell. Speaking at the Aspen Ideas Festival (co-organized by Quartz’s sister publication The Atlantic), he elaborated:

“Let’s say you take an action. We depend on this stuff more than anyone else. We’re more vulnerable than anybody else in the world. If we could put a map up here of the world with the US on center and we put bandwidth on top of it, it’s a bell curve. Most of the communications in the world flow through the United States and we are the biggest user and beneficiary. So there’s a great hesitancy to use anything in a cyber context because it’s relatively easy to punch back in a pretty aggressive way.

So every discussion I’ve ever participated in is ‘You’ve got some options. Well, let’s just hold those aside and consider other options.’ But we could do major damage.”

McConnell, who’s credited with building US capabilities for fighting cyber threats, notes that the purpose of the US Cyber Command unit of the Defense Department is both offense and defense. “And the offense will get a lot of attention,” he adds.

The US reportedly has used cyber-weapons against Iran’s nuclear enrichment efforts as part of a program code-named “Olympic Games.” The Stuxnet computer worm developed by the US and Israel reportedly succeeded in disrupting the centrifuges Iran was using to purify uranium in its pursuit of nuclear weapons. (Retired US general James Cartwright is now reportedly the target of a Justice Department investigation into the leaking of information about Stuxnet to the press.)


Cyber-Attackers Constantly Hitting Gas Utilities in 2013


By Robert Lemos | Posted 2013-07-01


Unknown attackers have targeted the Internet-connected systems of natural-gas companies, using brute-force attacks to attempt to access the companies’ business and process-control networks, according to a report published last week by the Industrial Control Systems Cyber Emergency Response Team (ICS-CERT).

The incidents, which occurred in January and February, were first reported to the ICS-CERT, a component of the U.S. Department of Homeland Security, in late February, the group stated in its quarterly public report on cyber threats. Following the initial report and a subsequent warning from the ICS-CERT, more critical infrastructure companies came forward with news of other incidents.

“The companies reporting this activity operate gas compressor stations across the Midwest and Plains states within the US, although some of the attempts reported were solely against business networks,” the report stated. “While none of the brute force attempts were successful, these incidents highlight the need for constant vigilance on the part of industry asset owners and operators.”
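Brute-force attempts of the kind the report describes are typically flagged by counting failed logins per source address within a sliding time window. A minimal sketch of that idea (the thresholds and function names here are illustrative, not from the ICS-CERT report):

```python
from collections import defaultdict, deque

WINDOW_SECONDS = 60   # sliding-window length (illustrative)
MAX_FAILURES = 10     # failed attempts per source before alerting (illustrative)

failures = defaultdict(deque)  # source IP -> timestamps of failed logins

def record_failed_login(source_ip, timestamp):
    """Return True if this source exceeds the failure threshold
    within the window (a likely brute-force attempt)."""
    q = failures[source_ip]
    q.append(timestamp)
    # Drop attempts that have aged out of the sliding window.
    while q and timestamp - q[0] > WINDOW_SECONDS:
        q.popleft()
    return len(q) > MAX_FAILURES
```

Feeding each failed authentication event into `record_failed_login` lets a monitor raise an alert as soon as one source crosses the threshold, while slow, widely spaced attempts age out of the window and never trigger it.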

The last attack occurred on February 23, according to the report. Yet, while the ICS-CERT claimed that no new attacks have been detected, it’s unlikely that the attacks have stopped altogether, Tommy Stiansen, chief technology officer and co-founder of threat-intelligence firm Norse, said in an e-mail interview.

“Today all public facing IP addresses are attacked on a regular basis, but the questions are really by whom and how targeted and sophisticated are the attacks,” he said. “While there may be an element of failure to report, it may be that some of these installations are compromised but admins remain unaware due to the stealthy nature of the compromise.”

Recent research published by security firm Trend Micro found that Internet-connected industrial-control systems are frequently targeted by online attackers. The company’s researchers set up fake industrial control systems, made them appear valuable and logged 39 attacks over 28 days against the spoofed systems, the company stated in its report.

While the U.S. has called out China for attacks against sensitive industries, the attacks detected by Trend Micro came from Internet addresses in 14 different nations. IP addresses in China accounted for about a third of the attacks, while Laos and the United States came in second and third, respectively.

The experiment, which occurred in 2012, underscores that attackers are continuously probing these important systems. While the ICS-CERT reportedly informed industry members of the specific IP addresses that were involved in the attacks, creating block lists based on such quickly changing attributes does not work very well, Norse’s Stiansen said.

“The use of IP block lists described in the report often gives admins a false sense of security,” Stiansen said. “Today cyber criminals can set up and launch attacks using botnets and other compromised hosts, quickly changing the IP address and obfuscating the location of the actual attackers.”
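Stiansen's point is easy to make concrete: a block list matches exact addresses, so an attacker who rotates through botnet hosts simply arrives from an address the list has never seen. A toy sketch (all addresses are illustrative examples from the reserved documentation ranges):

```python
# Static block list built from previously observed attacker addresses
# (example addresses from the reserved documentation ranges).
BLOCK_LIST = {"198.51.100.7", "198.51.100.8"}

def is_blocked(source_ip):
    """Exact-match check against the static block list."""
    return source_ip in BLOCK_LIST

# A previously seen address is caught...
assert is_blocked("198.51.100.7")
# ...but the same attacker relaying through a fresh botnet node is not.
assert not is_blocked("203.0.113.99")
```

This is why such lists age so quickly: every new relay host forces a list update, which is the false sense of security the quote warns about.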


Obama: U.S. will give allies info on spying claims

Seattle Times


AP White House Correspondent

Originally published July 1, 2013 at 11:04 AM | Page modified July 1, 2013 at 3:44 PM

Facing a European uproar over more U.S. eavesdropping claims, President Barack Obama argued Monday that it’s no surprise that governments spy on each other but said the United States will provide allies with information about new reports that the National Security Agency bugged European Union offices in Washington, New York and Brussels.

The latest revelations were attributed in part to information supplied by former NSA contractor Edward Snowden. Obama on Monday also said the U.S. has held “high-level” discussions with Russians to get Snowden out of a Moscow airport and back to the United States to face criminal charges.

Obama, in a news conference with Tanzanian President Jakaya Kikwete, pushed back against objections from key allies over a report in the German news weekly Der Spiegel that the United States installed covert listening devices in EU offices. He suggested such activity by governments is not unusual.

“We should stipulate that every intelligence service – not just ours, but every European intelligence service, every Asian intelligence service, wherever there’s an intelligence service – here’s one thing that they’re going to be doing: they’re going to be trying to understand the world better and what’s going on in world capitals around the world,” he said. “If that weren’t the case, then there’d be no use for an intelligence service.

“And I guarantee you that in European capitals, there are people who are interested in, if not what I had for breakfast, at least what my talking points might be should I end up meeting with their leaders. That’s how intelligence services operate,” Obama added.

European officials from Germany, Italy, France, Luxembourg and the EU government itself say the revelations could damage negotiations on a trans-Atlantic trade treaty between the EU and the United States. Agreeing to start those talks was one of the achievements reached at meetings last month in Northern Ireland between Obama and the European members of the Group of Eight industrialized economies.


Obama said the NSA will evaluate the claims in the German publication and will then inform allies about the allegations.

At the same time, he tried to reassure allies such as German Chancellor Angela Merkel, French President Francois Hollande and British Prime Minister David Cameron that he relies on personal relationships, not spying, to determine what other leaders have on their minds.

“I’m the end user of this kind of intelligence,” he said. “And if I want to know what Chancellor Merkel is thinking, I will call Chancellor Merkel. If I want to know what President Hollande is thinking on a particular issue, I’ll call President Hollande. And if I want to know what, you know, David Cameron’s thinking, I call David Cameron. Ultimately, you know, we work so closely together that there’s almost no information that’s not shared between our various countries.”

Obama’s remarks came shortly after Hollande demanded on Monday that the United States immediately stop any eavesdropping on European Union diplomats.

Obama also said law enforcement officials in the U.S. and Russia were working to find a way to get Snowden back to the United States, where he is charged with violating U.S. espionage laws. The U.S. does not have an extradition treaty with Russia. Moreover, Russia has claimed Snowden is not technically on its soil because, while he is in the transit terminal of the Moscow airport, he has not passed through immigration. The U.S. has revoked his passport.

“We are hopeful that the Russian government makes decisions based on the normal procedures regarding international travel and the normal interactions that law enforcement has,” Obama said.


Identifying the Top Threat Actors

Ex-FBI Investigator on New Targets

by Jeffrey Roman, July 1, 2013. Follow Jeffrey @ISMG_News


Cyber-attacks continue to hamper organizations, says former FBI investigator Shawn Henry. And the actors waging the attacks are targeting organizations for more than just fraud, he says.

The three main groups launching cyber-attacks, Henry says: organized crime, foreign intelligence services and terrorists. And while these groups haven’t changed much over the past 12 months, their techniques have.

“They’ve become more capable,” Henry says during an interview with Information Security Media Group [transcript below]. “They’ve become more sophisticated.”

As organizations work to improve their defenses, threat actors have reacted by becoming more flexible and adaptable, he adds.

“The reality is: The offense outpaces the defense, so they’ve been able to adapt and to overcome, even against what we would consider to be some of the most resilient defenses,” Henry says.


Threat Actors’ Targets

The financial services sector continues to be the most-targeted by organized crime, which aims to pilfer sensitive information that can quickly be monetized, Henry says.

“We’ve also seen some denial-of-service attacks against networks where there might be groups or individuals that are looking to make some type of a social or political statement,” he says (see Are DDoS Attacks Against Banks Over?).

Foreign intelligence services, on the other hand, target multiple sectors, from government to manufacturing and energy to communications, in order to extract data that can be shared with industries in their specific countries, Henry says.

The other main threat: terrorist organizations, which seek to disrupt critical infrastructure and cause harm to, in particular, the United States, he says.

Henry, who left the Federal Bureau of Investigation in March 2012, is now the president of cybersecurity firm CrowdStrike. While at the FBI, he oversaw international computer crime investigations involving DDoS attacks, bank and corporate breaches and state-sponsored intrusions. Over the course of his 24-year career, Henry worked in three FBI field offices, as well as in the bureau’s headquarters. He oversaw the posting of FBI cyber-experts in police agencies around the world, including the Netherlands, Romania, Ukraine and Estonia.


FBI Background

TRACY KITTEN: Tell us about the work that you did with the FBI.

SHAWN HENRY: I worked with the FBI for 24 years and had a number of different positions. In my last position, I was responsible for all cyber-investigations worldwide, as well as criminal investigations and critical incident response. On the cyber-side, I focused primarily on breaches into networks by criminal groups, organized crime groups, terrorist organizations and foreign intelligence services. That included exfiltration of data where the adversary was reaching into a network to pull out data that they see of value, whether it be intellectual property, research and development, corporate strategies, financial data, as well as denial-of-service attacks against networks and other types of breaches where an adversary is looking to wreak some havoc on an organization or on the victim network.

KITTEN: When did you join CrowdStrike?

HENRY: I left the FBI in March of 2012 and joined CrowdStrike the following week, so April 2012. I’ve been there about 13 months now.


Evolving Cybersecurity Landscape

KITTEN: How would you say the cybersecurity landscape has evolved or changed in the last 12 months?

HENRY: It’s an interesting question. I don’t think the landscape has actually changed much. I think the exact same threats that were here when I left the Bureau are still here. What has changed is the awareness of the private sector. There’s a lot more that’s occurred here in the media that has gone out publicly. People have become aware of the threats. Organizations have begun to recognize the impact that they face and the real damage that can be inflicted, and that’s not been out publicly for many years in my service in the government. That really is the most significant change, the awareness. But the activities have not significantly changed.


Top 3 Threats Facing Banks

KITTEN: What would you say are the top-three threats banking institutions face?

HENRY: The financial services sector is probably, in my opinion, among the best protected sectors regarding networks. What we see primarily facing the financial services sector is theft of PII, personally identifiable information; primarily organized crime groups who are targeting networks, trying to steal data which they can very quickly monetize. They do that regularly. They target not only the corporate networks, the financial networks, but individuals as well, trying to capture credentials, usernames and passwords so that they can access accounts. We’ve also seen some denial-of-service attacks against networks where there might be groups or individuals that are looking to make some type of a social or political statement. They recognize that western society and the United States as a whole relies substantially on the financial services sector, so it really is seen as a target of the west, as a symbol of the west and the prosperous United States of America. It’s oftentimes a target of those types of groups.


Threats to Other Sectors

KITTEN: What about other sectors, including government?

HENRY: They’re similar; they’re not the same as what the financial services sector faces. As it relates to government and other sectors, there are oftentimes foreign intelligence services that are looking to pilfer data which they can then share with their industries in their countries, so that they have some type of a competitive advantage. Certainly, the financial services sector is not immune to that. They do get breached by foreign intelligence services that are looking for financial strategies, are interested in mergers and acquisitions, and are interested in partnership deals the financial services sector might be facilitating or enabling. But they’re not the primary threat. It’s those organized crime groups. The foreign intelligence services are hitting every sector in the country: government, military, defense contractors, manufacturing, energy and communications. It really cuts across all sectors. The other group that’s a significant threat is terrorist organizations that seek to potentially disrupt critical infrastructure and to cause harm to the United States.


Threat Actors

KITTEN: How have the actors who are waging some of these attacks changed in recent years?

HENRY: I don’t know that they’ve changed drastically. The same types of groups that I put into three different buckets – organized crime, foreign intelligence services and terrorists – remain primarily the same. We’ve seen these hacktivist groups which I would really kind of put in the terrorist bucket, but the groups themselves haven’t changed. Their capabilities have changed. They’ve become more capable; they’ve become more sophisticated. They’ve had to become flexible and they’ve had to adapt their capabilities as defenses have gotten better and as organizations have become more aware or more resilient in their defense. But the reality is the offense outpaces the defense, so they’ve been able to adapt and to overcome even what we would consider to be some of the most resilient defenses.


Nation-State Attacks

KITTEN: Would it be fair to say that attacks backed by nation-state are posing greater worries today?

HENRY: Again, I think it really depends. It depends on who you are. If you’re a defense contractor that’s developing certain military capabilities for the next-generation war fighter, nation-state is a significant worry to you. If you’re a financial organization, you’re probably more concerned about the organized crime group because that’s the thing that might impact your bottom line. If there’s a $10 million loss, it’s going to be on the balance sheet. People are going to see it, maybe the media. It’s going to pose a risk to the organization’s reputation. That’s going to pose a risk to their operations and customer confidence sometimes. It really depends [on] who you are [as to] what the greatest worry should be to you.

Some have said that nation-states, because they’re so closely tied to the U.S. economy, would not necessarily take destructive actions, where a crime ring might attack a company and pose some type of a threat to data, to destroying data, in exchange for some type of a monetary reward. We’ve actually seen extortions where companies have had their networks breached, and they’ve then contacted the company and said, “We’ll be happy to turn your data over for a $150,000 consulting fee. And if you don’t, then we’re going to destroy your data.” It really depends on who the organization is [regarding] what their greatest threat to them is.

KITTEN: Would you say that the lines that divide these groups are blurring?

HENRY: We have seen some overlap. It’s not always clear. It used to be clearer in the past, but I think that the capabilities of some of these organized crime groups are such that they actually approach the maturation level of foreign intelligence services. They’re very, very capable; they’re not just kids [moving] around on the network. These are organized, methodical and well-practiced, so I do think that there’s a bit of a blur. It’s also not always clear that the groups and some of the individuals in the groups aren’t crossing lines perhaps, working for the government on one hand and then perhaps on the weekends moonlighting and doing some work for themselves.

International Investigations

KITTEN: Do you see international investigations improving?

HENRY: Every cyber-investigation for the most part has some international nexus. There’s something that either originates or ends internationally or transcends an international point. From an information-sharing perspective, it requires good coordination and good lines of communication, both in the private sector and in the government sector. In terms of identifying who the adversaries are, there needs to be what I call actionable intelligence sharing, where government-to-government they’ve got to share indicators that will help to identify who the adversary is and if they can arrest them or take some type of action to thwart the attack.

From the private sector perspective, there needs to be sharing even within the same companies that have international capabilities across many countries. They need to share information, actionable intelligence, so that they can better defend themselves and provide a better defense. I do think it’s improving. Again, the awareness piece is really important, the fact that more organizations have visibility into these types of attacks and they have a sense of understanding about what the impact is. That encourages better sharing of actionable intelligence.


Information Sharing Challenges

KITTEN: What are some of the challenges facing information sharing?

HENRY: There are a couple things. One, I think the sharing between the government and the private sector is still not as robust as it needs to be. There are a lot of reasons for that. One is the lines are not clearly drawn of exactly what companies need and what governments need. There’s a national security perspective sometimes, so it’s difficult to share classified information. That’s a bit of a challenge. Companies still have some concerns that by sharing information they’ll be revealing unnecessarily to the public that there’s a problem on their network, causing some type of a loss of confidence, potentially, with their client base. All in all, I think it is getting better. There’s still a long way to go, but it’s certainly one of the bigger challenges.

The last piece I’ll add is the concern people have about privacy. What are we sharing? I would argue that there’s no need to share content. You don’t need to share Word files; you don’t need to share content of e-mails; you don’t need to share spreadsheet information. What you need to share is a lot of the technical data, what we call indicators if there’s been a compromise, the signatures of malware and the types of information that would help to identify how an attack occurred and who might have launched an attack, but does not compromise the actual content of data. But that privacy piece is rightfully so a concern by many people, both public and private, and it does cause some consternation when you talk about information sharing because people just need to be educated about what that information really is.
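The distinction Henry draws can be illustrated with a file-hash indicator: what gets shared is a fingerprint of a suspect file plus technical metadata, never the document's contents. A minimal sketch (the function and field names here are our own, not any standard's; real exchange formats such as STIX are far richer):

```python
import hashlib

def build_indicator(path):
    """Build a shareable indicator from a suspect file: cryptographic
    hashes and size only, never the file's actual contents."""
    with open(path, "rb") as f:
        data = f.read()
    return {
        "sha256": hashlib.sha256(data).hexdigest(),
        "md5": hashlib.md5(data).hexdigest(),
        "size_bytes": len(data),
    }
```

A partner receiving such an indicator can check its own systems for matching hashes without ever seeing the underlying e-mail, spreadsheet or Word file, which is exactly the privacy boundary described above.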


DDoS Attacks

KITTEN: What can you tell us about what’s likely going on behind the scenes?

HENRY: Let me first say that I do not have any inside information about what’s occurring. Any of my comments relate specifically to my observations from the outside looking in. But I think that there’s certainly a cause for concern any time there’s some disruption of service; or when people have difficulty accessing their networks, there’s a cause for concern. I would think that the FBI would be looking quite closely to try to determine where the attacks are coming from. I think that they would likely be working with international partners, both in the intelligence community as well as in the law enforcement community, to try and identify what the sources of attacks are. Once you can determine where those attacks are coming from, you can take law enforcement action to disrupt the networks that are launching those attacks and actually disrupt the people that are causing those attacks. Through the execution of search warrants or arrest warrants and the like, that’s going to be an action that’s going to help mitigate the threat, by actually taking the bad actors off of the playing field.


New defense cyber policies are in the works


By Amber Corrin

Jul 02, 2013


DOD’s cyber strategy from 2011 is due for an update.


The Defense Department is working on a new plan for operating in cyberspace, policy that would update the official strategy first released two years ago this month.

Army Maj. Gen. John Davis said that while significant progress has been made in the military’s cyber domain so far, there remains work to be done and that two years in real time could equal 20 years in cyberspace’s boundless landscape.

“Senior leaders in the department and beyond the department understand that cyber is a problem [and] cyber is important,” Davis said at an industry event in Baltimore on June 28. “They’ve made cyber a priority, and there is a sense of urgency.”

In July 2011, then-Deputy Defense Secretary William Lynn rolled out the defense strategy for operating in cyberspace, which encompasses five strategic pillars. They included the establishment of cyberspace as an operational domain and plans to ready forces to go with it; the introduction and employment of new operational concepts on DOD networks; cross-agency partnerships, including with the Homeland Security Department; the build-up of cyber defenses, coordinated with allies and international partners; and the capitalization of U.S. resources, including technology, workforce and rapid innovation.

Davis highlighted progress over the past two years, including the launch of service cyber components operating under U.S. Cyber Command, joint cyber centers at the combatant commands, military orders for cyber operations, an interim command and control framework, cyber force structure and training exercises.

As a result, Davis said, the U.S. military has been able to refine its role in cyberspace, as well as the partnerships that support it. That role is shared with agencies like DHS and the Justice Department, he added, as well as with the private sector and global collaborators.

“We have three main cyber missions, and three kinds of cyber forces will operate around the clock to conduct those missions,” Davis said.

He noted that national forces will act to counter cyber attacks on the U.S., while a second, larger group of combat mission forces will help execute military operations and integrate digital capabilities into the DOD arsenal. The largest set of cyber operators will operate and defend critical Pentagon networks.

“We will deter, disrupt and deny adversary cyberspace operations that threaten vital U.S. interests when approved by the president and directed by the secretary of defense,” he said. “If a crippling cyber attack is launched against our nation, the Department of Defense must be ready for an order from the commander in chief to act.”


Can DIA become the Defense Innovation Agency?

By Amber Corrin

Jul 01, 2013



DIA Director Lt. Gen. Michael Flynn, shown here speaking at a meeting in 2012, says DIA has to ‘break down legacy walls’ to emerge as an innovation leader. (Army photo)

“Innovation” is pretty big as far as government buzzwords go. Often found hand-in-hand with “doing more with less,” the idea of innovation is one that seems to excite leaders and frequently surfaces in briefings and talking points.

But what is actually getting done? Who is really out there taking innovation for a spin?

The intelligence community might not immediately seem like the leader, but if the Defense Intelligence Agency has anything to say about it, the agency is in the driver’s seat.

“The agency was established to understand the plans of the adversary. In the process of that, we became an organization that focuses on planning,” said Dan Doney, DIA’s chief innovation officer. “Planning and sticking to the plan are the opposite of innovation. Innovation is deviating from the plan, rapidly moving it and changing. When it comes to innovation, we haven’t had a great reputation. Put that in the past.”

Today, DIA is moving forward with what officials call an entirely new approach — one that eradicates traditional silos, aggressively seeks outside input and quickly implements new ideas. The objective is to avoid strategic surprise and push forward with a new way of conducting the government’s intelligence business after more than a decade of war.

“We have to break down legacy walls,” DIA Director Lt. Gen. Michael Flynn said at the agency’s June 27 Innovation Day at its Washington headquarters. “Innovation is not just about technology — it’s about a lot of things. Pay close attention to how this organization has adjusted and what we’re working toward. Training, education and professional development are a big deal. We can’t lose the gains from the last decade.”

The biggest area of focus in DIA’s new approach is information sharing. The year-old Intelligence Community IT Enterprise (ICITE) is designed to link the 17 intelligence agencies and significantly reduce IT costs.

Among DIA’s contributions to ICITE is the Innovation Gateway, an online information-sharing environment that forges easier connections between the intelligence community and industry. The goal is to help the agencies identify and adopt better solutions, particularly those that might not have surfaced without such a system. “No more middle man,” Doney said. “No more cloudiness, no more special relationships where you need to know the right person to know what we’re looking for.”


Reaching out to industry

Users access the gateway via DIA-granted public-key infrastructure credentials. Much of the environment uses a wiki-style format, which is fitting for a system designed to free people from the constraints of standardization.


“There are some tricks associated [with] making a coherent environment that isn’t coherently planned, but the first place you show up to…is a wiki, where you’re made aware of the capabilities of the space,” Doney said. “Underneath that is a description for how a technology provider participates in the space…and they’re able to offer their technologies within the space, they’re able to link to a set of core services” for others to access.

To protect against potential theft of intellectual property, the system has safeguards in place in the form of what Doney called stakeholder management authorization.

“One of the advantages is that providers host their own [areas], so they maintain complete control over their own capabilities, technologies and configurations,” Doney said. “They’re not constrained by other people’s ways of doing things.”

The gateway is still in its early stages, but officials expect initial operating capability in December. Reaching critical mass in terms of users will be the key to the project’s success and progress, Doney said. He likened getting there to the early days of Wikipedia, when the popular online resource relied on passionate early adopters to generate content.

“The first phase is the hardest,” he said. “It takes action and energy to get over the initial barriers.”


Measuring results

Another tricky aspect of the Innovation Gateway is measuring success, particularly by traditional government models, officials said.

“There’s not going to be one way to look at it and either you passed or failed. It’s going to be all-encompassing,” said Janice Glover Jones, DIA’s CIO. She added that productivity is one key marker, but success writ large will be a little tougher to define. “And even if you fail, you get lessons learned.” Doney, too, indicated that the focus should be on the less tangible goals.

“Innovation is a means, not an end,” he said. “We’re driven by mission, so the bottom line is how much better are we able to execute on our mission, not whether or not we’re the best innovators in the world. It’s a mistake many folks make to measure the means instead of the ends.”

Still, he pointed to his “stretch goals” as crucial milestones, such as the 30-day transition requirement that reforms discovery, evaluation, integration and acquisition. Other goals are characterized by the existence of cross-functional teams of analysts, collectors and technologists who work together seamlessly; rapid access to tools and capabilities; and the availability of mission-relevant data.

The goals illustrate the agency’s wider drive to get back on track after the wars in Iraq and Afghanistan, which drastically changed the business of government.

“We’ve exercised the wrong muscles. We have to go back to a place where we exercise the right muscles and think about how we conduct business,” Flynn said. “We have to allow for failure…. It’s a risk, but that’s OK. The return on investment on risk is so good.”


NIST Unveils Draft of Cybersecurity Framework

Executives Given Key Role in Voluntary Framework

By Eric Chabrow, July 3, 2013


The cybersecurity framework ordered by President Obama will emphasize the importance of senior executives in managing programs to secure their enterprises’ information systems and assets, according to a draft released by the National Institute of Standards and Technology.

“By using this framework, these senior executives can manage cybersecurity risks within their enterprise’s broader risks and business plans and operations,” says the draft dated July 1, but made public a day later.

In February, Obama issued an executive order directing NIST, working with the private sector, to develop a framework to reduce cybersecurity risks that the mostly private operators of the nation’s critical infrastructure could adopt voluntarily [see Obama Issues Cybersecurity Executive Order].

NIST concedes much more work must be done by the time the final version of the framework is issued next February. Among the areas NIST identifies that need to be addressed in the framework are privacy and civil liberties standards, guidelines and practices as well as helpful metrics for organizations to determine their cybersecurity effectiveness.

“We want to provide something that has flexibility, that can be implemented by different sectors,” Donna Dodson, chief of NIST’s computer security division, said in an interview with Information Security Media Group prior to the draft’s release [see Fulfilling the President's Cybersecurity Executive Order]. “We want it to be specific in other ways so that we are sure we are working to reduce cybersecurity risks in the critical infrastructure.”


5 Core Cybersecurity Functions

The framework, according to the draft, will revolve around a core structure that includes five major cybersecurity functions, each with its own categories, subcategories and informative references. The five functions are Know, Prevent, Detect, Respond and Recover.

The Know function, for instance, would include a category entitled “know the enterprise risk architecture” with subcategories of “understand corporate risk tolerance” and “identify risk assessment methodologies,” as well as others. An informative reference, in this instance, would link to guidance such as NIST Special Publication 800-53: Security and Privacy Controls for Federal Information Systems and Organizations and ISO 31000: Risk Management.

The framework also will include three implementation levels that reflect organizational maturity in addressing cybersecurity. Incorporated into the framework will be a user’s guide to help organizations understand how to apply it as well as a compendium of informative references, existing standards, guidelines and practices to assist with specific implementation.


Framework as a Guide, Not Detailed Manual

NIST says the framework should not be seen as a detailed manual, but as a guide to help executives, managers and staff to understand and assess the cybersecurity capabilities, readiness and risks their organizations face, as well as identify areas of strength and weakness and aspects of cybersecurity on which they should productively focus.

Some 240 entities, including major technology and security vendors, trade groups, local and state governments, not-for-profit organizations and individuals, submitted ideas to NIST this past spring on IT security best practices to incorporate into the framework. NIST held a workshop in late May in Pittsburgh, where it reviewed the submissions and started to create the framework. Another workshop is scheduled for July 10-12 in San Diego, where the framework will be refined.

“Many comments advised that the cybersecurity framework would not be effective unless the very senior levels of management of an organization were fully engaged and aware of the vulnerabilities and risks posed by cybersecurity threats and committed to integrating cybersecurity risks into the enterprise’s larger risk management approach,” according to the draft.

“Time and again, comments reflected that these senior executives, including boards of directors, need to integrate and relate cybersecurity concerns and risks to critical infrastructure to the organization’s basic business and its ability to deliver products and services,” the draft says. “It is clear that these officials are best positioned to define and express accountability and responsibility, and to combine threat and vulnerability information with the potential impact to business needs and operational capabilities.”



Funding is up in the air for FAA control tower upgrade

The NextGen GPS air control plan, which would cut airline delays and fuel consumption, carries a $40-billion price tag that sequestration puts at risk.

LA Times

By Hugo Martín

7:53 PM PDT, June 30, 2013


Air travel in the future will be faster, cleaner and less expensive if the Federal Aviation Administration’s $40-billion overhaul of the nation’s air control system is completed.

That’s a big if.

With the federal sequestration fight in Washington, FAA officials say funding for the seven-year project could be in jeopardy.

The plan, known as NextGen, replaces outdated radar-based technology with global positioning systems and digital communications to modernize the country’s air control system.

By allowing pilots to fly more direct routes and giving air traffic controllers more accurate and up-to-date information, the system is expected to cut airline delays 41% by 2020, compared with the delays without NextGen, according to a new report by the FAA.

The efficiencies in the system are also forecast to save 1.6 billion gallons of fuel and cut 16 million metric tons of carbon dioxide emissions, delivering $38 billion in cumulative benefits to airlines, the public and the FAA, the report said.

A key component of the system is that air traffic controllers using GPS will get more precise information on planes’ locations and speeds, allowing controllers to better manage the 7,000 or so planes in the air at any given time, according to the FAA. Because the current radar system is slower and less precise, controllers must add a bigger safety cushion of separation between planes.

In a recent speech, FAA Administrator Michael Huerta slammed lawmakers for failing to reach an agreement on future spending plans.

“Because of the financial uncertainty, we can hope for the best, but we have to plan for the worst,” he said. “This is not a sustainable course of action, and it’s no way to run a government.”


Good news for Internet junkies who fly out of Los Angeles International Airport: You’ve got plenty of flights equipped with wireless Internet to choose from.

Fliers departing from LAX to San Francisco International Airport or John F. Kennedy International Airport have the greatest number of daily domestic flights offering Wi-Fi, according to a new study by a travel site.

Between LAX and San Francisco, fliers can choose from 31 daily flights that offer Wi-Fi, the study found. Between LAX and JFK, air travelers can pick 27 daily flights with Wi-Fi service, according to the study.

“Wi-Fi is not going away,” said John Walton, the site’s director of data. “Passengers want it and will pay for it.”

Overall, 38% of the domestic flights in the U.S. offer Wi-Fi, a number that has been growing 5% to 7% a year, he said. But the rate will likely surge as more airlines install satellite-based Wi-Fi on more of their fleet, Walton said.

“In the next quarter, I imagine we will see a situation where satellite Wi-Fi should be rolling out in big numbers,” he said.

• Airlines in the U.S. lose or damage about 140,000 bags a month, or about three for every 100,000 passengers.

It’s a relatively small loss rate. Still, several companies hope to capitalize on the frustration travelers feel waiting by the baggage carousel only to realize their luggage didn’t make it onto their flight.

Next week, Los Angeles company GlobaTrac plans to begin shipping a palm-sized device that travelers can toss into their bags to track luggage via the Internet or a smartphone app. The device, called Trakdot, sells for $50, plus fees.

Meanwhile, European aerospace company Airbus announced last month that it is producing luggage with built-in technology that allows passengers to track their bags. The luggage even includes a built-in scale to tell whether it is exceeding the maximum weight limits.

No word on the cost of the high-tech suitcase, but Airbus is reportedly considering letting airlines rent the bags to passengers.


Another potential roadblock for UAS integration in the USA


by Gary Mortimer • 1 July 2013


Watching the UAS story in America unfold from afar, it seems more and more like a game show. Phil from The Amazing Race has just thrown up another potential roadblock for the teams.

Last Thursday the Senate Appropriations Committee approved the 2014 transportation funding bill, which directs the FAA to stop the rulemaking process until privacy concerns have been addressed.

That process is expected to take at least one year. The bill still has to pass the full Senate and the House.

I have said it before: if you want to get started in the UAS industry and you are from the USA, it is probably best to move overseas. Your ideas will not be subject to ITAR, and a viable commercial market exists.

Let’s hope test site selection continues in parallel with this process, along with all the other bits and pieces other countries seem to have managed more than five years ago.

Look out 2047, here we come. After the break: scenes from next week’s episode.

From Senate Appropriations Committee:

The development of unmanned aerial systems [UAS] offers benefits in a wide variety of applications, including law enforcement and border patrol, precision agriculture, wildfire mapping, weather monitoring, oil and gas exploration, disaster management, and aerial imaging. The UAS industry also presents an opportunity for substantial domestic job growth.

The FAA is taking important steps toward integrating UAS into the national airspace, including implementing a UAS test site program to help the agency gather critical safety data. The expanded use of UAS also presents the FAA with significant challenges. The Committee is concerned that, without adequate safeguards, expanded use of UAS by both governmental and nongovernmental entities will pose risks to individuals’ privacy.

The FAA has recognized the importance of addressing privacy concerns by requiring that UAS test sites have privacy policies in place before test flights begin. However, as the FAA looks to integrate UAS into the national airspace, a more comprehensive approach to privacy may be warranted. The United States Constitution, Federal, and various State privacy laws apply to the operation of UAS, but in consideration of the rapid advancement of technology in this area, the Committee questions whether current laws offer sufficient protections to adequately protect individuals.

FAA’s oversight and regulatory authority over the national airspace places the agency in a position to work with other agencies on addressing privacy concerns. To that end, the Committee directs the FAA to collaborate with other Federal agencies in evaluating the impact that broader use of UAS in the national airspace could have on individual privacy.

Furthermore, the Committee includes bill language that prohibits the FAA from issuing final regulations on the integration of UAS into the national airspace until the Secretary submits a report detailing the results of such collaboration. The Committee expects this report to address the application of existing privacy law to governmental and non-governmental entities; identify gaps in existing law, especially with regard to the use and retention of personally identifiable information by both governmental and non-governmental entities; and recommend next steps in how the FAA or other Federal agencies can address the impact of widespread use of UAS on individual privacy. The Committee directs the FAA to submit this report to the House and Senate Committees on Appropriations not later than 1 year after enactment of this act.


Some states may lose Air National Guard flying units

By Brian Everstine

Staff writer

Jun. 26, 2013 – 06:00AM |

Air Force


F-16 Fighting Falcons from the Colorado Air National Guard arrive at a training base in northern Jordan as part of an exercise June 6. Budget realities may force the Air National Guard to shut down some aircraft squadrons.

Budget realities could force the Air National Guard below its guiding principle of one flying unit per state, and state leaders could be OK with that.

During last year’s budget deliberations, the Guard and state leaders drew a line on cuts, focusing on its “capstone principle” of one unit capable of flying missions per state, while the Air Force was targeting the component for cuts.

But the realities of sequestration and an uncertain budget future may mean the closure of some aircraft units, although state leaders say they would agree to the cuts as long as there are enough assets available regionally to respond to natural disasters.

“If they don’t have (the assets), they are interested in making sure that at least regionally, they can access it very quickly,” said Heather Hogsett, the director of the homeland security committee at the National Governors Association.

Currently, all states have an aircraft unit, although there has been a push to increase the number of Air National Guard units with cyber missions such as intelligence, surveillance and reconnaissance, which could replace flying units.

The idea came up during the third public meeting of the National Commission on the Structure of the Air Force on June 26. The congressionally mandated commission, created in last year’s defense spending bill, is made up of members nominated by the president and Congress. It will produce a report next year on the future force structure needs of the service.

Members of the commission spoke critically of the need for each state to have a flying unit. Les Brownlee, a former acting undersecretary of the Army, asked state leaders directly: “Why does a governor need an F-16?”

Maj. Gen. Tim Orr, the adjutant general of Iowa, said the 132nd Fighter Wing in his state was one of two units that lost their F-16s under last year’s spending bill. He said other parts of the wing — including medical units, maintenance, security forces, other support units — were crucial to his state, along with having pilots and their F-16s able to deploy for the Air Force.

“It’s the other capabilities in the wing that are crucial,” he said. “In (2008) floods, we used all personnel to accomplish the mission on the ground.”

Brownlee responded by saying the state has Army National Guard units that would be more effective on the ground in disasters than fighter pilots.

“We had so much flooding that it took the whole Air National Guard and Army National Guard,” Orr said. “That’s not uncommon.”

He said the need for fighters in the Guard is because “we’re the reserve of the Air Force. We have to have that same capability and capacity.”

State leaders are currently meeting with officials from the Federal Emergency Management Agency to look at the top five possible catastrophes that could occur, and then see what assets would need to be available to respond, Orr said. The report is expected this fall and could help drive debates on the allocation of Air National Guard resources.

Brownlee also highlighted the recent decision to have Air Force Reserve crews work under the Air National Guard in response to homeland disasters and under the direction of a Guard dual-status commander. This overlap in abilities raises the question, “Is it time to think of a hybrid of the Reserve and Guard?” he said.

Leaders on Capitol Hill and in the Pentagon should start having that discussion and begin a national debate on whether that is possible and the right thing to do, Orr said.


Wi-Fi That Sees Through Walls

Thomas Claburn


June 29, 2013


Researchers at MIT’s Computer Science and Artificial Intelligence Laboratory have come up with a way to create a low-power, portable device that can see through walls using Wi-Fi signals.

Technology of this sort, similar in concept to radar or sonar, has existed for years and relies on radio waves and other parts of the electromagnetic spectrum. It is used mainly in law enforcement and military applications, ideally only where the law allows.

Wall penetration systems have become common enough that the U.S. Department of Justice last year funded a market survey of what’s known as “through-the-wall sensors,” or TTWS.

Security products maker Camero-Tech, for example, offers its Xaver line of through-wall imaging devices for defense and law enforcement applications. But with prices at about $9,000 for the handheld Xaver 100 and $47,500 for the 7 lb. Xaver 400, these aren’t consumer products.

The legality of TTWS technology is sufficiently unclear that ManTech Advanced Systems International, the company that prepared the market survey, recommends those planning to use TTWS equipment seek legal advice in advance.

In 2001, the U.S. Supreme Court ruled that the use of thermal imaging to monitor what’s going on inside a private home violates Fourth Amendment protection against unreasonable searches. But as the ability to see through walls reaches the civilian market, this legal boundary is likely to be tested again.

There is at least one consumer TTWS device on the market already: STI’s Rex Plus, an $80-plus device that can be placed against a wall or door and sounds an alarm when someone approaches from the other side.

Dina Katabi, a professor in MIT’s Department of Electrical Engineering and Computer Science, and graduate student Fadel Adib propose wider civilian use of the technology through a simple, affordable device like a mobile phone, equipped with two antennas and a receiver.

In an email, Katabi suggested the technology, which she calls WiVi, can be used for virtual reality and gaming, without requiring the user to remain in a specific area in front of a sensor. She also says the technology could be used for personal safety.

“For example, if I am walking at night in an isolated area and suspect that someone is following me, hiding behind a fence or around a corner, I can then use WiVi to detect that person and alert myself to the person’s movement,” she said.

Katabi says WiVi can be used for “privacy-preserving monitoring,” such as tracking the movements of elderly people or children without actually having them on camera.

In time, however, improvements in Wi-Fi-based sensing may require a reexamination of the privacy implications of making walls effectively transparent.

“Today the technology does not show body parts or the face of the person,” said Katabi. “Hence it is naturally anonymized. However, as we improve the technology it will start giving higher resolution images for things that one cannot see because they are behind a wall. This will raise privacy related questions. As a society, we still have time to look at these issues and ensure our society has the right policies by the time such high-resolution version of the technology becomes available.”

That future already has been contemplated: University of Tokyo researchers have developed paint that blocks Wi-Fi signals.

Wi-Fi-based sensing appears to be a particularly active area of research at the moment. At the University of Washington, researchers have developed a related technology, WiSee, a Wi-Fi-based gesture sensing system that duplicates the functioning of sensor-based motion detection systems like Leap Motion and Microsoft Kinect without the sensing area limitations.



Statement by Secretary Hagel on the 40th Anniversary of the All-Volunteer Force

DoD Public Affairs

July 1, 2013


Forty years ago today, the Department of Defense implemented one of the most consequential and far-reaching changes in the history of America’s armed forces with the establishment of the all-volunteer force.

In commemorating the birth of our modern military, we recognize and thank the millions of men and women and their families who have served our country — on active duty, in the reserves, and in the National Guard. Skeptics and detractors claimed an all-volunteer force could not be sustained, but these remarkable Americans proved otherwise. They helped win the Cold War, stood against aggression in the Persian Gulf, kept peace in the Balkans, and put their lives on the line in Iraq and Afghanistan in the years since 9/11. They choose to serve our country because they believe in a cause that is bigger than themselves. They’ve always put the interests of the nation first, as have their families. It’s because of their sense of duty that the American military remains the most trusted institution in our society today.

Our all-volunteer force faces challenges. It will need to rebalance, adapt, and evolve, as it has over the last 40 years. America’s all-volunteer force will continue to make our military the strongest, most capable, and most respected fighting force in the history of the world.



What Are We Going to Do About GPS Jamming?


By Bob Brewin July 1, 2013


Not much, based on this updated report from the Department of Homeland Security.

DHS prepared a classified report on Global Positioning System vulnerabilities in November 2012 and the unclassified version, released last week, leaves much to worry about, including the fact that “Detecting, locating and disabling sources of GPS disruption remain a challenge.”

The department suggests manual backups for GPS, which I imagine includes old-fashioned compasses and maps, but observed that “human skills for using manual techniques could erode due to lack of training and practice as GPS becomes more ubiquitous.”

GPS signals sit at the core of the Federal Aviation Administration’s Next Generation Air Transportation System, provide timing signals for wired and wireless networks, guide precision munitions, help mariners navigate tough harbor approaches and are key to precision farming operations.

But nowhere in the report does DHS suggest an automatic backup system, for the simple reason that one does not exist, even though the Department of Transportation’s John A. Volpe National Transportation Systems Center warned about the dangers of GPS jamming and called for the development of an automatic backup system in a report published 13 years ago.

The Volpe report suggested a terrestrial backup to GPS based on an improved version of the World War II-era Long Range Navigation system, known as Loran, but the United States abandoned Loran due to the manning costs incurred by the Coast Guard, which literally blew up the tower of the Port Clarence, Alaska, station in 2010.


South Korea, which has a lot of experience with GPS jamming by North Korea, plans to start installing a Loran system in 2016 with full operation planned by 2018 — a better approach than a compass or map.


Defense Contractors and Military Personnel Can’t Fill In for Furloughed Civilians

Government Executive


By Charles S. Clark

July 1, 2013


During furlough days set to begin July 8, Defense Department managers may not “borrow military manpower” nor step up assignments to contractors to make up for idled civilian employees, a Pentagon official directed on Friday.

Planning around civilian furloughs and “total force management” requires that “component heads, installation commanders and line managers shall take steps to manage workload, but must ensure that borrowed military manpower is not used to compensate for work resulting from a civilian furlough,” F.E. Vollrath, assistant Defense secretary for readiness and force management, wrote in a memo to the joint chiefs, all undersecretaries and major departmental directors.

Borrowing labor “would be inconsistent with the secretary’s intent and the department’s commitment to protect the viability of the All-Volunteer Force,” he continued. “Additionally, in accordance with the department’s statutory requirements, contractors are prohibited from being assigned or permitted to perform additional work or duties to compensate for the workload/productivity loss resulting from the civilian furlough,” Vollrath wrote.

The policy on contractors was welcomed by the American Federation of Government Employees, whose national president J. David Cox Sr. on Monday issued a statement crediting Vollrath’s position, even though “Congress should have repealed sequestration months ago because it was a failed tactic never intended to be enacted, and Secretary [Chuck] Hagel never should have imposed furloughs on the Department of Defense’s reliable and experienced civilian workforce when there is ample room for reductions in service contract spending that is supposed to be temporary in nature.”

Cox recapped a Monday meeting with Vollrath, during which the union “pressed him to ensure that AFGE’s direct conversion concerns — i.e., when work performed by civilian employees is given to contractors illegally or to military personnel inappropriately — are resolved expeditiously. I asked him to investigate in particular a direct conversion at Anniston [Ala.] Army Depot where core depot maintenance workload is being illegally privatized,” Cox said.

Also on Monday, the Pentagon comptroller released updated fiscal 2013 budget numbers detailing planned cuts under sequestration, as requested in May by Senate Armed Services Committee Chairman Carl Levin, D-Mich., and Ranking Member James Inhofe, R-Okla. The new report lays out line items totaling some $41 billion in reduced spending for fiscal 2013, noting that the cost of preparing the new estimates was $38,000.

Levin was traveling and unavailable for comment.





Members of Top Nine Software Teams Move Forward from DARPA’s Virtual Robotics Challenge

by Staff Writers

Washington DC (SPX) Jul 01, 2013


The DARPA Robotics Challenge (DRC) was created with a clear vision: spur development of advanced robots that can assist humans in mitigating and recovering from future natural and man-made disasters. Disasters evoke powerful, physical images of destruction, yet the first event of the DRC was a software competition carried out in a virtual environment that looked like an obstacle course set in a suburban area.

That setting was the first proving ground for testing software that might control successful disaster response robots, and it was the world’s first view into the DARPA Robotics Challenge Simulator, an open-source platform that could revolutionize robotics development.

Disaster response robots require multiple layers of software to explore and interact with their environments, use tools, maintain balance and communicate with human operators. In the Virtual Robotics Challenge (VRC), competing teams applied software of their own design to a simulated robot in an attempt to complete a series of tasks that are prerequisites for more complex activities.

Twenty-six teams from eight countries qualified to compete in the VRC, which ran from June 17-21, 2013. DARPA had allocated resources for the six teams that did best, but in an interesting twist, good sportsmanship and generosity will allow members of the top nine teams, listed below, to move forward:


1. Team IHMC, Institute for Human and Machine Cognition, Pensacola, Fla. (52 points)

2. WPI Robotics Engineering C Squad (WRECS), Worcester Polytechnic Institute, Worcester, Mass. (39 points)

3. MIT, Massachusetts Institute of Technology, Cambridge, Mass. (34 points)

4. Team TRACLabs, TRACLabs, Inc., Webster, Texas (30 points)

5. JPL / UCSB / Caltech, Jet Propulsion Laboratory, Pasadena, Calif. (29 points)

6. TORC, TORC / TU Darmstadt / Virginia Tech, Blacksburg, Va. (27 points)

7. Team K, Japan (25 points)

8. TROOPER, Lockheed Martin, Cherry Hill, N.J. (24 points)

9. Case Western Reserve University, Cleveland, Ohio (23 points)


The top six teams earned funding and an ATLAS robot from DARPA to compete in the DRC Trials in December 2013 (DARPA is also funding several other “Track A” teams to construct their own robot and compete in the Trials). The Trials are the second of three DRC events, and the first physical competition.


In a demonstration of good sportsmanship, Jet Propulsion Laboratory, which also has a DARPA-funded Track A effort with its own robot, decided to merge its two efforts and offer the bulk of the resources it earned in the VRC to other teams. DARPA split the freed resources between the next two teams:

+ The robot associated with the JPL win and some funding now goes to TROOPER (Lockheed Martin).

+ Additional funds are being allocated to a newly formed team of Team K and Case Western. That team, now known as HKU, will use an ATLAS robot generously donated to it by Hong Kong University to participate in the DRC Trials in December.


Thus, in total, seven teams with ATLAS robots and DARPA support will be going to the DRC Trials, where they will compete with other teams with their own robots.

VRC teams were evaluated based on task completion and effective operator control of the robots in five simulated runs for each of three tasks (15 total timed runs) that addressed robot perception, manipulation and locomotion.

The tasks included: entering, driving and exiting a utility vehicle; walking across muddy, uneven and rubble-strewn terrain; and attaching a hose connector to a spigot, then turning a nearby valve. To simulate communications limitations in a disaster zone, the VRC imposed a round trip latency of 500 milliseconds on data transmission, and varied the total number of communications bits available in each run, from a high of 900 megabits down to 60 megabits.

To conduct the VRC, DARPA funded the Open Source Robotics Foundation to develop a cloud-based simulator that calculates and displays the physical and sensory behaviors of robots in a three-dimensional virtual space, in real time. The simulator allowed teams to send commands and receive data over the Internet to and from a simulated ATLAS robot: information very similar to what would be sent between a physical robot and its operator in the real world.

“The VRC and the DARPA Simulator allowed us to open the field for the DARPA Robotics Challenge beyond hardware to include experts in robotic software. Integrating both skill sets is vital to the long-term feasibility of robots for disaster response,” said Gill Pratt, DRC program manager.

“The Virtual Robotics Challenge itself was also a great technical accomplishment, as we have now tested and provided an open-source simulation platform that has the potential to catalyze the robotics and electro-mechanical systems industries by lowering costs to create low volume, highly complex systems.”


Immigration deal would boost defense manufacturers

Washington Post

By Matea Gold, Published: July 1


The border security plan the Senate approved last week includes unusual language mandating the purchase of specific models of helicopters and radar equipment for deployment along the U.S.-Mexican border, providing a potential windfall worth tens of millions of dollars to top defense contractors.

The legislation would require the U.S. Border Patrol to acquire, among other items, six Northrop Grumman airborne radar systems that cost $9.3 million each, 15 Sikorsky Black Hawk helicopters that average more than $17 million apiece, and eight light enforcement helicopters made by American Eurocopter that sell for about $3 million each.

The legislation also calls for 17 UH-1N helicopters made by Bell Helicopter, an older model that the company no longer manufactures.

Watchdog groups and critics said that these and other detailed requirements would create a troubling end-run around the competitive bidding process and that they are reminiscent of old-fashioned earmarks — spending items that lawmakers insert into legislation to benefit specific projects or recipients. In the past several years, Congress has had a moratorium on earmarks.


The language was included in a $46 billion border security package the Senate approved last week as part of a comprehensive immigration bill. The so-called border surge — an additional $38 billion in spending — was added in the final week of negotiations to attract more GOP support for the measure, which passed with 68 votes, including 14 from Republicans.

The legislation would spend $30 billion over the next decade to hire more than 19,000 new Border Patrol agents, an undertaking that would double the size of the force and that many immigration experts consider wasteful and unnecessary.

The measure also would devote $7.5 billion to build 350 miles of fencing along the U.S.-Mexican border and $4.5 billion to buy new border technology. The legislation would have to be fully implemented, along with electronic visa and employment verification systems, before immigrants could receive green cards.

Sens. Bob Corker (R-Tenn.) and John Hoeven (R-N.D.), who co-sponsored the plan, said the provisions were aimed at assuaging the concerns of Republicans who are wary about creating a path to citizenship without tougher border measures.

“I was just trying to work with our caucus to get as many of our guys to participate,” Hoeven said.

That approach did not win over holdouts such as Sen. Tom Coburn (R-Okla.), who said: “Taxpayer funds should enhance border security, not provide border stimulus for contractors. Unfortunately, the Senate bill does exactly that.”

The list of equipment included in the legislation was drawn from a technological needs assessment developed by the U.S. Customs and Border Protection agency in 2010, according to a senior Department of Homeland Security official, who spoke on the condition of anonymity to describe the internal process. Agency staff members compiled the list at the request of Homeland Security Secretary Janet Napolitano after she stopped a virtual-fence project that was plagued by cost overruns and delays.

Border Patrol officials provided the list to congressional staffers who had asked what the agency needed to effectively control the border.

In separate interviews last week, Corker and Hoeven said they decided to add the list to the legislation to help win over GOP senators who did not trust Napolitano to carry out a border plan.

The two senators noted that the proposal would allow Napolitano to substitute equivalent brands of technology as long as she notified Congress within 60 days. “If they want to buy something better, they can,” Corker said.

But critics said that because the measure prescribes specific products, the agency probably would not seek alternatives. “Lawmakers have put their thumb on the scale for particular products and technologies and that is hard for an agency to ignore,” said Steve Ellis, vice president of the nonpartisan Taxpayers for Common Sense, which scrutinizes federal spending.

The $4.5 billion set aside for technology would be a boon for defense contractors, who are looking for opportunities as the United States continues to reduce its presence in Afghanistan.

The parent corporations of the companies that manufacture the products listed in the bill and their employees have given nearly $11.5 million to federal candidates and campaigns since 2009, according to the nonpartisan Center for Responsive Politics. About half of that came from Northrop Grumman.

Neither Corker nor Hoeven has received substantial donations from the companies or the defense sector overall.

“We’re proud of our long partnership with the Department of Homeland Security and are honored they have repeatedly chosen to acquire our helicopters for their important missions,” said Ed Van Winkle, law enforcement sales manager for American Eurocopter. “We stand ready to produce and deliver additional aircraft customized to Customs and Border Protection requirements should Congress authorize and fund their procurement.”

Representatives of Northrop Grumman, Sikorsky and Bell declined to comment.

Most of the equipment required by the legislation is identified by category, not by brand. Among other items, the bill calls for 4,595 unattended ground sensors, 104 radiation isotope identification devices and 53 fiber-optic tank inspection scopes — and specifies how many should be deployed in each Border Patrol sector. It also requires the purchase of four new drones, on top of 10 unmanned aircraft that the Border Patrol already owns.

The items listed by name were identified that way on the border agency’s wish list, according to Senate staff members involved in drafting the plan, who discussed the process on the condition of anonymity. They said the proposal would not override contracting rules that require competitive bidding.

But government watchdogs said it would be difficult to have an open bidding process for equipment identified by brand and model.

“The agency is statutorily required to buy the specific items from the listed vendors,” said Scott Amey, general counsel for the Project on Government Oversight, an independent group that works to expose overspending and corruption. “I’m unsure how an agency could hold a competition.”

One big-ticket item on the list is the VADER radar system, an airborne technology operated from drones that Northrop Grumman developed for the Pentagon’s research arm. The Border Patrol has been testing one of the systems on loan from the Defense Department to detect migrants attempting to cross the border illegally, officials said. This year, the agency received $18.6 million to buy two of the radar systems, and the immigration bill would add six more.

The Black Hawk helicopters required under the plan include five of the latest high-tech models with digital cockpits. As for the American Eurocopter aircraft, the patrol would be required to add eight AS-350 models to the 85 it already has in its fleet.

The legislation spells out how new border patrol agents would be deployed, requiring the agency to assign 38,405 officers to the U.S.-Mexican border by Sept. 30, 2021.

The Border Patrol employs a record 21,000 agents, up from about 10,000 in 2004. In its most recent budget request, the department did not seek new agents.

Many experts on border security say that doubling the force is impractical and a poor use of resources and that the money could be better spent on workplace inspections or the E-Verify system that employers can use to check the citizenship of applicants.

“There is a lot in this border security plan that is fighting the last war,” said Doris Meissner, who was a top Clinton administration immigration official.

Homeland Security officials are confident that they can recruit and train the surge of agents required under the bill. Spokesman Peter Boogaard said the measure would “build on this administration’s historic border security gains.”

Hoeven and Corker said they settled on hiring 20,000 agents in large part because the number fell midway between proposals from other GOP senators.

“I wish I could tell you it was scientific,” Corker said, adding, “We felt like this was something that would get the job done.”

Alice Crites contributed to this report.


What They Told Us: Reviewing Last Week’s Key Polls

Rasmussen Reports

Saturday, July 06, 2013


Americans still share the values enshrined in the Declaration of Independence 237 years ago and remain wary of too much government. It’s clear, too, that many aren’t happy with the government they’ve got.

Eighty-one percent (81%) believe “all men are created equal.” Ninety-two percent (92%) agree that all men are “endowed by their Creator with certain unalienable rights, among them life, liberty and the pursuit of happiness.” Seventy-two percent (72%) believe “governments derive their only just powers from the consent of the governed.”

But just 25% of voters think the federal government today has that consent.

More Americans than ever (63%) think a government that is too powerful is a bigger danger in the world today than one that is not powerful enough.

Thirty-eight percent (38%) believe the U.S. Constitution doesn’t put enough restrictions on what government can do. Still, 56% think the foundational document shouldn’t be tampered with, and another 33% believe it needs only minor changing.

Just 47% now believe the United States is a nation with liberty and justice for all, the lowest level measured in six years. Still, 77% say if they could live anywhere in the world, it would be the United States.

Scott Rasmussen’s weekly newspaper column notes that “only one-in-four voters today thinks our government has the consent of the governed.” He adds, “That’s a clear call for our government to change its ways and re-earn the trust of those it is supposed to serve. Those are the kind of attitudes that make the Political Class nervous. The fact that we expect more comes from the fact that we as a nation still embrace the Spirit of ’76.”

So how are representative democracy and our constitutionally guaranteed system of checks and balances working these days?

The U.S. Supreme Court finished its term with big decisions on voting rights, affirmative action and same-sex marriage. Following those rulings, public approval of the Supreme Court has fallen to its lowest level in more than nine years of polling.

Just seven percent (7%) of voters think Congress is doing a good or excellent job. This marks the sixth consecutive survey that shows Congress’ positive ratings in single digits. Sixty-five percent (65%) believe that no matter how bad things are, Congress can always find a way to make them worse.

Republicans gained the edge over Democrats on the Generic Congressional Ballot for the week ending June 30, but this is the second time in the past three weeks that support for both sides was below 40%, something that hasn’t happened since June 2009. This is the 11th straight week that the parties have been separated by two points or less.

At the same time, voters like the check of Congress having final review over the Executive branch’s decisions when it comes to the environment. The Environmental Protection Agency sent revised greenhouse gas regulations to the White House this past week, and 51% of voters think all EPA regulations should require congressional approval before they can be implemented. That’s up eight points from 43% in late December.

Speaking of the Executive branch, 52% of voters think it is good for the nation that the American people know more about the National Security Agency’s domestic surveillance programs, and 70% believe the phone and e-mail surveillance programs may have inappropriately violated the privacy of innocent Americans. But as for Edward Snowden, the whistleblower who disclosed the programs, just eight percent (8%) regard him as a hero. Thirty-two percent (32%) now consider him a traitor. Most think he’s either somewhere between hero and traitor (34%) or that it’s too early to tell (23%). Those numbers reflect a slightly more negative view of Snowden compared to two weeks ago.

Forty-three percent (43%) rate the NSA’s secret surveillance of Americans’ phone and e-mail communications as a serious scandal. Thirty percent (30%) view it as an embarrassing situation but not a scandal, while 20% say it’s no big deal.

The White House announced late Tuesday that it will delay implementation of the employer mandate, a key portion of President Obama’s national health care law. As the date for implementation draws near, support for state implementation of the health care law is declining. Just 41% of voters now would like their governor to support implementation of the law, while 48% want their governor to oppose it. This marks a complete reversal from January when 47% wanted their governor to support implementation of the law and 39% were opposed.

As projected by the Rasmussen Employment Index, the number of new jobs in June – announced by the government yesterday – slipped slightly from the month before but remains near the highest levels of recent years. Sixty-six percent (66%) of Americans know someone who is out of work and looking for a job. But that’s the lowest it’s been since the Great Recession. Americans are evenly divided as to whether the labor market is better than it was a year ago and also divided as to whether the unemployment rate will be better or worse a year from now.

An increasing number (35%) of Americans think now is a good time to sell a home in their area, but belief that owning a home is a family’s best investment is down to 47%, the lowest level measured in the past year.

Consumer and investor confidence remain near recent highs.

Despite these signs of economic improvement, confidence in the president’s handling of the economy continues to fall. Thirty-five percent (35%) of voters now say Obama is doing a good or excellent job in this area, down from a recent high of 48% in December.

The president had a challenging month of June, and his total Job Approval rating fell two points to 47%. That ties his lowest approval rating since December 2011. The current rating is more in line with his job approval during most of his first term in office.

In other surveys this week:

– For the second week in a row, 30% of voters say the country is heading in the right direction.

– Americans consider blacks more likely to be racist than whites and Hispanics.

– Americans continue to strongly believe private sector workers work harder for less money and have less job security than government workers do.

– Only 26% of voters rate the performance of public schools in America today as good or excellent. Sixty-one percent (61%) think most textbooks are chiefly concerned with presenting information in a politically correct manner, little changed from surveys for the past three years. 

– Just 25% think most high school graduates have the skills needed for college. Only 22% think high school graduates have the necessary skills to enter the workforce.

– Americans continue to see the Fourth of July as one of the nation’s most important holidays. It ranks second only to Christmas.

– Most Americans are aware that the Fourth of July celebrates the signing of the Declaration of Independence.

Remember, if it’s in the news, it’s in our polls.

June 29 2013




N.S.A. Leak Puts Focus on System Administrators

NY Times


Published: June 23, 2013

As the N.S.A., some companies and the city of San Francisco have learned, information technology administrators, who are vital to keeping the system running and often have access to everything, are in the perfect position if they want to leak sensitive information or blackmail higher-level officials.

“The difficulty comes in an environment where computer networks need to work all the time,” said Christopher P. Simkins, a former Justice Department lawyer whose firm advises companies, including military contractors, on insider threats.

The director of the N.S.A., Gen. Keith B. Alexander, acknowledged the problem in a television interview on Sunday and said his agency would institute “a two-man rule” that would limit the ability of each of its 1,000 system administrators to gain unfettered access to the entire system. The rule, which would require a second check on each attempt to access sensitive information, is already in place in some intelligence agencies. It is a concept borrowed from the field of cryptography, where, in effect, two sets of keys are required to unlock a safe.
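To make the concept concrete, here is a minimal illustrative sketch of a dual-authorization ("two-man rule") access check. This is a hypothetical example, not a description of the NSA's actual system; the names and resources are invented.

```python
# Illustrative "two-man rule" access check: a request succeeds only when a
# second, distinct credentialed administrator signs off on it.
# Hypothetical example only -- not any agency's real implementation.

AUTHORIZED_ADMINS = {"admin_a", "admin_b", "admin_c"}

def grant_access(resource: str, requester: str, approver: str) -> bool:
    """Return True only if two different authorized admins turn their 'keys'."""
    if requester not in AUTHORIZED_ADMINS or approver not in AUTHORIZED_ADMINS:
        return False  # both parties must hold valid credentials
    if requester == approver:
        return False  # a requester cannot approve their own request
    return True       # two independent approvals: access granted

# A lone administrator cannot unlock the resource:
assert grant_access("sensitive-db", "admin_a", "admin_a") is False
# With a second administrator's sign-off, access is granted:
assert grant_access("sensitive-db", "admin_a", "admin_b") is True
```

The check mirrors the safe analogy in the article: neither "key" is sufficient on its own, so a single rogue administrator cannot act unilaterally.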

From government agencies to corporate America, there is a renewed emphasis on thwarting the rogue I.T. employee. Such in-house breaches are relatively rare, but the N.S.A. leaks have prompted assessments of the best precautions businesses and government can take, from added checks and balances to increased scrutiny during hiring.

“The scariest threat is the systems administrator,” said Eric Chiu, president of Hytrust, a computer security company. “The system administrator has godlike access to systems they manage.”

Asked Sunday about General Alexander’s two-man rule, Dale W. Meyerrose, a former chief information officer for the director of national intelligence, said, “I think what he’s doing is reasonable.”

“There are all kinds of things in life that have two-man rules,” added Mr. Meyerrose, who now runs a business consulting firm. “We’ve had a two-man rule ever since we had nuclear weapons. And when somebody repairs an airplane, an engineer has to check it.”

John R. Schindler, a former N.S.A. counterintelligence officer who now teaches at the Naval War College, agreed that the “buddy system” would help. “But I just don’t see it as a particularly good long-term solution,” he said.


“Wouldn’t it be easier to scrub all your I.T.’s for security issues,” he asked, “and see if there is another Snowden?”

The two-man rule “has existed in other areas of the intelligence community for certain exceptionally sensitive programs where high risk was involved,” he said, “but it’s not a standard procedure.”

Mr. Meyerrose and Mr. Schindler both said that software monitoring systems can also help, though they can be evaded by a knowledgeable systems administrator. The biggest issue for government and industry, they said, is to vet the I.T. candidates more carefully and to watch for any signs of disillusionment after they are hired.

“It’s really a personal reliability issue,” Mr. Meyerrose said.

Insiders of all types going rogue have become a problem for the government and industry over the last decade. One of the most prominent is Pfc. Bradley Manning, who downloaded a vast archive of American military and diplomatic materials from his post in Iraq and gave it to WikiLeaks. But there have been others, including scientists and software developers who stole secrets from American companies where they worked and provided them to China.

Now the spotlight is on the system administrators, who are often the technology workers with the most intimate knowledge of what is moving through their employers’ computer networks.

Beyond their store of technical manuals to keep the system running, administrators at intelligence agencies can have access to specific top secret programs without necessarily being cleared for them, as other intelligence agents must be.

If they can get into one part of the network with credentials for what is called “root access,” they can get into almost everything else. They are known as the “super user.”

Since 9/11, the vast majority of I.T. experts in the intelligence world have worked for private contractors, and the Snowden case has set off a new debate about whether the government could have more control of the workers if they were direct employees.

“This is a dirty little secret that’s being revealed,” said Robert Bigman, a former chief information security officer at the Central Intelligence Agency. “When you log on with a root account, it doesn’t know if you’re a staff employee or a contract employee. It just knows you’re root. You’re known as a superuser. You have all privileges.”

At a New Jersey pharmaceutical firm in early 2011, a former I.T. administrator gained access to the company’s system, deleted several files — including those that tracked customer orders — and froze the firm’s operations for several days, causing $800,000 in damages. Prosecutors called it a revenge attack after the company, Shionogi, announced layoffs. The administrator, Jason Cornish, pleaded guilty in August 2011.

And in 2008, a network administrator for the city of San Francisco named Terry Childs found out that he was about to be laid off and modified the city’s network in such a way that only he held the password. He refused to hand it over for 12 days, effectively disabling everything from the mayor’s e-mail to the city’s payroll records.

Reuters has reported that Mr. Snowden had made many posts anonymously on an online forum, including one in 2010 in which he seemed critical of technology companies cooperating with government surveillance programs.

Mr. Schindler, the former N.S.A. counterintelligence officer, said that while a person’s political views are not considered in terms of security clearances, the reviews may need to be expanded to include Twitter posts and other online comments that could yield clues to a job candidate’s thinking.

He said the N.S.A. could also do what Soviet officials did after one of their cipher clerks defected in 1945.

“Their response wasn’t to crack down on code clerks, but to make them happier,” Mr. Schindler said. “They boosted their pay and gave them more reasonable hours, and they got no-knock privileges with the boss to keep them happy.”

Mr. Simkins, the former Justice Department counterespionage lawyer, said that it is “more difficult than it sounds” to address threats posed by rogue insiders.

“At the end of day, there’s no way to stop an insider if the insider is intent on doing something wrong,” he said. “It’s all about mitigating.”


U.S. worried about security of files Snowden is thought to have

Washington Post

By Ellen Nakashima and Greg Miller, Published: June 24

The ability of contractor-turned-fugitive Edward Snowden to evade arrest is raising new concerns among U.S. officials about the security of top-secret documents he is believed to have in his possession — and about the possibility that he could willingly share them with those who assist his escape.

It’s unclear whether officials in Hong Kong or in Russia, where Snowden fled over the weekend, obtained any of the classified material. A spokesman for the anti-secrecy group WikiLeaks, which has been assisting the former National Security Agency contractor, strenuously denied reports that foreign governments had made copies of the documents.

“This rumor that is being spread is a fabrication and just plays into the propaganda by the administration here that somehow Mr. Snowden is cooperating with Russian or Chinese authorities,” spokesman Kristinn Hrafnsson said in a phone interview Monday.

Nonetheless, in 2010 and 2011, WikiLeaks published hundreds of thousands of sensitive U.S. documents it obtained from Army Pfc. Bradley Manning, and co-founder Julian Assange suggested in a teleconference call with reporters Monday that the group was interested in gaining access to the documents Snowden had obtained.

“In relation to publishing such material, of course WikiLeaks is in the business of publishing documents that are supposed to be suppressed,” Assange said. He declined to say whether Snowden had shared any of the material.

The NSA has teams of analysts scouring systems that they think Snowden may have accessed, officials said. Analysts are seeking to retrace his steps online and to assemble a catalogue of the material he may have taken.

“They think he copied so much stuff — that almost everything that place does, he has,” said one former government official, referring to the NSA, where Snowden worked as a contractor for Booz Allen Hamilton while in the NSA’s Hawaii facility. “Everyone’s nervous about what the next thing will be, what will be exposed.”

Glenn Greenwald, the Guardian columnist who has published a series of stories based on documents provided by Snowden, said he has exercised discretion in choosing what to disclose. Snowden, too, has said he was selective in choosing what to disclose.

“I know that he has in his possession thousands of documents, which, if published, would impose crippling damage on the United States’ surveillance capabilities and systems around the world,” Greenwald told CNN. “He has never done any of that.”

The Guardian, Greenwald said on “Meet the Press” on Sunday, has withheld “the majority of things that he gave us pursuant not only to his instruction, but to our duty as journalists.”

Over the past several weeks, The Washington Post and the Guardian have published articles and portions of documents that describe two major surveillance programs. One, called PRISM, deals with the interception of e-mail and other Internet content of foreign terrorism suspects thought to be located overseas. The other involves the amassing of a database of Americans’ phone call records — numbers dialed and received, length of call, but no content — which can be searched for a specific phone number when there is “reasonable, articulable” suspicion of a terrorist plot or activity associated with the number.

A former senior U.S. official said that the material that has leaked publicly would be of limited use to China or Russia but that if Snowden also stole files that outline U.S. cyber-penetration efforts, the damage of any disclosure would be multiplied. The official, like others in this article, spoke on the condition of anonymity because the official wasn’t authorized to discuss the matters on the record.

U.S. officials said their assumption is that China and Russia have copied the materials that Snowden took from classified U.S. networks but that they had no way to confirm those countries had done so.

“That stuff is gone,” said a former senior U.S. intelligence official who served in Russia. “I guarantee the Chinese intelligence service got their hands on that right away. If they imaged the hard drives and then returned them to him, well, then the Russians have that stuff now.”

Hrafnsson said such assertions are unfounded. In his call with reporters, Assange insisted that neither the Chinese nor the Russians had debriefed Snowden during his stops in their territories.

Asked about the security of the trove of material thought to be on Snowden’s laptop and the possibility that it could fall into the wrong hands, Assange said: “Mr. Snowden’s material has been secured by the relevant journalist organizations prior to travel.” Asked if he could elaborate, he said, “I’m afraid I cannot.”

The damage assessment being conducted by U.S. officials is expected to take “a few months, at best,” said a senior intelligence official. “We’re looking for all of the information that was disclosed, and assessing the damage it may have caused in terms of national security sources,” he said.

A second senior intelligence official said there were concerns that disclosure of U.S. surveillance methods would make it easier for terrorist groups to avoid detection. “The more material that gets made public the more capability we lose,” the official said.

Already, several terrorist groups in various regions of the world have begun to change their method of communication based on disclosures of surveillance programs in the media, the official said. He would not elaborate on the communication modes.

“It’s frustrating,” he said. “Because if they find some other method to communicate, we go dark. And we miss dots. That’s not something we’re particularly excited about.”

Anthony Faiola in London and Philip Rucker and Julie Tate in Washington contributed to this report.


How Edward Snowden’s encrypted insurance file might work


By Zachary M. Seward @zseward June 26, 2013

Now we have a bit more clarity on what Edward Snowden meant last week when he said, “The US government is not going to be able to cover this up by jailing or murdering me. Truth is coming, and it cannot be stopped.”

It turns out Snowden has given copies of the files he purloined from the US National Security Agency, his former employer, to “many different people around the world,” according to Glenn Greenwald, the Guardian journalist who first published some of the materials provided by Snowden. But the files are encrypted, so the people who have the documents can’t read them. “If anything happens at all to Edward Snowden, he told me he has arranged for them to get access to the full archives,” Greenwald told the Daily Beast.

How might that work? Snowden could be using any of a number of complicated cryptographic gambits.

Cryptography is a gatekeeper. It allows us to check our bank accounts, sign into email, and browse Facebook without worrying that any of that data can be intercepted by others (the NSA surveillance revealed by Snowden notwithstanding).

The simplest form of cryptography that Snowden and his allies could be employing would involve one person keeping an encrypted copy of the files and someone else holding the key necessary to decrypt it. But that method is vulnerable, relying on the trustworthiness of the person who has the key, and it doesn’t sound like what Snowden has done.
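As a toy illustration of that split, in which the ciphertext circulates widely while the decryption key is held back by one person, here is a minimal Python sketch. The XOR keystream below is for demonstration only (a real archive would use a vetted cipher such as AES-GCM), and the key and data are invented:

```python
import hashlib

def toy_stream_cipher(key: bytes, data: bytes) -> bytes:
    """Toy XOR stream cipher, for illustration only -- NOT secure.
    Applying it twice with the same key returns the original data."""
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        # Derive keystream blocks from the key and a running counter.
        block = hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        out.extend(block)
        counter += 1
    return bytes(b ^ k for b, k in zip(data, out))

archive = b"the leaked files"
key = b"held back by a trusted party"

ciphertext = toy_stream_cipher(key, archive)  # this copy can be handed out widely
# Holders of the ciphertext alone learn nothing usable; releasing the key
# later lets any of them decrypt:
recovered = toy_stream_cipher(key, ciphertext)
```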

More complex, more secure, and more interesting would be a form of “secret sharing.” Essentially, the files can only be unlocked if each member of a group contributes his portion of the encrypted information; or, alternatively, several people are given encrypted portions and a combination of, say, any three of them is sufficient to unlock the files. A common way to picture this represents the data held by three people as intersecting planes; the point where all three intersect is the secret that is revealed when the shares are combined.
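A minimal sketch of that threshold idea, in the style of Shamir’s secret-sharing scheme: a random polynomial hides the secret in its constant term, each participant receives one point on the curve, and any three points recover the secret by interpolation. The prime, secret, and share counts here are arbitrary illustrations, not anything known about Snowden’s actual arrangement:

```python
import random

PRIME = 2**127 - 1  # a Mersenne prime large enough for a small secret

def make_shares(secret, threshold, n_shares):
    # Random polynomial of degree threshold-1 with constant term = secret.
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, n_shares + 1)]

def recover(shares):
    # Lagrange interpolation at x = 0 recovers the constant term (the secret).
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

shares = make_shares(123456789, threshold=3, n_shares=5)
```

Any three of the five shares reconstruct the secret; two shares reveal nothing about it, since infinitely many degree-two polynomials pass through two points.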

It’s obviously more complicated than that, and there are other methods Snowden could be using. How well he knows advanced cryptography also isn’t clear, but he’s receiving assistance from Wikileaks, which is skilled in it. Wikileaks has also used its own form of “insurance files” in the past.

Snowden remains in diplomatic limbo in the transit area of Sheremetyevo airport in Moscow. As to what his insurance file contains, beyond what has already been revealed by the Guardian, Washington Post, and South China Morning Post, we have yet to find out; Greenwald told the Daily Beast that just the non-encrypted material Snowden gave him consists of “thousands of documents.”


Dozens of Security Clearance Reports Falsified

Office of Personnel Management IG Investigates Top Contractor

By Eric Chabrow, June 24, 2013. Follow Eric @GovInfoSecurity

The federal government has identified dozens of cases of alleged falsification of reports submitted by investigators examining individuals being considered for security clearances.

In testimony at a Senate hearing June 20, the inspector general of the Office of Personnel Management said there are 20 cases in which investigators – federal employees and contractors – were either found guilty or were about to plead guilty to falsifying security clearance reports. The office also is investigating dozens more cases, he said.

The hearing was called in the wake of revelations that former National Security Agency systems administrator Edward Snowden, who worked as a contractor for Booz Allen Hamilton, leaked information about top secret intelligence-gathering programs.

In a criminal complaint dated June 14, federal authorities charged Snowden with theft of government property, unauthorized communication of national defense information and willful communication of classified communications intelligence information to an unauthorized person. News reports on June 24 said Snowden left Hong Kong over the weekend for Moscow and could be heading to Ecuador, where he would seek political asylum.

At that joint hearing of two subcommittees of the Homeland Security and Governmental Affairs Committee, OPM Inspector General Patrick McFarland confirmed the IG is conducting a criminal investigation of USIS, the largest contractor that conducts security-clearance investigations for the government.

Sen. Claire McCaskill, the Missouri Democrat who co-chaired the hearing, said the government knows Snowden received his clearance from USIS. “We know that their investigation encompasses the time he received his clearance,” said McCaskill, chair of the Subcommittee on Financial & Contracting Oversight. “We need answers.”

McFarland declined to provide details at the hearing on the IG’s investigation of USIS.

USIS Unaware of Criminal Probe

In a statement issued after the hearing, USIS said it was never informed by the inspector general that the company was under a criminal investigation, adding that it complied with a January 2012 IG subpoena for records and has cooperated fully with the government’s civil investigation.

USIS said in the statement: “Questions were raised as to whether USIS had conducted the initial background investigation, or a periodic reinvestigation, for the security clearance of Edward Snowden. USIS conducts thousands of background investigations annually for OPM and other government agencies. These investigations are confidential, and USIS does not comment on them.”

Susan Ruge, the IG’s associate counsel, told the Washington Post that USIS’s review of Snowden’s security clearance ended in April 2011 and that the IG began its probe of the firm later that year.

No Margin for Error

As of last year, about 3.5 million federal employees and 1.1 million government contractors held top security clearances. “Given the increasing number of folks with access to that information, we have a real problem on our hands if we can’t get this right,” said Sen. Jon Tester, D-Mont., who co-chaired the hearing with McCaskill. “There is no margin for error.”

Most individuals holding top security clearances don’t have the type of access to computer files that Snowden had. For example, military pilots who transport sensitive materials must gain security clearances.

Vetting individuals for top-secret security clearance is costly, topping $4,000 for each instance, according to the Government Accountability Office. A statement issued by McCaskill and Tester said OPM’s security clearance and background investigations cost the federal government about $1 billion in 2012; that annual cost is expected to rise to $1.2 billion by 2014. They said OPM spends 46 percent of its funds on the contractors who perform investigations, adding that about 75 percent of all field investigators are contractors, an estimated 4,600 out of 6,200 in total.

Falsifications Uncovered

McFarland, in his testimony, told the panel that the IG office had uncovered a number of falsifications of security-clearance investigation reports by federal employees and contractors who reported interviews that never occurred, recorded answers to questions that were never asked and documented records checks that were never conducted.

Since 2006, McFarland said, 18 background investigators and record searchers have been convicted of falsifying records, with a 19th investigator pleading guilty last month and a 20th investigator expected to plead guilty this month. The IG said his office is engaged in fabrication cases against nine other background investigators, with another 36 cases pending.

Of the 18 investigators convicted, 11 were federal employees and seven were contractors. Because of a lack of federal funding, McFarland said the IG’s priority is to investigate federal employees, so these 18 cases do not necessarily reflect the rate of falsification based on employers. “If that background investigation is not thorough, accurate and reliable, then all other decisions made related to the issuance of the security clearance are suspect,” he said.

The decision to grant a security clearance is made by government-employed adjudicators and not the background investigators.

Antiquated System Hindering Process

The way the federal government investigates security-clearance candidates leaves the process ripe for falsification by some investigators, says Evan Lesser, who has closely followed the government’s security-clearance apparatus since co-founding a clearance-focused employment site in 2002.

Lesser, the site’s managing director, characterizes investigators more as data collectors because they’re required to stay close to an OPM script they’re given in their questioning. He says investigators go into the field with pencil and paper rather than mobile computers or smartphones. The responses they receive are entered into an ancient DOS-based computer system, he says.

“If you look at some of the cases that have been brought against investigators who had falsified information, you often heard that they were under the gun and moving quickly in trying to get cases completed,” Lesser tells Information Security Media Group.

Background investigators are inundated with new guidance and regulations, he says. “Like any group of people, you’ve got your bad apples. They’re definitely a good group of people, no more or less patriotic than any other government employee or contractor. But their situation is somewhat unique by the fact that they do deal with antiquated technology, they do have high workloads and they’re pressured to get things done quickly.”

Age a Possible Factor

Lesser says the 30-year-old Snowden’s age may have played a significant role in his getting a top-secret clearance.

“The younger the clearance applicant, the less financial history they have, the less foreign travel and foreign connections they have,” Lesser says. “For older people, there’s a lot more data. For younger people, there’s just not a whole lot out there.

“It’s entirely possible that the USIS investigator who talked to Edward Snowden did his or her job 100 percent perfectly and by the book [but] they weren’t able to get a whole lot of information about and from Mr. Snowden.”


U.S. energy companies seen at risk from cyber attacks: CFR report

WASHINGTON | Wed Jun 26, 2013 12:11am EDT


(Reuters) – U.S. oil and natural gas operations are increasingly vulnerable to cyber attacks that can harm the competitiveness of energy companies or lead to costly outages at pipelines, refineries or drilling platforms, a report said on Wednesday.

The energy business, including oil and gas producers, was hit by more targeted malware attacks from April to September last year than any other industry, said the Council on Foreign Relations (CFR) report, citing data from a Houston-based security company, Alert Logic.

Cyber attacks on energy companies, which are increasing in frequency and sophistication, take two main forms, the CFR report said. The first kind, cyber espionage, is carried out by foreign intelligence and defense agencies, organized crime, or freelance hackers.

These parties covertly capture sensitive corporate data or communications with the goal of gathering commercial or national security intelligence. U.S. energy companies are subject to frequent and often successful attempts by competitors and foreign governments to access long-term strategic plans, bids tendered for new drilling acreage, talks with foreign officials and other trade secrets, the report said.

A campaign against U.S. energy companies by hackers based in China, called Night Dragon by McAfee, a leading security company that is part of Intel Corp, began in 2008 and lasted into 2011. The campaign stole gigabytes of material, including bidding data in advance of a lease auction. One unidentified energy company official believes his company lost a bid in a lease auction because of the attack, the CFR report said.

Many companies are either unaware of similar attacks or are afraid to disclose them for fear of upsetting investors, it said.

“That’s too bad because it makes it harder for Washington to help them and it also makes it harder for the public to be aware of what threats are out there,” said Blake Clayton, a fellow in energy and national security at CFR and a co-author of the report.

The second main cyber risk to energy companies is the disruption of critical businesses or physical operations through attacks on networks.

“This has a lower probability but potentially higher cost,” said Clayton.

The Stuxnet virus, said to have been created by the United States and Israel to attack Iran’s nuclear program, is an example of a campaign that ended up escaping from its intended target at the risk of causing harm to a U.S. company. Chevron Corp said late last year it had been infected by Stuxnet, but said without elaborating the virus was quickly controlled.

An attack dubbed Shamoon last year on Saudi Aramco, Riyadh’s state oil company, ultimately disabled some 30,000 computers. The company said the attack was aimed at stopping oil and gas output at the biggest OPEC crude exporter.

Oil production was apparently unaffected, but damage could have been more severe had the attack penetrated further into the network, the report said.

Hackers from a group called “Cutting Sword of Justice,” suspected to be insiders, claimed responsibility for the attack, which was believed to have been delivered using a USB drive.


(Reporting by Timothy Gardner; Editing by Matt Driskill)


Drones evolve into a new tool for ag


Laurie Bedord 05/10/2013 @ 7:21am

Loss of pilots over hostile territory fueled the birth of the drone in 1918. Nearly a century later, the word still carries a negative connotation.

“I think the first thing you imagine when you hear the word drone is this image of a predator operating in Afghanistan with an assault weapon or missile strapped underneath,” says Rory Paul, of Volt Aerial Robotics.

Yet, these devices, which Paul refers to as unmanned aerial systems (UAS), have the potential to be more than just spies in the sky. A passionate advocate for the use of this technology in agriculture, Paul has been working to change that image from foe to friend.

“UAS are something more intelligent than just a target drone that is shot down by antiaircraft artillery,” he notes.

Though he’s been working to introduce the concept to agriculture since 2006, it’s only recently that the tides have turned.

“It is an area that has been ignored, but there’s a change in the focus that has happened in the last few months,” says Paul. “The big defense players are slowly turning their eyes toward the potential these devices hold for ag applications.”

The greatest possibilities, he believes, are in aerial imagery and data acquisition. He also thinks size won’t matter when reaping the benefits. “These systems will be operated by small and large farmers to acquire data when they want it,” he says.

Agent for change             

Despite the potential value UAS bring to agriculture, there are still challenges to navigate. Their use falls under the jurisdiction of the Federal Aviation Administration (FAA), which is developing rules and a plan to open national airspace to commercial UAS by 2015; for now, commercial use is strictly prohibited.

As both a full-scale, instrument-rated private pilot and a model airplane enthusiast, Roger Brining has been flying model remote-controlled (RC) aircraft for recreational use under the rules and safety guidelines of the Academy of Model Aeronautics (AMA) since the 1970s.   

“Model aircraft have successfully and safely coexisted with full-scale planes for years,” he says. “The FAA has worked with the AMA for decades to ensure that safety is maintained. The catch is that all of these guidelines and safety programs specifically exclude any commercial use of RC aircraft. Once we get into a company charging a farmer to take aerial imagery, this becomes a commercial use.”

“There is no question there will be FAA regulations on what can and cannot be done with sUAS,” says Brining, who farms in Great Bend, Kansas. “This is a must for their safe coexistence with manned aircraft.”

Gathering information              

Universities also see this tool as an ally for many in the industry.

“UAS can reduce equipment wear and tear as well as labor and fuel costs to get highly precise data you wouldn’t necessarily be able to gather going through the field at ground level,” says Kevin Price, professor of agronomy and geography at Kansas State University.

For over a year, Price, along with Deon van der Merwe, an associate professor at Kansas State’s college of veterinary medicine, has been collaborating to explore how unmanned technology can play a role in ag missions.

They have uncovered a wide range of uses with the help of two units: a RiteWing Zephyr II and a DJI S800 Spreading Wings hexacopter.

For example, they’re working with professors who do crop breeding with literally thousands of plant crosses. Seeds are planted in patches, and larger fields are filled with thousands of patches.

“Professors walk the field looking at each patch and its phenology, which is the way the plant looks – its height and shape,” Price says. This information is then used to estimate yields.

“Every patch is harvested, and seeds of the different varieties are weighed,” he continues. “It can take up to 1,500 hours of labor to get one phenotype, which is a bottleneck for moving the genetic breeding program along.”

To speed up the process, he looked to a spectroradiometer, which measures the energy coming off the plants in thousands of wavelengths.

“With two wavelengths – red and near-infrared – we can explain over 80% of the variability in yields on these thousands of phenotypes. If we can take those two wavelengths and build them into a camera, we can fly a field, take an image, and project the yield on every plot in minutes. We can ignore the bad plots and not have to collect that data. It’s going to save millions of dollars in research time.”
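The two-wavelength idea Price describes can be sketched as a vegetation index plus a simple regression. The reflectance readings and yields below are invented for illustration; they are not Kansas State data:

```python
def ndvi(red, nir):
    # Normalized Difference Vegetation Index, in [-1, 1]; healthy
    # vegetation reflects strongly in near-infrared and little in red.
    return (nir - red) / (nir + red)

# Hypothetical per-plot reflectance readings and measured yields (kg/plot).
plots = [
    {"red": 0.08, "nir": 0.52, "yield": 6.1},
    {"red": 0.12, "nir": 0.44, "yield": 4.9},
    {"red": 0.20, "nir": 0.35, "yield": 3.2},
    {"red": 0.25, "nir": 0.30, "yield": 2.4},
]

xs = [ndvi(p["red"], p["nir"]) for p in plots]
ys = [p["yield"] for p in plots]

# Ordinary least-squares fit: yield ~ a * NDVI + b. Once calibrated on
# harvested plots, the fit projects yields for plots only seen from the air.
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
b = my - a * mx

predicted = [a * x + b for x in xs]
```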

Beyond the crop, he can see UAS counting cattle, checking for water in the pond, or determining if blue-green algae that can kill livestock is present.

Other jobs that once took hours, if not days, are reduced to minutes. “We mapped an area of about 640 acres in 18 minutes,” he says.

The camera system they have, which is a Canon s100 converted to color infrared, takes a picture every four seconds.

“This provides us a lot of coverage, and the more coverage, the better,” he says.

Agisoft, a Russian-made photogrammetry program, splices together the hundreds of images taken to create a mosaic. All of the fields over a 640-acre area are then pulled together into one large image at 1-inch resolution.

“I’m looking at individual plant leaves now,” Price says. “Images are going to get even better in the near future with the new camera systems coming out.”

Rugged business

Finding a device rugged enough to take some abuse and to not cause people to lose valuable equipment like cameras is another area his team is looking at.

The RiteWing Zephyr II is made of expanded polypropylene, which is high-grade engineering foam that has an elastic nature, allowing it to regain its shape.

“It is rugged and flies stable,” Price notes. “It also has fewer parts, which means less breakage.”

On a typical aircraft, the rudder and movable flaps called ailerons on the rear edge of each wing make a plane turn to the left or right. Moving flaps called elevators on the tail make the nose of the plane go up or down.

“The RiteWing Zephyr II has elevons, which combine the functions of the elevator and the aileron,” Price explains.
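The control mixing Price describes can be sketched in a few lines: each elevon receives the sum or difference of the pitch and roll commands. The sign convention below is one common choice, not a RiteWing specification:

```python
def elevon_mix(elevator, aileron):
    """Combine pitch (elevator) and roll (aileron) commands into left and
    right elevon deflections. Inputs and outputs are normalized to [-1, 1];
    which surface gets the sum vs. the difference is a convention."""
    left = max(-1.0, min(1.0, elevator + aileron))
    right = max(-1.0, min(1.0, elevator - aileron))
    return left, right

# Pure pitch-up: both elevons deflect together.
pitch = elevon_mix(0.5, 0.0)   # (0.5, 0.5)
# Pure roll: elevons deflect in opposite directions.
roll = elevon_mix(0.0, 0.3)    # (0.3, -0.3)
```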

Besides fewer moving parts, other advantages include less mass, less cost, less drag, and faster control response.

To date, Price says they have spent about $25,000 on their equipment. However, through trial and error, he believes they could build a unit for less than $2,000.

“We tell farmers to expect to pay around $5,000, because they have to buy a radio, which is about $400,” he notes.

Taking flight

As the new owner of a RiteWing Zephyr XL (similar to the Zephyr II but with an 81-inch wing span), Brining has tried several forms of aerial imagery in the past.  

“I have used custom-flown, traditional aerial imagery,” he says. “The resolution was poor. The costs were extremely high (around $3.50 per acre per flight), and they didn’t get the flights accomplished in a timely enough fashion to meet my objectives.”

What intrigued him about this technology is its incredible flexibility, speed, and low cost for flights combined with a very high-resolution final product.

“I think the new system will let me make tactical decisions, which have all been done strictly based on ground scouting and sampling,” Brining adds.

He estimates the entire system will cost $5,000 to $7,000.

In the first year, he wants to get the system operational, learn to process images, and use the flights as a tool so his agronomists can make better use of their time by scouting the right parts of the field.

“I think it will also be extremely helpful in locating leaks in my subsurface drip-irrigation systems,” he adds.

Crunching the numbers

According to The Economic Impact of Unmanned Aircraft Systems Integration in the United States, precision agriculture is one of the markets with the largest potential for this technology. The document, which was released by the Association for Unmanned Vehicle Systems International (AUVSI), estimates that in just two years the economic and employment impact of ag spending in all 50 states will total nearly $2.1 billion, creating more than 21,500 jobs.

“UAS are tools to apply precision agriculture techniques,” says Chris Mailey, vice president of Knowledge Resources for AUVSI. “They are a key piece in the future of precision agriculture.”

However, he cautions, realizing those gains will be tied to certain factors.

“State incentives, like a tax incentive or a test site, may bring more jobs to one state vs. another,” he notes. “Legislation – positively and negatively – may affect the adoption of these devices in the short term. I believe that once any technology – no matter what it is – is proven out, it is much easier to get larger adoption.”

Every year that sales of the technology are delayed, the report says the U.S. stands to lose $10 billion.

Safety first

With more than 18,000 airports, 750 air traffic control facilities, and 4,500 air navigation facilities, national airspace is a huge system to manage. There are more than 238,000 general aviation aircraft in the system at any time. Maintaining safety in the sky is the FAA’s top mission.  

Mailey says it should be the mission of the ag industry, as well.

“Safety is going to be paramount,” he says. “UAS are just like a tractor or a truck. If you take your eyes off of it or do something not very smart, it can be a very, very dangerous device.”

The hybrid Samsung Ativ Q laptop has Windows 8 and Android Jelly Bean (hands-on)


CNET Editors’ Take

June 20, 2013 12:00 PM PDT

LONDON — It’s a hybrid device, transforming from a Windows 8 slate into a full-fledged laptop in one quick motion. That’s not all, though — at the tap of an icon it’ll boot into Android Jelly Bean, letting you swipe around all of your favorite apps from the Google Play store.

It’s packing the latest Intel Haswell Core i5 processor, with a 13-inch display boasting a whopping 3,200×1,800-pixel resolution. Samsung has yet to announce pricing or availability, but make sure to keep checking CNET for all the latest news.

The Ativ Q’s standard form is a rather chunky 13-inch slate. Lift the display up at the back, though, and it reveals a keyboard underneath. The screen folds up and locks into place behind the keys, allowing you to type and swipe in much the same way you would on any other touch-screen laptop.

It’s the same converting technique we saw on Sony’s Vaio Duo, but Samsung has taken it a step further. Rather than converting just between laptop and slate, the screen can lie flat, lifted above and parallel to the keyboard. I’m not entirely sure why you’d want to keep it in that position — rather than have it lying flat in tablet mode — but I guess it’s useful to have the option.

More helpful though is the option to flip the screen all the way over, turning it, essentially, into a tablet on a stand. The screen automatically rotates as you turn it over. Samsung reckons this mode is best to use for presentations in meetings or just for watching video when you don’t want the keyboard exposed.

Samsung also includes its S Pen stylus for handwritten notes or doodling when you’re bored. It’s the same smaller variety you’ll get on the Galaxy Note 10.1, meaning it’s not particularly comfortable to hold, but does at least slot neatly into the base.

The converting motion seemed fairly smooth in my brief hands-on time, although I worry that over time the small hinges might not be able to put up with much punishment. That’s particularly important, as the rear stand is actually where the CPU is housed. While that’s an interesting space-saving design, it potentially puts the delicate components at more risk from knocks and bumps. We’ll give it a full stress test when we get it in for review.

Samsung’s Ativ-apalooza

It’s a 13-inch machine with physical proportions roughly the same as any other 13-inch ultrabook’s. It’s 14 millimeters thick, which is satisfyingly skinny, it’s small enough to slide easily into most backpacks, and at around 1.3kg (2.8 pounds), you won’t struggle to carry it around for long periods either.

The keyboard’s keys are quite small, so it might not suit those of you with hands the size of continents. There’s no touch pad like you’d find on a normal laptop either. You’ll have to make do with the little trackpoint in the middle of the keys, or just use the touch screen.

The 13.3-inch display boasts an astonishing 3,200×1,800-pixel resolution, which is the highest pixel count we’ve seen on a 13-inch machine. It gives it a pixel density of 275 pixels per inch — better than the 9.7-inch Retina iPad’s 264ppi, and a massive amount for such a big display.
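The pixel-density figure follows from the screen geometry: ppi is the diagonal resolution in pixels divided by the diagonal size in inches. A quick check lands at roughly 276 ppi for the Ativ Q, within a pixel of the quoted figure (the small gap is likely rounding of the diagonal):

```python
import math

def ppi(width_px, height_px, diagonal_inches):
    # Pixels per inch: diagonal pixel count over diagonal size.
    return math.hypot(width_px, height_px) / diagonal_inches

ativ_q = ppi(3200, 1800, 13.3)      # roughly 276
retina_ipad = ppi(2048, 1536, 9.7)  # roughly 264
```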

Unsurprisingly then, everything looked absolutely pin-sharp. It’s bright, too, and colors seemed rich and vivid. I wasn’t able to spend a lot of time with the screen, so I’ll leave my final judgment for the proper review, but suffice to say I’m excited to see my own high-resolution photos on it.

Windows 8 and Android Jelly Bean
Like James Bond’s gadget guru, Samsung’s own Q has a trick up its sleeve. It’s first and foremost a Windows 8 laptop, but at the tap of a tile, it can boot into Android Jelly Bean. At first glance this might seem a bit of a gimmick, but when you take a moment to think about it, it actually makes a lot of sense.

If you’re struggling to decide whether to splash your cash on a Windows laptop, a Windows tablet, or an Android slate, Samsung might have just the solution in the Ativ Q.

You can work in Windows 8 as on a regular laptop, taking advantage of the familiar Windows layout and Office tools. When the busywork is done, switch over to Android to play touch-screen games and casually swipe around any of the hundreds of thousands of apps from the Google Play store. Even if that doesn’t appeal, the sheer number of additional Android apps goes a fair distance to make up for the lack of good titles in the Windows Marketplace.

The Android portion runs on the same Intel Core i5 processor as the Windows portion, so performance should be satisfyingly zippy. The chip is the latest Haswell silicon from Intel, which promises better battery life than its predecessor — again, I’ll test this properly in the review.

Switching between the two operating systems is a simple task of hitting a tile on the Windows 8 desktop. The switch takes a couple of seconds at most and thankfully doesn’t require a restart. Going back to Windows is simply a matter of hitting the Windows home key beneath the display. The operating systems also share file folders, meaning anything you save in your gallery in Android will be available in your pictures folder when you return to Windows.

Interestingly, Samsung hasn’t tried to apply any of its usual software additions over the top of Android. Instead, you get the pure vanilla Jelly Bean experience. It’s the full-fat version of Windows onboard, too, rather than the hobbled Windows RT version, which was specifically designed for tablets. That means you’re able to install any normal desktop software.

You can’t blame the Ativ Q if it struggles to understand itself. It’s at once a Windows tablet, an Android tablet, and a Windows laptop. Samsung evidently hopes this combination will be the perfect solution for people who want a device to suit both work and play.

Whether that’s really the case or if it’s actually a confusing hash of products that should remain separate entities remains to be seen. I’m certainly looking forward to finding out which it is, though.


B-1 school sees pilots losing currency during sequestration

by Airman 1st Class Peter Thompson

7th Bomb Wing Public Affairs


6/24/2013 – DYESS AIR FORCE BASE, Texas — A unit responsible for producing elite B-1 Lancer weapons officers has had its budget cut nearly in half due to sequestration.

The 77th Weapons School at Dyess Air Force Base, Texas, took this as a challenge and opportunity to maximize efficiency.

“We have taken notice of the mentality that we need to have at this time,” said Lt. Col. Brian Vlaun, 77th WPS director of operations. “Efficiency is closely linked to a call to find innovative solutions, which applies to us by maximizing the training we have in our syllabus.”

Without funding to supply flying hours for the squadron, instructors at the 77th WPS won’t be able to complete their yearly proficiency requirements, which allow them to fly as instructors, aircraft commanders or mission leads.

“Based on the last time each of our instructors flew, their currencies will last them through the middle of July,” said Lt. Col. Jonathan Creer, 77th WPS commander. “Bottom line is we won’t be able to fly.”

The weapons squadron and the U.S. Air Force Weapons School at Nellis Air Force Base, Nev., were forced to cancel a full six-month course, which normally produces a batch of officers trained in applying multiple weapons platforms and strategies to the battlefield.

When the first class of 2013 was cut short in May, the squadron took a day to evaluate their situation and plan steps they would take to maximize efficiency with their almost non-existent budget.

The weapons school has four main objectives to work toward until they regain funding. Their first and primary objective is to update their syllabus.


Normally, the 77th WPS takes the short amount of time between classes to assess their syllabus and make adjustments as needed. This revision process is now the squadron’s main priority.

The first class of fiscal year 2014 will learn from the enhanced legacy syllabus. The second class of the year will be the first at the 77th WPS to train using the B-1 Bomber’s newest upgrade, Sustainment-Block 16, which is the most advanced hardware and software enhancement to date for the bomber.

The squadron’s second objective is to upgrade their tactics, techniques and procedures manuals.

“We will work with the 337th Test and Evaluations Squadron to develop new manuals that will be used to employ the upgraded aircraft,” Creer said.

Their third goal is to simultaneously provide cadre and instructors to support combat operations, and operational testing and training across the B-1 fleet. Instructors have transitioned their efforts from teaching classes to providing their expertise to 9th Bomb Squadron operators as they prepare for future deployments.

“We have the expertise available to help other squadrons with their academics and training,” Vlaun said. “It is our duty to provide for the B-1 community.”

To meet their fourth and final objective, 77th WPS instructors are prepared to continue producing the Air Force’s best weapons officers, and maintain currencies once funding is available.

Looking forward, the 77th WPS’s intent is to create opportunities to grow and learn from their present situation. Their outlook is that even though they have a minimal budget, they are still operational.

“I tell my guys all the time, this is not a throttle back, it’s a change in roles and duties,” Creer said. “We challenge them to do better, exceed their expectations, think critically, receive and give criticism, communicate more effectively and become problem solvers.”


Air Show Turnout is among lowest in history

Lack of military planes, not crash, seen as reason.

Dayton Daily News

By Barrie Barber

June 25, 2013

DAYTON — Attendance at the Vectren Dayton Air Show dropped sharply over the weekend, bringing just 23,000 people through the gates in the lowest turnout in the show’s recent history.

Air Show General Manager Brenda Kerfoot could not say whether the number of spectators was the lowest in the 39 years the show has been in Dayton, but turnout was less than half of last year and less than a third of what it was in 2009 and 2010.

Last year a withering, record-setting heat wave shrank turnout to 47,000 as the Navy’s Blue Angels performed. This year weather wasn’t a big factor, but there was no jet team because of federal budget cuts, and Saturday’s tragic crash may have been a factor in Sunday’s smaller-than-usual crowds.

The show, which often draws 70,000 people or more in a weekend, turned tragic Saturday when stunt performer Jane Wicker, 44, and pilot Charlie Schwenker, 64, were killed after the 1941 Stearman biplane they were in crashed while Wicker was performing her signature aerial wing-walking act near show center at Dayton International Airport.

Kerfoot attributed the sharp attendance decline to the scrubbed appearance of the Air Force Thunderbirds and no other active military aircraft on the grounds or in the sky because of federal budget cuts known as sequestration. The air show filled the lineup with well-known civilian acrobatic aerial acts.

“We really think the low numbers are an effect of sequestration (and) that it had nothing to do with the crash,” she said.

The National Transportation Safety Board has launched an investigation into the biplane accident that could take six months to a year. A preliminary report on the crash is expected this week.

John Cudahy, president of the International Council of Air Shows based in Leesburg, Va., said air shows nationwide have had fewer spectators this season because of the no-show of popular military jet teams and solo aircraft demonstrations.

“I think that the lack of military assets at air shows around the country have been a drain on attendance, not just in Dayton but at a number of shows this year,” he said.

More than 60 air shows with an estimated $400 million economic impact have been cancelled because of sequestration, according to Cudahy. The Dayton/Montgomery County Convention & Visitors Bureau has figured the Dayton Air Show has a $3.5 million economic impact with 70,000 visitors.

“The air show is very iconic for our region,” said Jacquelyn Powell, president and CEO of the Convention & Visitors Bureau. “Many people when they think of Dayton, they think of the Dayton Air Show.”

Like Kerfoot, Cudahy does not think the fiery crash before thousands of people contributed to the fall in attendance.

“Historically, that has not been the case,” he said of air show crashes impacting attendance. “Crashes that have happened in the past, when there was still a day left in the air show,” were not a big factor in attendance.

Kerfoot said Dayton Air Show leaders haven’t reached a decision on the size or scope of the 2014 show with the Blue Angels as the headline act.

The Pentagon has not decided whether military jet teams and planes will fly at air shows next year, but the Blue Angels have booked appearances in 35 places in 2014.

“I think that’s a No. 1 factor we need to determine before we determine what kind of show to do,” Kerfoot said. “I think the (Dayton Air Show) Board (of Trustees) and the community support for an air show is very strong. We have a fund balance to weather bad years, but obviously you can’t keep doing that.”

She did not release how much money the air show has on hand to cover potential losses.

Navy spokesman Lt. John Supple said the Blue Angels are eager to fly in 2014. “We’re hoping the show stays on,” he said. “The Blue Angels are itching to get out there and perform.”

Cudahy said his organization has pushed Congress to let military planes return to air shows. “I think the military recognizes they cannot suspend their interaction with the American public indefinitely,” he said.

Air show officials do not know yet if last weekend’s show made or lost money while accounts continue to be tallied, Kerfoot said. The show spent about $1 million this year, or around a third less than normally budgeted in anticipation of fewer spectators.

Vectren, an Evansville, Ind.-based energy company that has sponsored the show since 2001, and Cincinnati headquartered supermarket giant Kroger, signed three-year deals last year to sponsor the air show through 2015.

“We wanted to be a part of what was at that time a premier event for the Miami Valley and we believe that’s still the case,” Vectren spokeswoman Chase Kelley said Monday. The air show crash and jet team no-shows have not affected the sponsorship, she said.

“Those are the exceptions rather than the rule and we still want to be a partner,” Kelley said.



The U.S. Military Aircraft That Flew in Paris


by Brendan McGarry on June 24, 2013


PARIS — Aviation enthusiasts were quick to spot the few American-made military aircraft that did fly at this year’s Paris Air Show, 17-23 June.

No fixed-wing plane currently operated by the U.S. military took to the skies. Drone-maker General Atomics brought a new Predator B, better known by its Air Force designation, MQ-9 Reaper. But the unmanned vehicle remained grounded.

Two other U.S. planes flew, including the World War II-era P-38 Lightning fighter and the C-121 Super Constellation transporter, both made by the predecessor of Lockheed Martin Corp. But those types of propeller-driven craft completed their final military missions decades ago.

The only aircraft in U.S. service today that flew at the event was an export version of the UH-60 Black Hawk helicopter, made by Sikorsky Aircraft Corp., part of Hartford, Conn.-based United Technologies Corp.

The U.S. fighter fleet, including the F-15, F-16, F-18 and F-35, was entirely absent.

The U.S. drastically scaled back its presence at the world’s biggest international air show, as the Defense Department froze spending on such events amid federal budget cuts. The move allowed European arms makers, especially Russia, to take center stage.

Still, Pentagon officials and U.S. company representatives attended the event to capitalize on upcoming opportunities in locations such as Eastern Europe, the Middle East and the Asia-Pacific region.

United Technologies Corp.’s Sikorsky brought the S-70i to the show to market the chopper to potential international customers, especially Poland. The country next year plans to pick a firm to build as many as 70 combat support helicopters in a potential $3 billion deal that’s among the biggest opportunities on the international rotorcraft market.

United Technologies Corp.’s Sikorsky is competing for the order against AgustaWestland, part of Rome-based Finmeccanica SpA, and Eurocopter, part of Leiden, Netherlands-based European Aeronautic Defence & Space Co.

General Atomics plans to sell an unarmed version of its Predator unmanned system to the United Arab Emirates and other countries in the Middle East as part of a plan to boost international sales, a vice president said.

The drone, called the Predator XP, is equipped with radar and sensors to offer wide-area surveillance but not weapon systems such as laser-guided bombs or air-to-ground missiles, according to Christopher Ames, director of international strategic development for General Atomics Aeronautical Systems Inc., based near San Diego.


The company made an effort to display a new Predator B at the show, Ames said. “I’m told we’re one of the only U.S. companies displaying an actual aircraft,” he said. “We worked hard to make it happen.”

The classic planes were also brought to the show by the private sector. The P-38 is actually the restored White Lightnin’ aircraft owned by the Austrian company, Red Bull GmbH, which makes the popular energy drink, Red Bull. The C-121 “Connie” is owned by the luxury Swiss watch maker, Breitling SA.



China’s Fighters, Drone Look Like U.S. Aircraft


by Brendan McGarry on June 20, 2013


PARIS — China’s models of military planes at the Paris Air Show bear resemblance to U.S. aircraft, drawing attention to the rising concern in the Defense Department that the country is using cyber espionage to obtain sensitive defense technology.

The state-run Aviation Industry Corporation of China had a large exhibit of military and civilian models of aircraft at the show, held outside Paris at the historic Le Bourget airfield.

The display included three fighters and a drone: a single-seat version of the FC-1, a single-engine fighter built for the Pakistani air force and designated JF-17; a dual-seat variant of the FC-1 in development; the dual-seat, twin-engine L-15 trainer; and an unmanned system called Wing Loong.

The fighters looked like the F-16 made by Lockheed Martin Corp. and the drone bore resemblance to the MQ-1 Predator made by General Atomics Aeronautical Systems Inc., without the inverted tail. The Chinese drone is designed as a low-altitude craft that can fly up to 16,500 feet and loiter for 20 hours.

A spokesman from the Beijing-based corporation was quick to note that the FC-1 is “a lot cheaper than the F-16,” though he declined to provide a figure. The man gave a brief overview of the systems to Military.com but declined to be named, citing corporation policy.

Notably missing from the exhibit was any display of the J-20, China’s classified stealth-fighter program.

During talks this month at an estate in Rancho Mirage, Calif., President Barack Obama reportedly warned the new Chinese President Xi Jinping that cyber attacks against the U.S. threaten the two countries’ strategic relationship. Xi insisted China is also the victim of computer hacking.

Obama faced pressure to raise the issue after the recent leak of a classified section of a Defense Department report showed that designs for the most advanced U.S. weapons have been compromised by suspected Chinese hackers. The list of weaponry includes the F-35 Joint Strike Fighter, the Littoral Combat Ship, and the Aegis Ballistic Missile Defense System, among others.

The Pentagon in its latest annual assessment of China’s armed forces for the first time blamed China directly for targeting its computer networks. The attacks were focused on extracting information, including sensitive defense technology.

“In 2012, numerous computer systems around the world, including those owned by the U.S. government, continued to be targeted for intrusions, some of which appear to be attributable directly to the Chinese government and military,” it states. “The accesses and skills required for these intrusions are similar to those necessary to conduct computer network attacks.”

That document also concluded that the People’s Liberation Army, or PLA, considers the strategy of “information dominance” a critical form of defense against countries that it views as “information dependent,” such as the U.S.

China called the accusations “groundless” and “not in line with the efforts made by both sides to strengthen mutual trust and cooperation,” according to a May 9 article published on the state-run website, “People’s Daily Online.” The country is a “victim itself of cyberattacks,” it states.

A Chinese espionage group since 2006 has stolen hundreds of terabytes of information from at least 141 companies across 20 major industries, including aerospace and defense, according to a February report from Mandiant, a closely held company based in Alexandria, Va., which sells information-security services.


Where Has All The Nuke Waste Gone?


 James Conca, Contributor

6/24/2013 @ 11:30AM

A funny thing happened on the way to our high-level nuclear waste dump. Most of America’s high-level nuclear waste is no longer high-level. This is a good thing. Unfortunately, on paper it’s still called high level waste. On legal paper, no less. The bar code says it’s high-level waste so that’s what it is, right?

Wrong. Various processes have changed the nature of this waste over the last 50 years. However, various human laws only consider where it came from and what it was a long time ago, not what it is now.

What happens when human law collides with natural law?

First, it always ends up costing us a lot of money. Second, we always take too long to change the human law, which is why it ends up costing us a lot of money.

So, the bar code says it’s high-level waste. What does that mean?

There are four general categories of nuclear waste in the United States (figure below): commercial spent nuclear fuel (SNF), high-level nuclear waste (HLW) from making weapons, transuranic waste (TRU) also from making weapons, and low-level radioactive waste (LLW) from many things like the mining, medical and energy industries. A minor amount of other radioactive waste is sprinkled among these categories.

Nuclear and radioactive waste comes in four different flavors, defined in the text, that are treated and disposed of in different ways for very different costs. However, most of the high-level waste (HLW) is no longer high-level, a technical distinction, but one that is worth $200 billion.

SNF is the hottest waste, primarily from two isotopes, Cs-137 and Sr-90, both with approximately 30-year half-lives, making the waste high-level for less than 200 years. Similarly for HLW – it’s the Cs-137 and Sr-90 that make it hot, although not so much as SNF. LLW is not very hot at all. TRU waste spans the gamut from low-level to high-level, and is primarily determined by the amount of plutonium, while the level of hotness is again determined by the amount of Cs-137 and Sr-90.
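The timescales here follow directly from simple half-life arithmetic. As a rough sketch (the 30-year half-life figure is from the text; the function name is illustrative, not from any source), the fraction of Cs-137 or Sr-90 remaining after a given time is:

```python
def fraction_remaining(t_years: float, half_life_years: float = 30.0) -> float:
    """Fraction of a radionuclide's activity left after t_years,
    given exponential decay with the stated half-life."""
    return 0.5 ** (t_years / half_life_years)

# Cs-137 and Sr-90 both have half-lives of roughly 30 years.
# After two half-lives (~60 years), a quarter of the activity remains:
print(fraction_remaining(60))     # 0.25
# After ~200 years (about 6.7 half-lives), less than 1% is left,
# which is why the waste stops being "high-level" on that timescale:
print(fraction_remaining(200))    # roughly 0.0098
```

This is why waste whose heat comes mainly from those two isotopes stays high-level for less than 200 years: by then the driving activity has decayed by a factor of about a hundred.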

The details get a little complicated, but HLW refers to waste with high levels of radioactivity that was generated from reprocessing nuclear fuel from weapons reactors to make atomic, and then nuclear, weapons. HLW is defense waste, and much of it is nasty, gooey, watery sludge with the consistency of peanut butter or week-old pudding.  Tricky to handle.

This is very different from commercial spent nuclear fuel (SNF) that has even higher levels of radioactivity but comes from commercial power reactors and is dry and solid. Easy to handle.

TRU waste is a combination of debris, cements and sludge, a real mish-mash of materials that just has enough plutonium in it to call it TRU, but not enough Cs-137 and Sr-90 in it to call it HLW. Not so tricky to handle.

TRU and HLW tank waste are both generated in the reprocessing of spent fuel from a weapons reactor (not a commercial power reactor), but are differentiated by when in that process they were generated (figure below; definitely click on this one to see the detail). HLW is generated in the early steps of the process that remove the fission products, i.e., the pieces left over when the uranium or plutonium nucleus splits, or fissions, particularly the Cs-137 and Sr-90. TRU is generated in the following steps that separate and concentrate the Pu for weapons, and has much less Cs-137 and Sr-90.

The HLW and the TRU sit at various Department of Energy sites around the country in liquid, sludge and solid forms, but most of the HLW is in those famous old humongous tanks up at the Hanford site that hold 57 million gallons of it, recently made more famous by a few leaks. Not any threat to human health and the environment, but enough to get everyone upset (The First Amendment Bombs Nuclear Energy By Accident).

The strange thing is, we have long ago removed much of the Cs-137 and Sr-90 from these HLW waste tanks, and the rest has been through a couple of half-lives, so there’s not enough Cs-137 and Sr-90 to make it HLW anymore, at least in reality, scientifically.  Instead, most of it is now TRU waste (see figures). But it’s still legally HLW.

High-Level Waste (HLW) and transuranic waste (TRU) in tanks at the Hanford site were originally defined by which step they came from in the reprocessing of weapons fuel to make nuclear weapons. The steps that removed most of the Cs-137 and Sr-90 produced HLW, while the steps that removed most of the Pu produced TRU. However, we have since removed so much Cs-137 and Sr-90 from the HLW, and the rest has decayed away so much, that the HLW tanks now are TRU. Source: the Department of Energy.

What we have here is a classic bureaucratic knot of conflicting definitions that we need to fix in order to stop spending money just treading water.

The President’s Blue Ribbon Commission, which included the new Secretary of Energy Dr. Ernest Moniz, already laid out a plan to correct this mess in a series of well-crafted recommendations (Chris Helman, Forbes; BRC Report to President Obama). First, put SNF in interim storage for decades. This allows it to be separated from the defense HLW. Then the other recommendations can lead us down a reasonable path.

But we have to call the waste what it is. I know that requires a hefty bureaucratic lift since many groups are ideologically wedded to the past. But, come on, please don’t say we can’t even define anything anymore by what it actually is.

The multitude of laws and orders developed over the last 50 years has given us all the language and solutions we need, from the Ronald W. Reagan National Defense Authorization Act to the Nuclear Waste Policy Act, from wording by the House Armed Services Committee to the Nuclear Regulatory Commission’s 10 CFR Part 61, to various DOE Orders. We just have to revisit them and merge them into a coherent set of definitions.

Changing laws and agreements is very difficult these days, but is still a lot easier and cheaper than ignoring reality and treating HLW that is no longer high-level. The cost of physically and chemically treating TRU as though it’s HLW is very expensive and unnecessary. The difference is about $200 billion, a lot of money to spend on a legal technicality.

This is more than an academic exercise since we’re, you know, broke.


What’s the matter with Microsoft?

Experts search for meaning in three Microsoft U-turns

Gregg Keizer

June 25, 2013 (Computerworld)


Microsoft’s had a tough year, and it’s not even half over.

The software giant has retreated from flubs in licensing, the design of its flagship Windows operating system and most recently, innovations it wanted to bake into its new game console, the Xbox One.

On Wednesday, in fact, the highlight of the BUILD developers conference opening keynote will certainly be Windows 8.1, an upgrade that Microsoft casts as a customer feedback-driven refresh, but that some outsiders see as a reversal, even a repudiation, of its first-pass design.

What’s going on? Is the company’s decision-making suddenly fundamentally flawed? As the PC industry goes through its largest-ever slump, is it so desperate that it’s trying to milk revenue wherever it can by forcing change — even when it knows customers will rebel? Has it taken to hauling up the white flag at the first sign of resistance rather than toughing it out, as the old Microsoft might have?

Companies make mistakes all the time, sometimes crippling ones that drag them under. But if the organization is large enough, robust enough, it survives, learns. Ford weathered the Edsel, Coca-Cola New Coke, Netflix its Qwikster, Apple the 1985 ousting of Steve Jobs, 2010’s Antennagate and last year’s Maps fiasco. But the pace of Microsoft’s missteps and the resulting turnarounds — three in the span of four months — is unusual.


In March, Microsoft retreated from a sweeping change in its licensing for retail copies of Office 2013, giving way after customers complained that they’d be labeled lawbreakers for trying to move the software from one machine to another. In late May, Microsoft revealed some of the changes slated in Windows 8.1, including the restoration of something very close to the iconic Start button. And last week, Microsoft quickly backed off Xbox One plans that would have nixed sales of used games and required the console to “phone home” daily to Redmond’s servers.


Too focused on money?

Industry analysts and other experts had all kinds of answers for the questions raised by Microsoft’s miscalculations. Some saw a company blinded by a desire to squeeze the last dollar out of customers, or one that thought aping Apple would be a winning strategy. Others faulted it for not anticipating what, in hindsight at least, was guaranteed blowback.

“People don’t like revocations of the physical rights they assign to property, even when we’re talking about software licenses, not software ownership,” said Wes Miller, an analyst with Directions on Microsoft, of the Office 2013 and Xbox One used-game errors. “We have an essence of tangibility, a feeling of ownership, when we buy a floppy disk or buy a CD, or even download a file.”

But by restricting that ownership, and doing that suddenly, Microsoft stepped into a morass when it told customers they couldn’t move Office to a different PC or said that Xbox games could not be resold. It violated that feeling of ownership, which customers interpreted as stealing something rightfully theirs.

“People don’t adjust well to change when that change means less rights and freedom than before,” Miller said, using words that could have been spoken by Boston radicals like Samuel Adams in 1774.

Others echoed Miller on the difficulty of changing behavior and Microsoft’s apparent belief that customers would willingly accept change, as evidenced in statements by Microsoft executives that Windows 8 users would quickly grow comfortable with its far-reaching alterations.


Users don’t like change

“It’s very hard to make monumental business changes in this day and age,” said Peter LaMotte, an analyst with Levick, a Washington, D.C.-based strategic communications consultancy. “People are used to certain functionality, certain interfaces, and it’s very difficult to take those things away from them.”

In other words, Microsoft either didn’t view those rights, implied or not, in the same way as did customers — a failure of one kind — or ignored evidence to the contrary, an error of quite another dimension.

Philip Morton, a senior practitioner in gaming at Foolproof, a U.K. user interface (UI) and user experience (UX) design consultancy, wasn’t sure which it was — though he leaned toward the latter — but he was certain Microsoft screwed up on the Xbox One.


People will accept change, Morton said, if it’s clear there are benefits to doing so that outweigh the burden of the change itself. While that may read as obvious, Microsoft either forgot it or, knowing it, plunged ahead anyway.

“Microsoft had a carrot and a stick, but it was all stick and no carrot,” Morton said of the Xbox One plans, which were pitched as a way to simplify sharing games within a family or group of friends, and to make a customer’s game library available from any Xbox console. “Xbox has been successful despite Microsoft, not because of it,” he said. “[With the Xbox One] there was too much Microsoft in the Xbox. Too much of the traditional Microsoft had a say in that decision. They thought more of their business requirements and what the business wants than what the customers want. They didn’t communicate any benefit to the changes, and treated customers like criminals.”

Forcing changes onto customers — not, for instance, giving them an option, as Microsoft has by maintaining traditional “perpetual” licenses even as it pushes Office 365 subscriptions — was the final straw, said Morton.


Blindsided by backlashes

Nor did Microsoft seem to anticipate the backlashes to any of the three changes — another failure, said experts.

“Consumers are more vocal now,” said Miller, citing social media’s amplification of complaints. “In each case, Microsoft made a bold statement, but then had to rescind it after a Twitter outcry.”

LaMotte of Levick concurred. “If you release something with new functionality, consumers are prone to give feedback fast and furious. That’s the benefit and the downside of social media.”

That was especially true in the Xbox One affair because of the unique nature of the gaming community. Gamers are passionate about what they want, identify personally with the software, much more so than, say, users of Windows or Office. And they’re already organized, so to speak, because of the way many network to play online.

They’re also a different demographic group, LaMotte argued, one that likes to complain.

“Gamers love to share their opinion and share their disgust,” LaMotte said. “But Millennials are especially vocal about what they don’t like. It’s almost as if the movement picked up steam just to make Microsoft reverse the decision, no matter what an individual thought. People who grew up in the 60s or 70s, 80s and 90s, they had things to rebel against. Millennials don’t. So they find things to rebel against.”

Patrick Moorhead, principal analyst with Moor Insights & Strategy, saw Microsoft’s moves differently. “Sometimes companies believe that they’re smarter than everyone else,” he said. “Apple made that strategy feasible by being successful without soliciting consumer reaction. I think Microsoft saw that and said, ‘That’s how we can be successful, too.'”


Microsoft isn’t Apple

Moorhead found hints of Microsoft’s mimicry in its relatively recent penchant for secrecy, a change itself from decades of being far more transparent. “Microsoft could have done a better job [in these cases] by asking people beforehand,” Moorhead asserted. “But they’ve become more isolated, more … insular … as it relates to people who they used to get feedback from, like analysts and the press.”


What works for one company, Apple for example, doesn’t necessarily work for another, like Microsoft. “Their initial reaction [to critics] of Windows 8 was that ‘We know better,'” said Moorhead. “There was no admission that they’d made a mistake or flexibility whatsoever.”


That didn’t go down well.

On the Xbox One, at least, Morton thought that Microsoft’s miscue may have stemmed from incorrect assumptions of the market. In its initial presentation of the Xbox One, Microsoft focused on the device’s non-gaming traits, particularly its television viewing features. “They build up a house of cards with an incorrect assumption of who would be the purchaser,” Morton said.

Sony, which portrayed its new console, the PlayStation 4, as the anti-Xbox One, pitched its hardware to gamers, as a game machine, with the hope that others in the family would use it, too. Microsoft’s mistake was taking the opposite tack.


Lessons learned?

Even with the missteps, several of the experts said, there’s evidence that Microsoft has learned lessons. Some encouraged Microsoft not to give up on its long-term strategy, even in the face of the three failures.

“They were shooting for the future,” said Miller, of the original Xbox One and Windows 8 decisions. “And I agree with them. They had to do the changes.” It’s inevitable, he said, that games will go all digital, all served via downloads, and that Microsoft’s Windows 8 shift to emphasize mobile was necessary to stay relevant.

Moorhead believed Microsoft has improved its responses to faux pas, even in the last few months. “I do get a sense recently that Microsoft’s taken a softer tone, and admitted that they didn’t get it right,” said Moorhead, referring to the Xbox One and Windows 8 retreats. “The addition of the Start button [to Windows 8.1] was at least some admission that they’re not perfect.”

But Miller wondered what the reaction to Microsoft’s moves meant in the long term, and not just for the Redmond, Wash. developer. “The world may not be as ready for cloud services as some might want them to be,” Miller said, pointing to Xbox One. “If [Xbox One and Office] are indicative of Microsoft’s longer-term goals, are they achievable? And will consumers follow?”

None of the experts dared predict the exact nature of the future, but pointing to the pain of change, some cautioned other companies to learn from Microsoft’s experiences. “It’s the times in which we live,” said LaMotte. “If you’re going to allow the world to beta test your products, you’d better be ready for the feedback.”




Highlights of Obama’s plan to cut carbon

Washington Post

June 25, 2013

President Obama unveils a broad plan aimed at curbing climate change and its impacts in a speech Tuesday at Georgetown University.

President Obama’s full climate action plan (PDF)


Curbing carbon pollution

• Directs the EPA to establish carbon pollution standards for both new and existing power plants.

• Promises $8 billion in loan guarantees for fossil fuel projects.

• Directs the Interior Department to permit 10 gigawatts of wind and solar projects on public lands by 2020.

• Expands the president’s Better Building Challenge, helping buildings cut waste to become at least 20 percent more energy efficient by 2020.

• Sets a goal to reduce carbon pollution by at least 3 billion metric tons cumulatively by 2030 through efficiency standards set for appliances and federal buildings.

• Commits to developing fuel economy standards for heavy-duty vehicles.

• Aims to reduce hydrofluorocarbons, highly potent greenhouse gases.

• Directs agencies to develop a comprehensive methane strategy.

• Commits to forests and other landscape protection.


Preparing for climate change

• Directs agencies to support local investment to help vulnerable communities become more resilient to the effects of global warming.

• Establishment of flood-risk reduction standards in the Hurricane Sandy-affected region.

• Will work with the health-care industry to create sustainable, resilient hospitals.

• Distribution of science-based information for farmers, ranchers and landowners.

• Establishment of the National Drought Resilience Partnership to make rangelands less vulnerable to catastrophic fires.

• Climate Data Initiative will provide information for state, local and private-sector leaders.


Leading global efforts to address climate change

• Commits to expanding new and existing initiatives, including those with China, India and other major emitting countries.

• Calls for the end of U.S. government support for public financing of new coal-fired power plants overseas.*

• Expands government capacity for planning and response.

*Except for efficient coal plants in the poorest countries, or for plants using carbon capture.



Will Congress Let USAF Abandon the Global Hawk?

Defense News


Jun. 24, 2013 – 06:00AM |

By ARAM ROSTON         


The Global Hawk has provided high-altitude, long-endurance ISR for the Air Force since the late 1990s, but the service says it no longer needs the unmanned aircraft.

June is the start of the rainy season in the South Pacific, six months of storms that come in fast and unpredictable. And when the wind starts blowing, that takes its toll on U.S. intelligence-gathering far off in North Korea.

A substantial amount of the intel on the Hermit Kingdom comes from the three massive Global Hawk unmanned surveillance planes based at Andersen Air Force Base in Guam. Because of special flight restrictions, the Global Hawks can’t fly over thunderstorms, nor, without a way to see the clouds ahead, can they go around them. So last year, whenever a hint of bad weather arose on the route the Global Hawks were assigned out of Guam, missions were canceled; the UAVs were grounded for an entire month, says a source with knowledge of the operation.

This susceptibility to South Pacific cyclones is adding new energy to the political hurricane raging in Washington over the future of the expensive UAVs.

It’s been a year and a half since the Air Force said it no longer needs the Global Hawk. The service argued that the UAVs, each built for more than $200 million, don’t do their jobs as well as the time-tested U-2 manned spy plane. So the Air Force wants to take the entire fleet of 18 Global Hawks and park them in the “boneyard” — the aircraft storage facility at Davis-Monthan Air Force Base, Ariz. That’s the functional equivalent of throwing 135 tons of the world’s most advanced robotic flying machines into the trash heap.

Now the battle lines are forming in what may be an epic contracting war. On the one side, swinging hard, is Global Hawk-maker Northrop Grumman. It has some powerful arguments, and it has members of Congress who say the Air Force needs to fall in line. On the other side is the Air Force, fighting to keep the U-2, which was built by Lockheed Martin.



At 70,000 feet, a U-2 pilot flying northwest along the boundary of North Korean airspace can turn his head to the right, and through the visor of his spacesuit he will see the silhouette of Earth’s curvature. Then he will see a silent green phosphorescent flash before the sky suddenly goes dark.

They call that flash “the terminator.” No U-2 pilot ever forgets it. Until just two years ago, the U-2 program itself — the workhorse of high-altitude intelligence, surveillance and reconnaissance for 60 years — was due to be terminated, too.

For a time, the Global Hawk versus U-2 debate revolved around age. The U-2, its critics said, was of a different era, before UAVs. After all, any pilot flying the U-2 now wasn’t even born when the program started back in 1955.

But now, as one Air Force pilot points out, “This is not your grandfather’s U-2.” For example, today’s U-2S jets have pressurized cockpits, although the pilots still wear spacesuits in case anything goes wrong.

Lockheed Martin’s Robert Dunn said the U-2S has a long way to go before it needs to be decommissioned. “The airplanes we are flying today are certified to 75,000 flight hours. The average airframe is 14,000,” he said.


If the U-2 is the aging champion, then in the other corner of the ring is the upstart Global Hawk. A feat of modern engineering, the autonomous plane can fly for 32 hours straight when conditions are right. That’s far longer than the U-2, though not as high and with a smaller payload.

Ironically, the now-costly Global Hawk program was birthed during the cutbacks of the Clinton years. The Air Force was enthusiastic about its huge, high-flying UAV, and it pushed for more and more capacity for the planes. The first operational lot, the Block 10s, couldn’t carry enough weight, so the next generation was bigger and more ambitious. It was about more sensors, more power, more payload.

Initially pitched as a $35 million aircraft, costs ballooned over the years by 284 percent, according to the Congressional Research Service. Much of that was due to the Air Force’s shifting requirements. (It’s now estimated at about $220 million per plane including development costs.)

The Air Force, for a time, was the Global Hawk’s biggest cheerleader, although the history has been complex and sometimes contradictory.

In early 2011 for example, the Defense Department’s director of operational test and evaluation said “the system was not operationally effective for conducting near-continuous, persistent ISR operations.”

Then, in June 2011, shortly before the Global Hawk was fielded, Air Force officials certified the project as “essential to national security.” It was meant to ensure that Congress continued to fund the program, but the proclamation would begin to haunt the service just months later.



In January 2012, the Air Force announced a drastic turnaround: It would terminate the Global Hawk program.

It provoked a firestorm — and a heavy public advocacy campaign on Capitol Hill by those who support the plane. Like many major modern weapons, its subcontractors are widely distributed across the United States, ensuring a broad base of political support. Northrop Grumman’s website notes that all but 15 states manufacture some part of the Global Hawk.

Experts were confounded that the Air Force had changed its mind so quickly.


And Congress put its foot down.

In the 2013 National Defense Authorization Act signed earlier this year, Congress told the Air Force it would have to fly the Global Hawks it already had (16, plus two being built) through the year 2014. The service “shall maintain the operational capability of each RQ-4 Block 30 Global Hawk unmanned aircraft system belonging to the Air Force or delivered to the Air Force.”

And to make sure no Global Hawk went on to the boneyard, the act was specific: No money “may be obligated or expended to retire, prepare to retire, or place in storage an RQ-4 Block 30 Global Hawk unmanned aircraft system.”

All of which sets the stage for the current conflict on the Hill.

Meanwhile, the 2013 Defense Appropriations Act went further. The service had resisted ordering new planes, on the assumption that by the time they were delivered, they’d be going right to the boneyard. Now the Air Force was told to order the three planes that had previously been budgeted for in 2012: “The Secretary of the Air Force shall obligate and expend funds previously appropriated” for the plane.


But the Air Force has resisted. As another officer said, “Why are they making us spend money on something we don’t want or need?”

That attitude has irked some Northrop Grumman supporters on Capitol Hill.

In May, Rep. James Moran, D-Va., and Rep. Buck McKeon, R-Calif., wrote a stinging letter to Defense Secretary Chuck Hagel demanding that the Air Force do what it was told.

“The Air Force has continued to ignore clear Congressional intent,” they said.

And the House Armed Services Committee in June voted for a new defense authorization bill that would force the Air Force to use the Global Hawks until 2016.



Here’s a side-by-side comparison of the two platforms:


■ Power. The U-2’s engine, with 17,000 pounds of thrust, can push the plane beyond 65,000 feet within a half hour. “It climbs like a homesick angel,” said a U-2 pilot. The Global Hawk, powered by an engine with just 7,500 pounds of thrust, can take four hours to reach its ceiling of 60,000 feet, critics say.


■ Endurance. Global Hawk is the hands-down winner. It can fly up to 32 hours before returning to base. Some say that’s what matters. “This is no time to be getting rid of your long-range, long-endurance assets,” said Rebecca Grant, an analyst who has done work for Northrop Grumman. The U-2 is stretching it to fly 14 hours; more typical flights last 10. But its defenders note that the manned plane can be based closer to the action, say, in South Korea, where flight restrictions bar unmanned aircraft.


■ Altitude. Here, the U-2 is the king, with a publicly disclosed ceiling of 70,000 feet and a true ceiling somewhere around 75,000 feet. Global Hawk tops out at 60,000 feet. For the Air Force, this has become the central issue. First, the U-2 gets above the weather. The worst storm in the world is “just fireworks below,” said a pilot. But the other issue is visibility. Simple geometry allows the U-2 to see farther into enemy territory than the Global Hawk, and that really makes a difference. A ceiling of 60,000 feet versus 70,000 doesn’t sound like much, but look at it this way: The main job of the plane in the near future will be flying over the borders of countries like China and North Korea from international airspace. The Air Force likes to see 80 or 100 miles into adversaries’ territory, and the U-2’s added height lets it do that.


■ Sensors. That’s what it’s all about. At first glance, the Global Hawk has the edge. It carries three sensors for its intelligence missions, and the U-2 carries only two. On top of that, the Global Hawk can switch in midflight between electro-optical and synthetic aperture radar. “To have the ability for a single weapons system to carry a SAR radar, electro-optical package, and SIGINT package,” said Tom Vice, Northrop Grumman’s president of Aerospace Systems, “it allows you to fuse all three different types of intelligence products together at the same time.”


But the Air Force says the U-2 has a far better electro-optical sensor that gives it a hands-down win in the category. In a report to Congress this spring, the Air Force flatly said that “the current U-2 sensors are superior to those of the GH.” Key to that is a camera called SYERS II (Senior Year Electro-optic Reconnaissance System) manufactured by UTC Aerospace. It’s multispectral, unlike the Global Hawk’s camera, and it sees farther.


■ Price. The U-2s were all built years ago. It’s a bit like owning a 2000 Honda Accord — it’s already paid for, it will keep on going and it drives great. The Global Hawks, on the other hand, are still coming off the production line. But Northrop Grumman argues that most of the development costs have already been spent anyway, and the kinks of building a new system have only recently been ironed out. The Air Force says at this point that it is just spending good money on a system that doesn’t have what it takes.


As for operating costs, they are equivalent — $33,500 per hour. But as Northrop Grumman points out, the Global Hawk doesn’t need training flights and requires fewer takeoffs and landings. Even the Air Force, in a recent report, acknowledged that “the persistence advantage of [Global Hawk] manifests itself in lower execution costs.”

Among its various proposals, Northrop Grumman has made one that stands out. It is offering to provide a 10-year contractor logistics contract for the Global Hawk Block 30 for $250 million, as a fixed price. It made the offer, though, months after the Air Force decided to terminate the program.



There is much disagreement on how much it would cost to upgrade the Global Hawk Block 30s, where there are shortfalls that need addressing. Take the sensors. The Air Force reported to Congress that “Upgrades to the GH Block 30 to achieve parity with the U-2 program require an expenditure of approximately $855 million.”

It might not be able to fly as high, but at least it could photograph as clearly.

Northrop Grumman’s defenders, eager to get the Air Force to change its mind, say the service is way off the mark. The company has offered to put better cameras on the Global Hawk for just $48 million.

“We’ve looked at that and we’ve addressed it,” Vice said. “We looked at how to open up our architecture. We’ve offered a firm fixed-price offer to the U.S. Air Force to integrate the SYERS sensors onto Global Hawk. And that would cost the Air Force only 6 percent of what the Air Force believed it would cost to upgrade the current Block 30 cameras. Guaranteed price; no risk to the government.”

Northrop Grumman’s $48 million versus the Air Force’s $855 million is an unresolved discrepancy, for the moment. One reason the lower figure can work: The company wants to simply remove the cameras from the competition — essentially cannibalizing the U-2.

As for the Global Hawk’s getting grounded in places like Guam, where it can’t be relied on during the rainy season, the plane’s supporters say that’s the Air Force’s fault in the first place because of onerous restrictions. They argue that requiring the plane to stay 10,000 feet above clouds, and limiting it to a single route, was what caused it to be grounded so often.

Now it’s been given alternative routes, which supporters say will cut back on canceled missions.

The difficulty has been that Global Hawk is unmanned, without “sense and avoid” technology to meet air traffic requirements. Normally, a pilot could see the clouds and steer around them, but without a pilot, the Global Hawk can’t do that.

Northrop Grumman has told the Air Force it can put “weather diversion” cameras in the Guam-based Global Hawks. That way, the operators back at base will be able to see the clouds and reroute, just as they could if the pilot was flying.

The company pitched the idea to the Air Force, offering to install the cameras for $7 million.



There are some analysts who believe that in spite of the Global Hawk’s shortfalls, the Air Force is making a mistake.

“However you cut it, I think there is a good case for Global Hawk Block 30,” says Mark Gunzinger of the Center for Strategic and Budgetary Assessments. “The reasons cited for retiring the Block 30s don’t stand up under scrutiny. It’s worth questioning.”

But if the Air Force is really being disingenuous in terminating the Global Hawk, as its critics say, what would be the motive? That’s where the Northrop Grumman defenders are having a difficult time.

Is it, perhaps, a lingering bias against drones, a preference for the swaggering days of the piloted plane? At a House hearing in May where he castigated the Air Force for its decision on Global Hawk, Moran said as much: “The U-2, as you know, has a pilot. And I suspect that’s the real issue — the pilotless versus the piloted craft, even though the U-2 has been around longer than even some of the members of this subcommittee have been alive.”

Air Force Chief of Staff Gen. Mark Welsh protested: “Pilot being in the airplane had absolutely nothing to do with it. I couldn’t care less. We want the platform that will do the best job of accomplishing the mission assigned — manned or unmanned — and we’ve said that all along.”

And after all, the Air Force has hundreds of UAVs and continues to develop new ones. It’s hard to argue that the service simply doesn’t like unmanned aviation anymore.

If not a bias against unmanned planes, others say it is just stubbornness: The Air Force has dug itself into an untenable position and, because of bureaucracy, is unwilling to back down.

Still, that does seem like a stretch, given what’s at stake. If the Air Force still says it doesn’t need to spend the hundreds of millions of dollars on a program it finds inadequate, it will be hard to argue with that in an era when sequestration is cutting everyone’s budget.


The Atlantic

What Your Email Metadata Told the NSA About You

Rebecca Greenfield Jun 27, 2013

President Obama said “nobody is listening to your telephone calls,” even though the National Security Agency could actually track you from cellphone metadata. Well, the latest from the Edward Snowden leaks shows that Obama eventually told the NSA to stop collecting your email communications in 2011, apparently because the so-called StellarWind program “was not yielding much value,” even when collected in bulk. But how much could the NSA learn from all that email metadata, really? And was it more invasive than phone data collection? The agency is well beyond its one trillionth metadata record, after all, so they must have gotten pretty good at this.

To offer a basic sense of how StellarWind collection worked — and how much user names and IP addresses can tell a spy about a person, even if he’s not reading the contents of your email — we took a look at the raw source code of an everyday email header. It’s not the exact kind of information the NSA was pulling, of course, but it shows the type of information attached to every single one of your emails.

Below is what the metadata looks like as it travels around with an email — we’ve annotated the relevant parts, based on what The Guardian reported today as the legally allowed (and apparently expanded) powers of the NSA to read without your permission. After all, it’s right there behind your words.

As you can see, at the bare minimum, your average email metadata offers location (through the IPs), plus names (or at least email addresses), and dates (down to the second). The Guardian‘s Glenn Greenwald and Spencer Ackerman report that Attorney General Michael Mukasey and Defense Secretary Bob Gates signed a document that OK’d the collection and mining of “the information appearing on the ‘to,’ ‘from’ or ‘bcc’ lines of a standard email or other electronic communication” from, well, you and your friends and maybe some terrorists.

But email metadata is more revealing than that — even more revealing than what the NSA could do with just the time of your last phone call and the nearest cell tower. For operation StellarWind, it must have been all about that IP, or Internet protocol, address. Hell, it’d be easy enough for your grandma to geolocate both parties from a couple of IPs: there are countless free services on Google that turn those numbers you give to the IT guy into your exact location. For example, using the two IP addresses in the email sent to me above, we can easily determine that it was sent from Victoria, Australia.

The IP address is like a homing pigeon, and that’s why the revelations of email metadata being authorized under the Bush and Obama administrations amounts to a seriously revealing breach of personal security in the name of terror-hunting. “Seeing your IP logs — and especially feeding them through sophisticated analytic tools — is a way of getting inside your head that’s in many ways on par with reading your diary,” Julian Sanchez of the Cato Institute told The Guardian. Of course, the administration has another party line, telling the Los Angeles Times that operation StellarWind was discontinued because it wasn’t adding up to enough good intelligence of “value.” But with one of the many “sophisticated analytic tool” sets developed by the NSA over the last decade or so and leaked during the last month — like, say, EvilOlive, “a near-real-time metadata analyzer” described in yet another Guardian scoop today — America’s intelligence operation certainly can zero in on exactly where Americans are. Even if you’re just emailing your hip grandma.


Too much collaboration is hurting worker productivity



By Vickie Elmer    June 26, 2013    

Companies love collaboration—it’s become the go-to approach to solve corporate problems and spur innovation. Yet by emphasizing it at the expense of solitary work, employers choke worker productivity and satisfaction.

A new report by Gensler, the global workplace design and architecture firm, found that workers are spending more time in focus work but feel less effective at it than before.

“Collaboration can be taken too far. It actually has diminishing returns,” said Diane Hoskins, co-CEO of Gensler, in an interview with Quartz. “When everybody’s collaborating around you, you can’t focus.”

In the last few years, the increase in collaboration has been both intentional, encouraged by managers intent on fostering innovation and shared resources, and unintentional, partly the result of corporate cutbacks in office space during the recession. Much of the reduced space affected collaboration areas, which pushed conversations and collaboration into the general work spaces, said Hoskins. “Everything was squeezed” and so workers felt less able to focus, the Gensler survey of 2,035 knowledge workers shows.

Now they feel even more crowded and unable to focus as corporate payrolls have inched up. Some feel that with more workers on “kitchen tables” or desks in close proximity they can never get anything done.

“If you diminish focus, it’s like the house of cards starts to fall apart. It’s almost foundational,” she said.

Certainly other research has found open floor plans can make workers less productive and more likely to get sick.

So what are companies doing to create places where workers can concentrate on their tasks? Intel’s Software and Services Group gives workers four hours of “think time” tracked on a group calendar so they can block out distractions and tune in on important problems or work. Office furniture maker Steelcase has created a gathering space equipped with teleconferencing devices, information projections and a round table.

Gensler is encouraging balanced arrangements so workers can have a few different environments to use depending on the mode they’re in. One company put up a C-shaped pod in the middle of the desks with room for four or five people to sit and share. Others have created small meeting rooms “where you can take conversations to” so workers at their desks can focus, Hoskins said. Some even set up outdoor gathering places for informal meetings or break times together.

Or some just may go home to get quiet focused time, and then come into the office for meetings and social connections. Those who can choose where to work still spend about 70% of their time in the office, Gensler reported. Workers who can choose their environment are more satisfied with their job, and rate themselves as more effective, especially in their focus work.

DirecTV has the edge right now among bids to buy Hulu


By Gina Chon and Kevin J. Delaney

June 28, 2013

Satellite television provider DirecTV appears to have the leading bid for video streaming service Hulu, according to sources familiar with the sale process.

Other contenders include Guggenheim Digital Media and a joint bid from the Chernin Group and AT&T, the sources said. Final, binding offers for Hulu had been due today, but the deadline has been extended to Tuesday, in part to give DirecTV more time to assemble its acquisition package.

The competition for Hulu, which wasn’t able to fetch significant offers when it first went up for sale in 2011, demonstrates how various segments of the media industry suddenly see value in owning their own platforms. Hulu has about 4 million subscribers paying about $8 a month for a mix of movies, TV shows, and original content. It also offers some video for free.

Sources say DirecTV has an advantage because, as one of the largest multi-system operators (MSOs) in the United States, it’s already a major customer of Hulu’s owners—Disney, Comcast, and News Corp. DirecTV could use Hulu to diversify its offerings or even to create a cable service delivered entirely over the internet. It currently operates over satellite as well as the web.

But the Chernin Group, led by former News Corp. president Peter Chernin, and Guggenheim, headed up by former Yahoo interim CEO Ross Levinsohn, are seen as more entrepreneurial. That could help ease concerns about Hulu losing its innovative spirit and strong engineering team if it were sold to a corporate buyer like DirecTV.

Hulu’s owners are hoping to fetch a price in the $1 billion range, and if they don’t get such offers, they may rethink a sale, sources said. But at least some of the bids are expected to hit that range, sources said. Yahoo, Amazon, and private equity firm Silver Lake also considered separate bids for Hulu, but their interest has waned, sources say.

There’s also still a possibility that Hulu’s owners avoid an outright sale by bringing in another media company such as Time Warner Cable or Time Warner, through an investment. Some analysts have advised against selling. Part of the reason Hulu is on the block, though, is that its owners have disagreed over what strategy to pursue.

Hulu declined to comment.


What They Told Us: Reviewing Last Week’s Key Polls

Rasmussen Reports

Saturday, June 29, 2013

With the nation’s 237th birthday just days away, it’s good to remind ourselves that for most Americans, there’s still no place like home.

Eighty-six percent (86%) are proud to be an American. Seventy-four percent (74%) believe, generally speaking, Americans should be proud of the history of the United States.

Seventy percent (70%) believe that Americans have more freedom than people in other countries. Sixty-nine percent (69%) feel Americans have more economic opportunity than people in other countries.

Not that we wear rose-colored glasses. Just 40% now agree with Abraham Lincoln and Ronald Reagan that the United States is “the last best hope of mankind.” Thirty-six percent (36%) disagree.

Only 45% of voters believe the U.S. justice system is fair to most Americans, and just 34% think the system is fair to the poor.

In his latest weekly newspaper column, Scott notes that in our country, “public opinion leads, and politicians lag a decade or two behind. It’s always been that way.” He adds that “it’s easy to get discouraged about politics in the 21st century. Most voters view our government as a threat to individual rights. But we can take hope from the fact that Americans still embrace the ‘Spirit of ’76.’ “

Seventy-two percent (72%) of voters, however, suspect that the National Security Agency may have violated one of the country’s most cherished constitutional standards – the checks and balances between the three branches of government – by spying on the private communications of Congress and judges.

Recognizing that there is a tension between national security and individual rights, just 12% believe the recently disclosed NSA surveillance program is too concerned about individual rights. Forty-three percent (43%) believe the program is tilted too far in favor of national security.

The government may be understandably confused, however, over which domestic terrorists it’s supposed to be tracking. Among voters who approve of the president’s job performance, just 29% see radical Muslims as the bigger terrorist threat to the United States. Twenty-six percent (26%) say it’s the Tea Party that concerns them most. Among those who Strongly Approve of the president, more fear the Tea Party than radical Muslims. As for voters who disapprove of Obama’s performance, 75% consider radical Muslims to be the bigger terrorist threat.

If the government is listening, it doesn’t appear to be hearing what most Americans are saying.

Take illegal immigration, for example. Sixty percent (60%) of voters favor an immigration plan that gives those here illegally legal status to stay, provided the border is really secured. But just 28% are now even somewhat confident that the government would actually secure the border and prevent illegal immigration if the reform plan is passed. That’s down from 45% in January.

In large part because of this voter distrust, just 37% expect the immigration plan passed by the Senate Thursday to make it through the full Congress and become law.

Voters are evenly divided as to whether the determination that the border is secure should be made by border states or by the federal government.

The immigration plan passed by the Senate also includes quite a few “pork barrel” spending projects, even though 65% of voters continue to believe the government should cut spending rather than increase it to help the economy.

Most voters also still view nuclear weapons as critical to the country’s safety, which is why just 27% agree with President Obama’s call on Monday for a reduction in the U.S. nuclear arsenal.

Voters disagree, too, with the president’s decision to involve the United States more deeply in the civil war in Syria. U.S. troops haven’t been sent in there yet, but most voters continue to believe American political leaders put U.S. troops in harm’s way too often.

The president this past week proposed tighter carbon dioxide emissions controls on existing power plants. Nearly half of voters like that idea, but just as many think the proposed regulations will hurt the economy. Most expect those regulations to drive up energy costs. 

The economy and health care again this month top the list of 15 voting issues regularly tracked by Rasmussen Reports, while the environment and the war in Afghanistan are among the issues voters regard as least important.  Interestingly, though, for the first time in nearly two years, over half (51%) of voters say the immigration issue is Very Important in terms of how they will vote in the next congressional election.

Republican Senator Marco Rubio has been the most visible advocate of the Senate immigration proposal. Over the past several months, Rubio’s favorability ratings among Republican voters have fallen dramatically.

Despite continuing high consumer and investor confidence in the economy, the president’s job approval ratings also remain down from the highs he hit just after Election Day.

Confidence in the short-term housing market remains higher than it has been in several years, although just 50% now say their home is worth more than they still owe on their mortgage. That’s down 10 points from 60% in May. 

Most homeowners (56%) still say their home is worth more than when they bought it, but one-in-four now say it’s worth less.

In other news this week:

– Thirty percent (30%) of Likely U.S. Voters now say the country is heading in the right direction.

– Support remains high for allowing Americans to choose the level of health insurance they want based on how it impacts their pocketbooks.

– Americans still don’t have much confidence in the Federal Reserve Board to keep inflation under control.

– Americans have surprisingly similar overall views of their relationship with their spouse and relations with their co-workers.

June 22 2013




NSA Implementing ‘Two-Person’ Rule To Stop The Next Edward Snowden

Andy Greenberg, Forbes Staff

6/18/2013 @ 2:23PM


On Tuesday, National Security Agency Director Keith Alexander told a congressional Intelligence Committee hearing that the agency is implementing a “two-person” system to prevent future leaks of classified information like the one pulled off by 29-year-old Booz Allen contractor Edward Snowden, who according to the Guardian exfiltrated “thousands” of files, several of which he has given to the paper.


“We have to learn from these mistakes when they occur,” Representative Charles “Dutch” Ruppersberger said to Alexander in the hearing. “What system are you or the director of national intelligence putting into place to make sure that if another person were to turn against his or her country we would have an alarm system that would not put us in this position?”

“Working with the director of national intelligence, what we’re doing is working to come up with a two-person rule and oversight for those, and to ensure we have a way of blocking people from taking information out of our system,” Alexander replied.

That “two-person rule,” it would seem, will be something similar to the one implemented in some cases by the military after Army private Bradley Manning was able to write hundreds of thousands of secret files to CDs and leak them to WikiLeaks. The rule required that anyone copying data from a secure network onto portable storage media do so with a second person present who ensures he or she isn’t also collecting unauthorized data.
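The rule described above amounts to requiring a countersignature from a distinct second person before any copy operation proceeds. A minimal sketch of that idea (purely illustrative; the class, names, and request IDs here are invented, not any actual NSA or military system):

```python
# Toy sketch of a two-person rule: a copy-to-removable-media request
# proceeds only after a second, distinct person has countersigned it.
class TwoPersonCopy:
    def __init__(self):
        self.approvals = {}  # request_id -> set of approver names

    def approve(self, request_id, approver):
        self.approvals.setdefault(request_id, set()).add(approver)

    def can_copy(self, request_id, requester):
        # The requester's own approval never counts; a distinct second
        # person must have signed off on this specific request.
        others = self.approvals.get(request_id, set()) - {requester}
        return len(others) >= 1

ctl = TwoPersonCopy()
ctl.approve("req-42", "alice")          # alice asks to copy files
print(ctl.can_copy("req-42", "alice"))  # False: no second person yet
ctl.approve("req-42", "bob")            # bob, the observer, countersigns
print(ctl.can_copy("req-42", "alice"))  # True
```

The key design point is that self-approval is excluded: the set difference removes the requester, so only an independent second party can unlock the action.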

It may come as a surprise that the NSA doesn’t already have that rule in place, especially for young outside contractor employees like Snowden. But Alexander emphasized that Snowden was one of close to a thousand systems administrators, mostly outside contractors, who may have had the ability to set privileges and audit conditions on networks. “This is a very difficult question when that person is a systems administrator,” Alexander responded. “When one of those persons misuses their authority, it’s a huge problem.”

Alexander added that the system is still a work in progress, and that the NSA is working with the FBI to collect more facts from the Snowden case and to implement new security measures in other parts of the U.S. intelligence community.

When asked how Snowden had gained such broad access to the NSA’s networks despite only working for Booz Allen for three months, Alexander said that he had in fact held a position at the NSA for the twelve months prior to taking that private contractor job.

The questions about the NSA’s lack of leak protections came in the midst of a conversation that largely focused on the NSA’s justification for the broad surveillance those leaks revealed. In the hearing, Alexander claimed that more than 50 attacks have been foiled with some help from the NSA’s surveillance programs, such as the collection of millions of Americans’ cell phone records and the collection of foreigners’ Google-, Facebook-, Microsoft- and Apple-held data known as “PRISM,” both disclosed in Snowden’s documents. One newly revealed bombing plot targeted the New York Stock Exchange, and another involved an American donating money to a Somalian terrorist group.

Of those more than 50 total cases, ten involved domestic collection of phone records, according to Alexander. But when Representative Jim Himes asked in how many of those cases that collection was “essential,” his question went unanswered.

Alexander also fended off criticisms that the Foreign Intelligence Surveillance Act court system, which oversees the NSA’s requests to use data it’s collected, often from Americans, is a “rubber stamp process” that approves nearly all of the NSA’s actions. That court reported in April that it had received 1,789 applications for electronic surveillance in an annual report to Congress. One request was withdrawn, and forty were approved with some changes. The other 1,748 were approved without changes.
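The figures in the court’s annual report can be checked with quick arithmetic, which shows why critics read them as near-automatic approval:

```python
# Arithmetic check of the FISA court figures reported above: of 1,789
# surveillance applications, 1 was withdrawn and 40 were approved with
# changes; the remainder were approved exactly as submitted.
total, withdrawn, modified = 1789, 1, 40
unchanged = total - withdrawn - modified
print(unchanged)                                    # 1748
print(round(100 * unchanged / total, 1))            # 97.7 (% approved unmodified)
print(round(100 * (total - withdrawn) / total, 2))  # 99.94 (% approved in some form)
```

In other words, roughly 97.7 percent of applications sailed through without any change, and all but one was approved in some form.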

“I believe the federal judges on that court are superb,” Alexander told Congress. “There is, from my perspective, no rubber stamp.”

But a significant portion of the hearing also focused on the NSA’s security vulnerabilities highlighted by Snowden’s leaks, rather than its surveillance. Representative Michele Bachmann emphasized that the NSA should answer “how a traitor could do something like this to the American people,” and how to “prevent this from ever happening again.” She asked Alexander how damaging the leaks were to the NSA’s mission, and he responded that they were “significant and irreversible.”

Snowden has taken refuge in Hong Kong, where he conducted a live Q&A on the Guardian’s website Monday. In that conversation, he wrote that “the consent of the governed is not consent if it is not informed,” and that “truth is coming, and it cannot be stopped.”

At the hearing, a member of the committee ended with a personal question about that young leaker’s fate: What’s next for Snowden?

FBI deputy director Sean Joyce answered, simply, “Justice.”


Government Executive Analysis: Are 50 Foiled Terrorist Plots Worth Your Privacy?

By Brian Fung

June 18, 2013


In the most candid explanation of the National Security Agency’s surveillance program to date, agency head Gen. Keith Alexander said Tuesday that his organization’s listening activity has helped foil more than 50 terrorist plots against the United States and its allies. One of those involved Najibullah Zazi’s attempt to blow up the New York City subway; another concerned an early-stage plan, news of which was previously withheld from the public, to blow up the New York Stock Exchange.

Alexander and other witnesses before the House Intelligence Committee made sure to highlight key details of these foiled attacks. Understandably so: The more we focus on the program’s successes, the less harshly we might be inclined to judge its alleged excesses. But what exactly is the tradeoff being made here, and how do these revelations address concerns about the potential for NSA over-spying?

We now know at least this much: Of the millions of phone numbers that the NSA could summon for intelligence purposes under Section 215 of the Patriot Act, about 300 specific numbers were actually “queried” from the database in the course of federal investigations. (This database is made up of the same telephone metadata Verizon has been handing over to the government on a daily basis since 2006, the subject of NSA leaker Edward Snowden’s initial bombshell.) Twenty-two officials are responsible for approving these queries within the NSA, and the agency doesn’t require special court authorization to inspect the phone numbers in question.

The NSA says it takes steps to rule out accidental snooping on U.S. citizens. Among the most important? Inspecting the area code, said John Inglis, the NSA’s deputy director. If the number begins with “301,” for example, you’d know it was a Maryland number.

“That would be your only insight as to whether this would be attributable to a U.S. person,” Inglis said, adding that if the person was simply exercising their First Amendment right to free speech, “that is not a reason to approve a query.”
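Inglis’s area-code test can be sketched as a few lines of code, which also makes plain how crude a safeguard it is. Everything here is hypothetical: the function, the tiny sample of area codes, and the phone numbers are invented for illustration, not drawn from any NSA procedure.

```python
# Illustrative sketch of the heuristic Inglis described: inspect a
# number's three-digit area code as the only clue to whether it may
# belong to a U.S. person. The area-code set is a tiny made-up sample.
US_AREA_CODES = {"301", "240", "212", "415"}  # e.g. 301/240 = Maryland

def looks_domestic(phone_number: str) -> bool:
    """Crude U.S.-person check based solely on the area code."""
    digits = "".join(ch for ch in phone_number if ch.isdigit())
    # Strip a leading country code of 1, then read the area code.
    if digits.startswith("1"):
        digits = digits[1:]
    return digits[:3] in US_AREA_CODES

print(looks_domestic("+1 (301) 555-0100"))  # True: Maryland area code
print(looks_domestic("+44 20 7946 0958"))   # False: UK country code
```

Note what the heuristic cannot do: it says nothing about who holds the number, and a U.S. person using a foreign number would pass it entirely.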

Inglis didn’t elaborate on how the NSA would go about determining whether the subject was in fact exercising free-speech rights (Would they send an agent to interview the suspect? Tap his lines?), but he assured lawmakers that the phone-records program had “a very narrow purpose.”

(It’s not entirely clear, but these minimization procedures seem to be separate from those applied to PRISM, the NSA surveillance program that summons data from tech companies under Section 702 of the Foreign Intelligence Surveillance Act.)

Yet despite Inglis’s insistence that from this database the NSA can only retrieve phone numbers, and that “queries” approved by the 22 agency officials would return only the phone numbers that the queried number had dialed, statements by other officials at the hearing seemed to suggest that connecting metadata to specific individuals was a trivial matter.

Take one of the four cases the officials highlighted as plots disrupted by either the 215 program (the phone records) or the 702 program (the PRISM data). Using the 215 program, said FBI Deputy Director Sean Joyce, law-enforcement officials were able to positively identify an individual who was providing financial assistance to a listed terrorist group in Somalia. Not only that, but the FBI was also able to identify that suspect’s co-conspirators.

We already know that metadata acts as a kind of digital fingerprint, and it takes little more than a couple of data points to connect anonymized metadata to a specific person.

Even President Obama hinted at how flimsy the “it’s only metadata” argument is in his interview with Charlie Rose last night:

“[Critics will] say, you know, ‘You can—when you start looking at metadata, even if you don’t know the names, you can match it up, if there’s a call to an oncologist, and there’s a call to a lawyer, and—you can pair that up and figure out maybe this person’s dying, and they’re writing their will, and you can yield all this information.’ All of that is true. Except for the fact that for the government, under the program right now, to do that, it would be illegal.”
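The kind of inference Obama describes can be made concrete with a toy sketch. All the numbers, categories, and the inference rule below are invented for illustration; the point is only that “just metadata” plus a public directory of who owns which number supports exactly the pairing he concedes is possible:

```python
# Toy illustration of inference from "anonymous" call metadata: once
# the called numbers' categories are known (e.g. from a phone book),
# a pattern of calls supports a guess about the caller's situation.
CATEGORY = {
    "555-0111": "oncologist",
    "555-0222": "estate lawyer",
    "555-0333": "pizza shop",
}

def infer(call_log):
    """Return a speculative label based on which categories were dialed."""
    called = {CATEGORY.get(num) for num in call_log}
    if {"oncologist", "estate lawyer"} <= called:
        return "possibly seriously ill and settling affairs"
    return "no inference"

log = ["555-0111", "555-0333", "555-0222"]  # one subscriber's calls
print(infer(log))  # "possibly seriously ill and settling affairs"
```

No names appear anywhere in the data, yet the oncologist-plus-lawyer pattern yields the sensitive conclusion, which is the critics’ point.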

That any abuse of the system would be treated after the fact as a crime doesn’t do anything to assuage Americans worrying that the crime is possible in the first place. It’s also not outrageous to say, as my colleague Conor Friedersdorf does, that the tradeoff we’ve made between liberty and security is out of balance, and that maybe we’ve let our fear of terrorism get the better of us.



Welsh: Sequestration continues to drain crucial capabilities from America’s Air Force

Posted 6/19/2013

by Master Sgt. Angelita Colón-Francia

Secretary of the Air Force Public Affairs


6/19/2013 – ARLINGTON, Va. (AFNS) — The Air Force’s top officer reported on the growing strain sequestration has put on readiness, personnel and modernization to a group of civic and industry leaders attending the Air Force Association’s monthly breakfast here, June 17.

Sequestration has hit the Air Force hard, said Air Force Chief of Staff Gen. Mark A. Welsh III.

Like the rest of the Defense Department, the Air Force has seen severe reductions in funding, leading to concerns about critical mission factors including the readiness of pilots and aircraft that aren’t flying today.

“We’ve got folks sitting in fighter squadrons looking out of windows at aircraft that they haven’t touched since the first of April,” Welsh said.

Currently, the Air Force has stood down 33 squadrons, 12 of which are combat-coded fighter and intelligence, surveillance, and reconnaissance units. Another seven squadrons have been reduced to a basic mission capable rating.

Even before sequestration, there was a readiness crisis in the Air Force, Welsh said. The severe cutbacks required by the sequester will further downgrade force readiness beyond the current fiscal year if a budget agreement is not reached.

“We can’t just all of a sudden accelerate training and catch up,” he said. “It costs up to 2 1/2 times as much to retrain a squadron as it does to keep it trained.”

Welsh emphasized the Air Force’s efforts to continue to save where possible.

“We’re looking for every option for where you can cut money — every modernization/recapitalization program,” Welsh said.

Regardless of reductions, the service cannot perform its air superiority mission with today’s aging F-15 and F-16 fighters and a limited number of F-22s, the chief of staff said, making the new F-35 non-negotiable.

“When we truncated our F-22 buy, we ended up with a force that can’t provide air superiority in more than one area at a time,” Welsh said. “The F-35 is going to be part of the air superiority equation whether it was intended to be, originally, or not.”

Welsh pointed out other countries will begin flying stealthy, highly-advanced fighters in the coming years, and if the U.S. doesn’t have the aircraft to counter them in a high-end fight it will be in trouble.

There’s nothing else that can do what the F-35 can, he said.

“Out there where people fight and die, for real, if a fourth-generation aircraft meets a fifth-generation aircraft, the fourth-generation aircraft may be more efficient, but it’s also dead,” Welsh said.



Government Executive Pentagon Financial Execs in Uncharted Waters, Survey Shows

By Charles S. Clark

June 19, 2013

11:54 AM ET


Budget uncertainty, low morale and obstacles to audit readiness are combining to confront Defense Department financial executives with an unprecedented storm of challenges, according to a survey released on Wednesday.


Ninety-four percent of the 1,006 respondents said the current environment of planned defense budget cuts and across-the-board cuts due to sequestration made their challenges greater than ever, according to the 11th annual survey by the American Society of Military Controllers and Grant Thornton LLP, titled “Navigating Through Uncertainty.”

The survey also found that across-the-board cuts are harming efforts to achieve the congressionally required audit-readiness of financial management programs. Cost information is vital for knowing “which cuts matter and which ones don’t,” the survey found. More than 61 percent said they had some level of confidence that their organizations would meet the 2014 deadline, as compared to nearly 13 percent who had little or no confidence in achieving this goal. But when asked about their parent organizations, the executives’ confidence in those organizations dropped more than 29 percentage points.

Respondents also said the returns on investments in information technology have not been as high as expected because the new systems added complexity and increased workload. And the managers have less choice in acquisition strategy, which may limit the benefits from contracts. They favor greater focus on elimination of redundancy than on acquisition initiatives.

“These survey results suggest there is a real risk that investments in audit readiness may not be enough to meet the aggressive audit deadlines facing Defense financial management executives,” said Grant Thornton Global Public Sector Managing Principal Srikant Sastry. “In light of limited resources, Defense agencies may need to move directly to audit before we’ll know for sure.”

The survey methodology encompassed face-to-face interviews with 35 defense financial leaders, and online responses from 1,006 defense financial professionals, including 6 percent active-duty uniformed personnel, 89 percent defense civilian employees, 1 percent retirees, and 4 percent other (primarily academics and employees of private-sector companies). Of the active-duty respondents, 58 percent were officers. Sixty-seven percent of the civilian respondents were GS-12 or above.


Government Executive Is a Sequester Tipping Point Coming?

By Niraj Chokshi

June 17, 2013


When $85 billion in broad spending cuts went into effect in March the world didn’t end. Kids weren’t kicked out of school en masse, hundreds of thousands weren’t laid off, the economy didn’t tank. Sequestration was overhyped and the deluge never came. But it may begin to pour this summer.

The across-the-board reductions may gain more visibility this summer largely as big defense cuts go into effect. Starting in early July, the Defense Department will begin 11-day furloughs for hundreds of thousands of its civilian employees nationwide, and the local reports are rolling in.

In Wyoming, more than 400 people will be affected, according to The Billings Gazette. In Montana, more than 500 at Malmstrom Air Force Base will see furloughs, according to The Great Falls Tribune. And nearly five times as many will be affected in Minnesota, according to The Minneapolis StarTribune.

Starting July 1, [Air Reserve electrician Dustin] Hawkins will be among more than 2,400 federal Department of Defense employees in Minnesota forced to work a reduced, four-day week for several months. Those furloughs will be spread across 63 sites statewide, but Duluth’s 148th Fighter Wing, Camp Ripley in Little Falls and St. Paul’s 133rd Airlift Wing — the units with the highest concentration of federal employees — face the brunt of the cutbacks.

Some federal furloughs have already begun, but the onslaught of Defense Department furloughs—roughly 680,000 civilian employees in total—will take a more noticeable toll. Under the heading “it’s just the beginning,” Bank of America economists wrote in a recent analyst note that the effects on government-worker income seem to be coming.

We expect aggregate government-worker income to decline in May given that furloughs started in late May. The first day of government wide furloughs was on May 24, when roughly 115,000 federal workers, or 5% of the total federal work force, stayed home without pay. However, with the majority of the furloughs not kicking in until the beginning of July, including the Pentagon’s 680,000 furloughs beginning July 8, the real income shock will not show up until the July personal income and outlay report on Aug. 30.

Other economists have said that the furloughs will begin to affect growth this quarter. And the effects of sequestration won’t be manifested in furloughs alone. Even nongovernment employees will feel the effects as July Fourth celebrations are canceled in several states.


Top Secret Defense Contractors No Longer Being Reinvestigated

By Kedar Pavgi

June 14, 2013


“Funding shortfalls” are forcing the Defense Department to suspend most periodic reinvestigations of contractors cleared for top-secret status in some national security jobs starting Friday through the remainder of fiscal 2013, according to a recent announcement.

The Defense Security Service, the agency that manages the reinvestigations, said the cuts would affect “most top secret periodic reinvestigations” but would exempt reinvestigations for contractors recognized as “key management personnel” and those needed for priority programs.

“Requests for initial personnel security clearances and Secret PRs are not affected by the suspension,” the agency said.

The agency also noted that the reinvestigations of industry personnel with access to “mission essential” intelligence would also be exempted. For example, the cuts will not affect people in positions similar to that held by recent National Security Agency leaker Edward Snowden, a top-secret cleared employee with contractor Booz Allen Hamilton.

The move was designed to push the costs of the reinvestigations—which can be thousands of dollars each—to the next fiscal year, according to the Federal Times.

Top Secret level individuals must be reinvestigated every 5 years, and Secret level personnel every 10 years, according to DSS. Spokeswoman Cindy McGovern told Government Executive that the agency has oversight of investigations for contract personnel in the National Industrial Security Program, which includes contract employees at the Energy and Defense departments and the CIA.

According to Bloomberg Businessweek, more than 1.7 million Americans have or are eligible for a Top Secret clearance.


Pentagon Leaders Put Support Contractors on Notice for Deep Cuts

By Elaine M. Grossman

June 12, 2013


Defense Secretary Chuck Hagel and his top budget deputy on Tuesday signaled they intend to make deep cuts in contractor personnel who help manage programs in almost every sector of the Pentagon bureaucracy.

The Defense Department today employs an estimated 700,000 service contractors who, in many cases, work side-by-side with the civilian and military workforce at installations across the country and worldwide.

The new shift can be expected to return some clout to civil service employees, who work at half the cost or even less, reversing a decades-old trend of increasingly farming out program management to pricey hired hands in the defense industry.

“We are currently reviewing all contractors, all the contracts we have,” Hagel testified at a Senate Appropriations Defense Subcommittee hearing.

To some extent, the secretary said, “we have no choice” about using contractors for functions that the Defense Department cannot perform itself.

“Contractors are part of any institution. We need them [for] certain skills, certain expertise,” Hagel said. “But there’s no question that we’re going to have to make some rather significant adjustments.”

Hagel told the panel he was recently briefed on the results of the Pentagon’s high-level “Strategic Choices and Management Review,” which he will continue to assess before making some fresh budget decisions.

Among the possible targets for cuts in coming years could be the modernization of nuclear platforms: A new Long-Range Strike bomber aircraft, replacements for today’s Ohio-class ballistic missile submarines, updated ICBMs or cruise missiles. Each of these efforts could also be affected by any move to reduce contractor support personnel.

The “skimmer” review — so named for its “SCMR” acronym — was to address how best to apportion $500 billion in congressionally mandated funding reductions over the next decade. If lawmakers repeal the 2011 Budget Control Act, lesser but nonetheless substantial cuts remain expected in 2014 and beyond.

Hagel made the remarks after Senator Dick Durbin (D-Ill.), the subcommittee chairman, said he is “concerned about the cost of the contractor workforce.”

“Recent reports have again emphasized that the average contract employee costs two to three times as much as the average DOD civilian employee for performing similar work,” Durbin said.

By way of example, the senator cited news reports that the self-proclaimed leaker of classified documents on government surveillance programs was a well-paid contractor working for the National Security Agency, despite what some critics see as thin credentials. The agency is a component of the Defense Department.

“Edward Snowden, who was an employee of Booz Allen, [was] working for one of our premier national security agencies as a contract employee,” Durbin said. “The story that’s told is that he was a high school dropout, that he didn’t finish his military obligation — though he attempted — and dropped out of community college. And it’s also reported that he’s being paid in the range of $200,000 a year as a contract employee.”

According to Pentagon data compiled three years ago, “contract employees comprised 22 percent of your department’s workforce but accounted for 50 percent of its cost, $254 billion,” he said.

Defense Comptroller Robert Hale, also testifying at the session, said the figures Durbin cited appeared to be accurate.

“But let me say, whether or not a contractor or a civilian is cheaper or better, it really depends on the circumstances,” Hale testified. “There are some cases where we simply don’t have the skills in the Department of Defense that we need, or it’s a short- term job, [and it] wouldn’t make any sense to grow them.”

In fact, he noted, the Pentagon still lacks an indigenous capability to perform financial audits on its own hundreds of billions of dollars in annual spending, despite intense criticism and promised remedies over dozens of years.

“I’m hiring a lot of contractors because they know how to do audits,” Hale said. “We don’t yet.”

Noting that the Defense Department has put most new hires on hold and civilians have not received pay raises since 2011, the chairman suggested it is time to consider whether investing more in the department’s own non-uniformed work force might be more cost-effective than using contractors.

“If we’re setting out to save money, has the civilian hiring freeze resulted in more or fewer contract employees?” Durbin asked. “And if so, how are you tracking the cost ramifications? Has contractor pay in the Department of Defense increased during the civilian hiring freeze?”

“I don’t disagree with any of your general analysis on contractors,” Hagel said, noting that Defense spending has significantly grown across the board over the past decade in which the U.S. military has been involved in two major conflicts.

“The money flowed into different departments and institutions, because we felt they were required for the national security of this country,” he said. However, he added, “there will come a time, and it is now, where we’re going to have to make some hard choices in the review of those.”

The Defense leaders promised to provide to Durbin additional data on how many contractor personnel support the department and their average salaries, when available.


Fear of thinking war machines may push U.S. to exascale

Congress readies a bill, but funding estimates are below other nations

Patrick Thibodeau

June 20, 2013 (Computerworld)


WASHINGTON — Unlike China and Europe, the U.S. has yet to adopt and fund an exascale development program, and concerns about what that means to U.S. security are growing darker and more dire.

China’s retaking of the global supercomputing crown was the starting point for discussion at an IBM-sponsored congressional forum this week on cognitive computing.

Cognitive computing systems have the capability of taking vast amounts of data and making what will be, for all intents and purposes, thoughtful decisions.

Efforts to draw attention to exascale in the U.S. House are being led by Rep. Randy Hultgren (R-Ill.), who talked about China’s new 33.89-petaflop system, Tianhe-2.

“It’s important not to lose sight that the reality was that it was built by China’s National University of Defense Technology,” said Hultgren, who is finalizing a bill “that will push our nation toward exascale.”

Hultgren is introducing legislation, the American Supercomputing Leadership Act, to require the U.S. Department of Energy to develop a coordinated exascale research program. The bill doesn’t call for a specific spending level, but one source said an annual appropriation of about $200 million, if not more, will be sought.

That amount of money is well short of what’s needed to build an exascale system, a computer capable of 1,000 petaflops. Each petaflop represents one thousand trillion floating point operations per second.
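The units involved can be checked with simple arithmetic, which also puts Tianhe-2’s 33.89 petaflops in perspective against the exascale target:

```python
# Unit check for the figures in this article: one petaflop is 10**15
# (one thousand trillion) floating point operations per second, and
# one exaflop is 1,000 petaflops.
PETAFLOP = 10**15
EXAFLOP = 1000 * PETAFLOP

tianhe2 = 33.89 * PETAFLOP  # China's Tianhe-2, per the article
print(EXAFLOP // PETAFLOP)          # 1000 petaflops per exaflop
print(round(EXAFLOP / tianhe2, 1))  # 29.5: Tianhe-2s per exascale machine
```

So an exascale system would be roughly thirty times faster than the machine that just retook the global supercomputing crown.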

Earl Joseph, an HPC analyst at IDC, said that “$200 million is better than nothing, but compared to China and Europe it’s at least 10 times too low.”

Joseph said that it’s his guess that the world will see an exascale system by 2015 or 2016 “installed outside the U.S. It will take a lot of power and it will be large, but it will provide a major capability.”

Lawmakers, at a recent hearing, were told by HPC researchers that the U.S. needs to spend at least $400 million annually to achieve exascale capabilities in a reasonable time, possibly by end of this decade.


If the U.S. falls behind in HPC, the consequences will be “in a word, devastating,” Selmer Bringsjord, chair of the Department of Cognitive Science at Rensselaer Polytechnic Institute, said at the forum. “If we were to lose our capacity to build preeminently smart machines, that would be a very dark situation, because machines can serve as weapons.

“When it comes to intelligent software, the U.S. is preeminent and we simply cannot lose that because the repercussions in the future, defense-wise, would be very bad,” said Bringsjord.

The risk is not just in the technology, but in the people as well. The U.S. abandoned its efforts to develop a super collider in the 1990s. Europe built the Large Hadron Collider near Geneva, and consequently this research facility draws physicists from around the world.

U.S. Rep. Chaka Fattah (D-Penn.) told of meeting with postdoctoral physicists doing their work at the European facility. There was once a time when that same work was done in the U.S., said Fattah.

“We can’t afford to retreat as a nation in investment in big science,” said Fattah, “and there is no more important investment than high performance computing.”

Joseph makes a similar point. As exascale capability arrives outside the U.S., he said, “we will likely start to see top researchers around the world either move to, or spend a lot of their time at these exascale sites.”

The emergence of big data, the ability to take the sum total of something and not just a sample, will only be better enabled by exascale systems.

David McQueeney, vice president of IBM research, told lawmakers that HPC systems now have the ability to not only deal with large data sets but “to draw insights out of them.” The new generation of machines are being programmed to understand what the data sources are telling them, he said.

“So instead of having to predetermine what the function of that machine is, you actually built a machine whose intention is to learn,” said McQueeney.


New Obama climate plan may draw from panel’s recommendations

Thu, Jun 20 2013

By Valerie Volcovici


WASHINGTON (Reuters) – Clues about what to expect in a White House package of climate measures expected within weeks might be found in a report given to the president in March by a blue-chip team of scientists and business leaders.

The President’s Council of Advisors on Science and Technology (PCAST), led by President Barack Obama’s chief science adviser John Holdren, listed six major components that should be central to the administration’s second-term climate change strategy.

The document outlined a mix of measures that different federal agencies could take on, such as power plant emissions standards by the Environmental Protection Agency and more research and development for carbon capture by the Department of Energy.

“We gave him a series of suggestions. It is his internal team – led by Holdren – that is looking at those and other ideas as they develop the strategy,” said PCAST member Daniel Schrag, director of the Harvard University Center for the Environment.

Heather Zichal, Obama’s energy and climate policy adviser, confirmed on Wednesday that an announcement on climate change measures is coming within weeks, and gave the most details so far of what might be in, and out, of the package.

The administration plans to feature policies that don’t require congressional action or extra funding, such as moves to expand energy efficiency standards for appliances, accelerate clean energy development on public lands and use the Clean Air Act to tackle greenhouse gas emissions, Zichal said.

Soon after his re-election in November, Obama tasked PCAST, whose members include Google executive chairman Eric Schmidt, with making recommendations on a climate change agenda that he could roll out after inauguration.

Ernest Moniz, then a professor at Massachusetts Institute of Technology and now U.S. Energy Secretary, was a member at the time the report was put together but has stepped down.

Schrag said the group rushed to outline recommendations by March. The plan was released by the White House on March 22 as a nine-page report and blog post.

While he does not know which of the group’s recommendations Obama will ultimately use, Schrag said PCAST focused on politically feasible measures.

The main recommendations included climate change preparedness; shifting away from coal use and regulating power plants; providing incentives for clean energy and energy efficiency; continued research and development; international engagement; and conducting energy reviews every four years.

“These changes are not particularly expensive. The opportunities and issues such as using energy more efficiently which we recommend can only benefit the economy,” said PCAST member Mario Molina, a Nobel Prize-winning professor of chemistry at the University of California at San Diego.

The group also recognized that shale gas production should play a major role in reducing carbon emissions in the short and medium term, but said regulations should be developed in some cases to reduce methane leakage and ensure water safety.

Among the measures it suggested to give incentives to clean energy users were extending the tax credits now available to wind farms to other forms of renewable energy, and lengthening the time between renewals from two years to five or 10.

PCAST said that the White House should also engage in international cooperation on climate and energy issues. That suggestion bore fruit this month with the agreement between the United States and China to reduce hydrofluorocarbons, a potent greenhouse gas.

The recommendations also gave equal weight to measures aimed at helping the nation adapt to the impacts of climate change.

Their suggestions include developing plans to update U.S. infrastructure to make transportation, energy and water delivery systems more resilient to extreme weather, and to create a central commission to oversee national preparedness efforts.

“We talked to the president a variety of times about climate change and it was clear that there has been a crystallization in the administration’s mind that we are needing to be prepared to cope with climate impacts,” said Rosina Bierbaum, Dean of the University of Michigan School of Natural Resources and Environment.

The full PCAST recommendations are available online.


(Editing by Ros Krasny and Eric Walsh)


Pentagon to Wireless Industry: We Need Our Spectrum


By Bob Brewin

June 19, 2013


The Defense Department must retain exclusive access to some spectrum in order to support military training and other critical national security needs, Pentagon Chief Information Officer Teri Takai told participants in a wireless forum sponsored by the Washington Post yesterday.

On June 14, President Obama detailed plans for governmentwide spectrum sharing with commercial carriers, an effort he views as an economic imperative.

In a memo titled “Expanding America’s Leadership in Wireless Innovation,” Obama called on the National Telecommunications and Information Administration, which manages federal spectrum, to “identify opportunities for agencies to relinquish or share spectrum” in a wide swath of frequencies currently used by Defense and civil agencies to support the operation of key mission critical systems.

“Expanding the availability of spectrum for innovative and flexible commercial uses, including for broadband services, will further promote our nation’s economic development by providing citizens and businesses with greater speed and availability of coverage, encourage further development of cutting-edge wireless technologies, applications, and services, and help reduce usage charges for households and businesses,” Obama said in the memo.

But in her comments Tuesday, Takai warned there needs to be a balance between supporting the growth of the wireless industry and Defense missions. “We have to weigh not only our responsibility to the nation, but also our operational responsibility,” she said.

While she didn’t identify specific frequency bands considered critical, she said, “I think it’s important from a national security standpoint to recognize that we have a certain amount of spectrum that we utilize which is exclusive to us from a national security and an interference perspective.”

Obama identified three bands used by Defense for operation of radars, battlefield communications and command and control systems as candidates for sharing or outright auction: the 1755-1850 MHz, 5350-5470 MHz and 5850-5925 MHz bands.

Among other things, Takai said Defense needs spectrum to support training in the United States, noting that 80 percent of training occurs around domestic military bases. “The safety of our men and women overseas is really based on their . . . ability to train,” she said.

She suggested spectrum could be shared geographically based on population density or time. To do that, she said, requires knowing who owns the spectrum, and when and where they’re using it.

“There’s certainly opportunity for us to do spectrum-sharing in, for example, rural areas, where we don’t have the bases,” Takai said. “Unfortunately, those aren’t the areas where there’s the commercial demand.”

Retired Air Force Brig. Gen. Bernie Skoch, now a defense industry consultant, said the public is increasingly dependent on wireless devices and services and “few of us want to be or are ‘off the grid’ in RF spectrum anymore.”

“Sharing arrangements only work when they are scrupulously [designed and managed]. No one likes learning, exactly when they need spectrum, that someone else needs it at the same time and in the same area,” he said.

“That is exactly what happens in disasters: Everyone wants it there and then. And no one likes seeing their UAV crash because someone didn’t anticipate that at [a given moment] their remote vehicle will experience interference from a user 300 miles away,” he said.

“In my view, one could provide infinite spectrum to the public and it wouldn’t be enough,” Skoch warned. And once Defense spectrum is repurposed to a public use, the department will have a hard time getting it back, he said.

Sequester sparks laid-off workers’ suit


By: Austin Wright

June 20, 2013 05:04 AM EDT


Defense contractor L-3 Communications faces a class action lawsuit over its handling of layoffs at an Army airfield in Georgia, sparking renewed anger in Congress over an issue that first flared up during last year’s presidential campaign.

Three former L-3 employees are suing the company, charging they were laid off without the 60 days of notice required under the federal WARN Act. It’s a first test for the 1988 law since the Obama administration alerted contractors last year the federal government would pick up their legal costs if they were sued for WARN Act violations as a result of contracts modified or canceled under sequestration.

Already, the case is drawing the ire of the powerful Republican chairman of the House Armed Services Committee, who’s still seething after being outmaneuvered on the issue last year. “During the presidential campaign, the Obama White House foresaw sequestration triggering massive layoffs across the country,” Rep. Buck McKeon said in a statement to POLITICO.

“Affected industries were bound by law to notify their workers ahead of sequestration layoffs,” McKeon continued. “Instead, in the most cynical of moves, the White House offered taxpayer money to companies to stop them from issuing layoff notices, warnings due days before the election.”

The California congressman, of course, had political motivations of his own: At the time, Lockheed Martin and other defense contractors were threatening to send pre-election WARN notices to tens of thousands of workers across the country — putting their entire workforces on alert even though only a fraction were ultimately at risk of being laid off.

The problem, they said, was that they didn’t know which fraction.

On Capitol Hill, Republicans were counting on the blanket notices to reinforce the GOP narrative that the economy was sputtering under Obama in the final weeks before Election Day.

But contractors backed off after the White House Office of Management and Budget issued its memo, saying legal costs associated with the WARN Act could be passed on to the government — as long as the layoffs were the result of sequestration and companies followed other aspects of the law.

Throughout the squabbling, New York-headquartered L-3 stayed quiet — and had little to say about the lawsuit filed against it last week. “We haven’t yet seen the suit and do not comment on matters of litigation,” said spokeswoman Jennifer Barton.

The three former employees sued for themselves and other former employees facing similar circumstances. The three worked for L-3 as part of a contract to provide maintenance and support to Hunter Army Airfield, which serves Fort Stewart in Georgia and has seen its budget slashed as a result of this year’s fiscal uncertainty.

“Aviation maintenance services, to include the efforts being performed at Hunter Army Airfield, Ga., have not been isolated from the budget situation being experienced by the Army,” said Dan O’Boyle, a spokesman for the Army’s Redstone Arsenal. “As a result, there has been less work required from all contractors providing aviation maintenance services for the Aviation and Missile Command’s Logistic Center.”

One of the three former employees named in the lawsuit specifically asked a co-worker who handled human resources about the WARN Act, according to the complaint. The employee, Teofilo Mariano Jaquez, got a letter from L-3 on April 10 terminating his employment, citing the government’s intent to “eliminate positions,” the complaint says.

He asked why he wasn’t provided 60 days of notice, saying he was entitled to such notice under the WARN Act. “I don’t know what that is,” he was told, according to the complaint. “You should ask corporate.”

The other two former employees named in the suit got letters on May 3 notifying them of the possibility of layoffs because of an “unforeseen business circumstance.” Within two weeks, both of them received letters terminating their employment, effective immediately, according to the complaint.

The three are seeking wages and other compensation they would have earned during their 60-day notice period.

The WARN Act, which applies to companies with more than 100 employees, allows for exceptions due to unforeseeable business circumstances — a condition that isn’t thoroughly defined in the law.

It also remains to be seen whether the government would be on the hook for L-3’s legal costs, in accordance with OMB’s memo.

“The WARN Act is a law intended to help workers and skilled laborers,” McKeon said. “Offering companies taxpayer money to ignore that law is disappointing and wasteful.”



Can the cloud provide the best strategy for security?

Posted by William Jackson on Jun 20, 2013 at 6:02 AM

Security is evolving from a do-it-yourself operation — loading software on a device or attaching a box to a network — to managed, hosted services leveraging the anytime/anywhere scalability of the cloud for large-scale analytics that were not practical before.

No one yet is seriously suggesting getting rid of firewalls and antivirus detection, but it has been painfully obvious for some time that by themselves, they are not adequate protection. Intelligence-based security is being touted as the way to counter more complex attacks against high-value targets, and the emergence of cloud computing now offers a way to gather enough intelligence and analyze big data fast enough to effectively spot malicious activity.

“We do not look for malware, we do not look for exploits,” said Dmitri Alperovitch, CTO of CrowdStrike, which has announced its first cloud-based security offering. “We look at what is being done, rather than how.”

The CrowdStrike Falcon Platform is one of the latest in a growing number of services offering security from the cloud, rather than security for the cloud. Another recent announcement in this field is the integration of global attack data into Risk I/O’s cloud-based platform, which uses big data and predictive analytics to help prioritize vulnerability data. Other companies with cloud-based security services include Appthority, Check Point, Fortinet, Okta, Symantec, Veracode and Zscaler.

Moving security out of the box and even out of the enterprise can help to address a new generation of adversaries using layered attacks to methodically find weaknesses, penetrate systems, escalate privileges and then quietly observe and export data. Intelligence is needed not just to detect these attacks, but to respond to them.

In the past, knowing who you were up against wasn’t necessary to security. You spotted the attack, and you blocked it. But, “if you are being targeted by a determined adversary, they are not going to stop because you block them,” Alperovitch said. “They are going to keep it up until they get in. They can spend years at it.”

CrowdStrike’s approach to active defense has a decidedly military and intelligence flavor. It takes a strategic view with an emphasis on knowing your enemy, not just the weapon. Most of the more than 4,000 organizations tracked for its Adversary Intelligence database are nation-sponsored. Its goal is not to stop every malicious attempt.

“You can’t block every attack,” Alperovitch said. “And sometimes blocking is not the best strategy.” If you spot and identify someone engaged in spying or espionage, the best strategy might be to string him along and watch him, “to better understand his tradecraft.”

The goal is to raise the bar for attackers, making their craft more difficult and expensive. This can mitigate one of the great advantages attackers have; it is dramatically cheaper to launch an attack than it is to defend against it, resulting in a very high return on investment for successful attacks. Recognizing sophisticated techniques “doesn’t eliminate all activity, but it dramatically raises the cost of intrusion,” Alperovitch said.

It is too early to say what impact the cloud and big data analytics will have on security, and it’s a pretty safe bet that it won’t solve every problem. But it is an attractive option for concentrating resources where they are most needed.


Scientists create high-resolution 3-D atlas of human brain

Washington Post

By Meeri Kim, Published: June 20

A 65-year-old woman’s brain was cut into 7,400 slices to create the most detailed three-dimensional atlas of the human brain ever made, bringing researchers one step closer to reverse-engineering the brain’s convoluted circuitry.

Brain atlases are essential reference tools for researchers and physicians, to determine which areas are “lighting up” during a task or thought process, or during image-guided surgery. The better the atlas resolution, the better doctors can target ever-smaller parts of the brain and their individual function.


The atlas creators, who are from Canada and Germany, have made the ultrahigh-resolution model — 50 times more detailed than a typical scan — publicly available in a free online format. The authors also published their work in the journal Science on Thursday.

The atlas, called BigBrain, offers a common basis for open, worldwide scientific discussion on the brain, said author Karl Zilles of the Heinrich Heine University Düsseldorf.

Zilles pointed to a novel treatment for Parkinson’s disease called deep brain stimulation, where electrical impulses are sent through electrodes implanted into specific points in the brain. He said BigBrain may open the doors for more accurate localization of electrode placement and thus render treatment more effective.

After staining and digitizing the thousands of plastic-wrap-like slices, the nearly cellular resolution map revealed the network of layers, fibers and microcircuits of the woman’s brain.

While variation exists among brains, across ages and individuals, they have largely the same distribution of brain structures and anatomy, said author Alan Evans of McGill University’s Montreal Neurological Institute. There are “subtle shape changes among individuals,” but all atlases start from one representative brain and go from there.

The team was chiefly limited by computing power and capacity. To map the human brain with 1 micron spatial resolution, which has been done for mouse brains, the atlas would take up 21,000 terabytes of data — essentially rendering it impossible to navigate. By comparison, BigBrain, with its 20 micron resolution, comprises about a terabyte of data. Prior MRI-based atlases had resolution of 1 millimeter.
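The relationship between those figures follows from simple scaling: the number of voxels grows with the cube of the linear resolution improvement. As a rough consistency check (an illustrative back-of-envelope calculation, not from the article; the article's larger 21,000-terabyte figure presumably also reflects extra bytes per voxel or overhead):

```python
# Back-of-envelope check: voxel count scales with the cube of linear resolution.
# Going from 20-micron to 1-micron voxels multiplies storage roughly 20^3-fold.
bigbrain_res_um = 20   # BigBrain voxel size, in microns (per the article)
target_res_um = 1      # mouse-brain-style resolution, in microns
bigbrain_tb = 1        # BigBrain's size, about a terabyte (per the article)

scale = (bigbrain_res_um / target_res_um) ** 3  # 8,000x more voxels
projected_tb = bigbrain_tb * scale

print(f"Scale factor: {scale:,.0f}x")
print(f"Projected size at 1 micron: ~{projected_tb:,.0f} TB")
```

This yields an 8,000-fold increase, putting a 1-micron atlas in the thousands-of-terabytes range, the same order of magnitude as the 21,000 terabytes the article cites.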

Richard Leigh, a Johns Hopkins neurologist, said he’s looking forward to test-driving BigBrain for his research on stroke recovery. With the microscopic detail available, Leigh can see which particular groups of neurons are growing through stroke treatment rather than just a general fuzzy area.

Evans was in Seattle on Wednesday working with the Allen Institute for Brain Science. Created by Microsoft co-founder Paul Allen, who has committed $500 million since its start in 2003, the Allen Institute has assembled a less-detailed human brain atlas of its own.

BigBrain is part of the European Union’s Human Brain Project that brings together specialists in neuroscience, medicine and computing to decipher the mysteries of the brain.

President Obama announced in April an initiative to map the human brain, describing it as a way to discover cures for neurological disease and strengthen the economy.



Contractor that handles public’s Medicare queries will do same for Affordable Care Act

Washington Post

By Susan Jaffe, Published: June 20


Within days, the company that handles a daily average of more than 60,000 calls about Medicare will be deluged by new inquiries about health insurance under the Affordable Care Act.

The six Medicare call centers run by Vangent, a company based in Arlington County, will answer questions about the health-care law from the 34 states that opted out of running their own online health insurance marketplaces or decided to operate them jointly with the federal government.


The Department of Health and Human Services estimates that Vangent’s call centers will receive 42 million calls about the federal marketplaces this year, a daily average of up to 200,000, and will also answer 2,400 letters and 740 e-mails and host 500 Web chats daily. Customer service representatives will take consumers through the process, from shopping for a plan to enrolling.

Running the 800-Medicare call centers may provide valuable experience, but Vangent’s track record reveals that it was slow to adapt when changes in the Medicare program caused dramatic spikes in demand.

“It’s going to be huge,” said Bonnie Burns, a training and policy specialist at California Health Advocates. “The number of calls they are likely to get will probably dwarf anything they saw in Medicare.”

Vangent declined requests for interviews.

The company will begin its new operation by the end of June.

The marketplaces, also called exchanges, will offer private health policies for individuals and small businesses, and government subsidies for many people for coverage beginning next year. Enrollment starts Oct. 1.

Vangent, a subsidiary of General Dynamics Information Technology, will run both Medicare and the federal health exchange call centers under a contract worth $530 million in its first year. The company, which was awarded the HHS contract in April, eventually may open as many as eight additional call centers and triple its staff to more than 13,000, government officials said.

Explaining plan options and subsidy eligibility won’t be easy. Three years after President Obama signed the landmark health legislation into law, nearly half of Americans still know little about how it affects them, according to a recent Kaiser Family Foundation poll. (Kaiser Health News is an editorially independent project of the foundation.)

Shortly after the prescription drug benefit was introduced in 2006, the HHS inspector general found that 44 percent of callers reported problems accessing information — despite a similar 2004 finding — and 21 percent of callers waited so long for responses that they hung up. When Medicare introduced a new prescription drug discount card in 2004, the Government Accountability Office reported that 29 percent of the call-center answers were inaccurate and 10 percent of the calls were disconnected.

HHS officials said that the company’s performance in the Medicare call centers has improved and noted that the average time to answer calls is less than three minutes, down from more than eight minutes in 2007.

“We’re building on our past experiences to ensure enrollees in the health insurance marketplace are able to accurately receive information,” HHS spokeswoman Joanne Peters said.

Some advocates for Medicare beneficiaries say problems persist at the Medicare call centers, despite the improvements.

Fred Riccardi, program director at the Medicare Rights Center, a patient advocacy group based in New York, said that some beneficiaries who call the group’s free national help line reported that 800-Medicare representatives neglected to explain the option to appeal when benefits are denied, or the prescription limits of some drug plans.

Still, Burns said Vangent’s past work could be an advantage. “They’ve had successes and failures over the years and have some sense of what they’re dealing with,” she said.

An HHS official said call-center representatives would “undergo extensive training” about the health law and basic insurance issues but could not provide more details. The official also said that HHS would evaluate Vangent’s performance through a number of factors, including customer satisfaction.

Neither Vangent nor HHS would provide specific requirements of the contract or a copy. Employment ads for the call centers’ “temporary customer service representatives” seek applicants who have a high school diploma or equivalent and six months of telemarketing or secretarial experience.

An HHS spokeswoman said that customer service representatives will answer questions by reading from HHS-approved scripts and provide state-specific information. However, she would not provide examples of the scripts or say whether they were tested with consumers.

Neither Vangent nor HHS has announced that the company is getting help. But last month, the president of Maximus, a company that processes Medicare appeals, told investors that his firm received a government contract to operate the call centers with Vangent. Maximus officials declined to comment on that announcement or provide a copy of the contract.

“It’s going to take a lot of tender loving care to explain to folks how to handle this, how to make the right choices, how to participate,” said Richard Montoni, president of Maximus. “With the passage of time, those additional speed bumps will be ironed out and we will be out there running these programs as intended.”

— Kaiser Health News

Kaiser Health News is an editorially independent program of the Henry J. Kaiser Family Foundation, a nonprofit, nonpartisan health policy research and communication organization not affiliated with Kaiser Permanente.



Apple antitrust trial ends; ruling could be broad

Washington Post

By Cecilia Kang, Published: June 20


NEW YORK — The Justice Department on Thursday concluded its antitrust trial against Apple over alleged price-fixing of digital books, with a federal prosecutor saying the creator of the iPhone and iPad engaged in an “old-fashioned, straightforward” conspiracy and Apple’s lead attorney warning that a ruling against the company “will send shudders throughout the business community.”

U.S. District Judge Denise Cote’s ruling in the Justice Department’s first major antitrust trial in more than a decade is expected to have broad consequences for the Internet economy. The Justice Department’s action against Apple, which it calls the “ringleader” of a cartel with book publishers to raise e-book prices, shows the Obama administration’s desire to create clear rules for powerful Silicon Valley firms that have so far faced little regulation.

On Thursday, for instance, the Federal Trade Commission said it would launch a probe into patent-holding companies that may use lawsuits to unfairly edge out competition, a problem particularly seen in the high-tech industry.

The suit against Apple and five publishers focused on the book-publishing business, which is undergoing a dramatic transformation as consumers increasingly turn to e-reading devices. And the business deals struck between Apple and publishers are also echoed across many industries as Internet companies race to provide videos, radio and other media offerings over the Web.

“All across the digital economy, we see companies trying to dampen competition with business practices that were clearly recognized as illegal in the physical economy,” said Mark Cooper, a director at the nonprofit Consumer Federation of America.

All five publishers in the trial — Penguin, HarperCollins, Simon & Schuster, Hachette and Macmillan — have settled.

“When an antitrust case goes to trial, especially after every party but one has settled in the U.S. and Europe, the outcome is strongly precedential,” Cooper said.

Cote did not comment on her thinking during closing arguments for the three-week trial at U.S. District Court in Manhattan. The court’s judges typically take two months to arrive at a final decision after trial.

With such weight on the case, Apple’s attorney, Orin Snyder of Gibson Dunn, vigorously argued to the judge that the company’s deals with publishers were legal. And Apple portrayed itself as the underdog, trying to jump into a market dominated by Amazon.

“The government is taking perfectly sensible business agreements to infer sinister conduct,” Snyder said in closing remarks. “If Apple is found liable . . . that precedent will send shudders throughout the business community.”

Apple has promised a dogged legal battle and is prepared to appeal if Cote rules against it. Some experts say the case could reach the Supreme Court.

At the heart of the lawsuit are two allegations by the government: Apple wanted to lift prices for e-books, and it facilitated a conspiracy with publishing houses to achieve that goal. “Apple directed and oversaw a conspiracy to raise e-book prices and prevent low-price competition,” Justice Department attorney Mark Ryan said in his closing remarks.

Some experts wonder whether Apple’s determination is worthwhile. The company could face financial damages. The publishers that settled with the government agreed to $122 million in damages to states that had joined the federal suit. Cote has wide latitude to restrict future business practices. After Microsoft settled its antitrust case with the Justice Department a dozen years ago, it had to open its campus to auditors, who were stationed in its Redmond, Wash., offices to ensure proper remedies were carried out.

The trial revealed much about the highly secretive company. Testimony by Apple executives and e-mails by Steve Jobs, Apple’s late co-founder and CEO, show the business tactics the company pursued to break into a market dominated by Amazon.

At first, Jobs didn’t want to sell digital books through the iTunes bookstore, according to two days of testimony by Eddy Cue, senior vice president of Internet software and services. But convinced that the iPad would create a rival platform to Amazon’s Kindle, Jobs became deeply involved in the creation of iBooks, from the way pages turn to the look of the iBooks retail platform, Cue said.

The Justice Department called Cue the “chief ringleader of the conspiracy.” Prosecutors presented evidence of Cue racing to clinch deals with the biggest publishing houses in six weeks between December 2009 and the launch of the iPad on Jan. 27, 2010. According to the evidence, Cue made more than 100 calls to top publishing executives to coordinate a change in how e-books were priced. He flew to New York and sent numerous e-mails to Penguin, HarperCollins, Hachette, Simon & Schuster and Macmillan.

Jobs was intimately involved in the formation of iBooks, and Cue kept him closely informed throughout his negotiations.

“Steve was nearing the end of life when we launched the iPad. . . . I wanted to be able to get that done in time for that because it was important to him,” Cue said, in a rare personal note.

Evidence presented during the trial showed that Apple went along with frustrated publishers who wanted to change the way the digital books industry functioned. The book publishers wanted to have more control over prices offered to consumers and proposed an “agency model” in which they could determine retail prices and Apple got a 30 percent cut of sales. At the time, Amazon had about a 90 percent share of digital books and set prices at $9.99.

Key to Apple’s agreement to the new model — and at the center of the Justice Department’s case — were so-called Most Favored Nation clauses that ensured Apple would get matching rates offered to rivals. Justice Department attorneys argued that the MFN clause in e-books contracts kept prices artificially high because Apple’s rivals weren’t motivated to drop prices below the $12.99 to $14.99 range targeted by iBooks and publishers.

Apple’s Cue agreed that some prices went up, particularly for best sellers and new releases, which publishers were holding back from Amazon because of its price cap.

“Wow, we have really lit the fuse on a powder keg,” Jobs wrote in an e-mail to Cue, dated Jan. 30, 2010.

But Apple has argued that average e-book prices have gone down. Some economists brush off that notion, saying the most desired books have increased in price.

In his closing arguments, Ryan urged the judge to focus on evidence that he argued demonstrated a conspiracy. “The issue in this case is collusion,” he said. “We have no quarrels with the agency model. We have no quarrels in general with MFNs.”



U.S. probing contractor that vetted NSA leaker

Washington Post

By Thomas Heath, Published: June 20


The Falls Church-based government contracting firm that performed a background investigation into Edward Snowden before he disclosed details of a secret federal surveillance program is under criminal investigation by the Office of Personnel Management, according to Sen. Claire McCaskill (D-Mo.).

McCaskill said Thursday at a Senate hearing that the investigation into USIS, whose original name was US Investigations Services, is based on the “systemic failure to adequately conduct investigations under its contract” with the federal government.

In a statement, USIS said it “has never been informed that it is under criminal investigation.” The company said it received a subpoena for records from OPM’s inspector general in January 2012 and fully cooperated with “the government’s civil investigative efforts.”

The company declined to confirm whether it had screened Snowden, saying the thousands of investigations it conducts for the government are confidential.

Snowden was working for another Virginia government contractor, Booz Allen Hamilton, when he disclosed details of National Security Agency surveillance of U.S. citizens. The case has thrown a spotlight on the government’s reliance on contractors for sensitive tasks. In 2011, more than 4.2 million government and contract workers had security clearances, and more than a third of them had top-secret access.

McCaskill spokesman Drew Pusateri said McCaskill was told of the investigation by OPM’s inspector general.

Patrick McFarland, the inspector general for OPM, told the committee that there are concerns about USIS’s background check into Snowden. “Yes, we do believe that there — there may be some problems,” McFarland said at the hearing, according to the Reuters news agency.

According to its Web site, privately held USIS has 100 contracts to provide background checks for more than 95 federal agencies. The company was established in July 1996 “as a result of the privatization of the investigative branch of the Office of Personnel Management,” according to the Web site.

The company has more than 6,700 employees, and its clients include the departments of Justice, State, Homeland Security and Defense, plus about a dozen intelligence agencies, including the NSA and the National Reconnaissance Office. USIS is part of Altegrity, which is in turn owned by Providence Equity Partners.



Budget cuts hit security checks for defense contractors


Wed, Jun 19 2013

By Mark Hosenball


WASHINGTON (Reuters) – A budget shortfall has forced a Pentagon security unit to sharply cut back on regular investigations used to update security clearances for defense contractor employees.

In a little-noticed announcement posted on its website on June 7, the Defense Security Service said that “due to a funding shortfall,” it has been obliged to suspend “most” routine re-investigations of defense contractor employees cleared at the “Top Secret” level, at least through the end of September.

The announcement came two days before Edward Snowden went public in a video released by Britain’s Guardian newspaper as the source of leaks about the U.S. government’s top-secret surveillance of phone and Internet activity.

Snowden was a contractor for Booz Allen Hamilton working as a systems administrator at a National Security Agency facility in Hawaii.

The leaks have alarmed the U.S. intelligence community and have raised questions about whether the government is doing enough to vet individuals for security clearances.

A Senate Homeland Security subcommittee has scheduled a hearing for Thursday to examine the security clearance process, at which Defense Security Service Director Stan Sims and other U.S. officials are scheduled to testify.

A person familiar with the matter said that because Snowden worked as a government employee and contractor for the NSA and Central Intelligence Agency, his security clearances would have been handled by them, rather than the Defense Security Service.

But the Defense Security Service announcement shows how the government has been forced to recently scale back its oversight of security clearances in general for contractors.

A government source familiar with the matter said the policy change was related to automatic federal spending cuts known as sequestration that began earlier this year when Congress failed to agree on an alternative deficit reduction plan.

If a scheduled reinvestigation is delayed or canceled due to the budget crunch, clearance holders will be allowed to hold their clearances, officials said. “Nothing will happen to them,” said Cindy McGovern, a spokeswoman for the Defense Security Service.

The security service is a Defense Department agency responsible for authorizing, and then paying for, background investigations which the government conducts to determine defense contractors’ eligibility for security clearances.

The agency itself does not investigate contractor personnel, but commissions investigations from the Office of Personnel Management. That agency then employs its own contractors to conduct investigations in the field.

Under government rules, both government and contractor employees with clearances at the Top Secret level are supposed to be re-investigated every five years to ensure that problems have not cropped up in their finances, backgrounds or behavior.

It is these routine re-investigations for Top Secret clearance holders that the Defense Security Service says it is being forced to suspend.

McGovern said that, in general, budget restrictions will force personnel at the agency to cut back their working weeks, and hence their pay, to four days per week, starting in July.

She declined to comment on the case of Snowden, who is believed to be hiding in Hong Kong and is under criminal investigation for the leaks. Snowden had a clearance for “Top Secret/Sensitive Compartmented Information.”

The Defense Security Service’s announcement said that despite the budget crunch, five-year re-investigations will continue for “key management” contractor personnel as well as people needing access to “mission essential” information “directly supporting the Intelligence Community.”



Here’s How Edward Snowden Got ‘Top Secret’ Clearance

By DAVID FRANCIS, The Fiscal Times June 21, 2013

A Senate Homeland Security Subcommittee met Thursday afternoon to examine the government’s process for granting security clearance.

The purpose of the meeting was to figure out how someone like Edward Snowden, the NSA leaker, could get access to some of the most secret information in the country.

The subcommittee failed in that regard: Merton Miller, associate director of investigations at the Office of Personnel Management, said he had no information on Snowden’s specific case. OPM Inspector General Patrick McFarland said he did have information on Snowden, but couldn’t reveal it to the committee just yet.

That’s not to say that the committee lacked revelations. Six witnesses and three lawmakers revealed a security clearance system so broken that it would be comical if a 29-year-old wasn’t hiding in Hong Kong and leaking American secrets to the press.


They include:

•87 percent of background checks are never fully completed. OPM uses the information it has to make a judgment on whether to approve these checks.

•There are no uniform guidelines across the government for different levels of clearance. This means that top-secret clearance at one agency means something completely different at another.

•Within each agency, there are no strict guidelines for determining security clearance.

•USIS, a private contractor, conducts 65 percent of all U.S. government background checks.

•USIS, which conducted a background check on Snowden, is now under investigation by OPM’s IG for failing to conduct proper background checks.

•OPM has already paid USIS $200 million this year.

•The $1 billion fund that OPM uses to pay for background checks has never been audited.

•OPM’s IG said his office has not been granted access to documentation on the fund.

•Miller said the documentation did not exist.

•Even if it did exist, OPM’s IG said he didn’t have the staff to audit the fund.

•OPM’s IG was unable to answer the first two questions he was asked without extensive consultation with members of the audience.

•One question was passed from one witness, then to another, who called someone named Stanley Sims out of the audience to answer it.

•I didn’t catch Sims’ title, but he did say there are more than 10,000 private facilities in the United States that have security clearance.

•Eighteen OPM investigators have been convicted of falsifying information contained in investigations they’ve conducted. Eleven work for OPM, while the other seven work for private contractors.

•Forty other investigators are currently being investigated for falsifying background checks.

•When asked if there are more than 40, IG McFarland said, “I believe there may be considerably more. I don’t believe we’ve caught it all by any stretch.”

•Sen. Claire McCaskill (D-MO) asked Miller why OPM so heavily relied on contractors.

•He answered because they were cheaper than hiring government workers.

•She asked him for a cost-benefit analysis proving this.

•He said there is no cost benefit analysis.

•McCaskill again asked how he knew they were cheaper.

•Because they are cheaper, Miller said.

•”I’m tired of this assumption that contractors are cheaper. I just think it’s easier,” McCaskill then said.


With that, Thursday’s matinee of the absurd lowered its curtain.

Little was revealed about Snowden. But the hearing did prove what McCaskill said in her opening statement: OPM is a “government agency where there is rampant fraud, limited accountability, and no respect for taxpayer dollars.” It also revealed how easy it was for Snowden to get access to the nation’s most sacred secrets.

“This situation we have with Snowden,” Sen. Jon Tester (D-MT) said, “we shouldn’t be surprised at all.”



DP&L’s owner to build $20M backup power supply here

Dayton Daily News

Posted: 12:05 a.m. Wednesday, June 19, 2013

By Thomas Gnau – Staff Writer

Moraine —

A subsidiary of the company that owns the Dayton Power & Light Co. said it will spend $20 million in the Dayton area to build a backup power supply at a local generating station. The goal is to stabilize electric power to the area and beyond.

AES Energy Storage, a subsidiary of AES Corp., said Tuesday it will build the 40-megawatt energy storage facility at Dayton Power & Light’s Tait generating station at Arbor and Carillon boulevards in Moraine. The facility will be connected to the PJM Interconnection, the regional power grid for 60 million people in the Northeast and Midwest, AES said.

The heart of the project is an array of 800,000 D-size batteries. The lithium-ion batteries are similar in size to the batteries that power household flashlights or radios, but have a different construction.

It’s the first battery storage site in Ohio, AES said.

The project will bring AES’ energy storage capacity in PJM to more than 100 megawatts, the company said. A megawatt equals 1 million watts, enough to power about 1,000 average American homes.

Chris Shelton, president of Arlington, Va.-based AES Energy Storage, said the investment represents about $500,000 per megawatt for this type of project, or about $20 million total.
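The per-megawatt figure and the project total quoted above are consistent with each other, as a quick back-of-the-envelope check shows. This is only an illustrative sketch using the article's own numbers (40 MW capacity, roughly $500,000 per megawatt); the variable names are my own:

```python
# Rough cost check for the Tait station battery array,
# using the figures quoted in the article.
capacity_mw = 40            # announced storage capacity, in megawatts
cost_per_mw = 500_000       # Shelton's estimate, in dollars per megawatt

total_cost = capacity_mw * cost_per_mw
print(f"Estimated project cost: ${total_cost:,}")  # $20,000,000
```

Multiplying the two quoted figures reproduces the $20 million total reported for the project.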

The Tait station facility will be one of six such AES Energy Storage sites in the United States. The company also has a similar site in Chile.

The battery arrays are considered generators because of the service they provide, Shelton said. The site will remain connected to the grid and respond to signals from PJM. It will be able to respond immediately to provide stability to power generation in Ohio and the Mid-Atlantic region, a service called “regulation,” he said.


And it’s not just for the high demands of summer. “It’s actually important every hour of every day, 24 hours a day,” Shelton said.

Tait Station was chosen for the facility because it has room for it and AES Energy Storage typically likes to work at existing sites, Shelton said. The station has interconnection capacity and is close to existing DPL operations.

Construction on the array started three weeks ago. The array will be up and running in September, Shelton said.

The array’s small, sealed lithium-ion batteries are derived from batteries that often serve hybrid vehicles, such as buses, but these batteries will be new, Shelton said. He called the array a “high-performing resource,” with no emissions, no water usage and no fuel use.

The site may result in a permanent new job at the station, depending on demand and work load, Shelton said. There will be about 15 construction jobs for four months there as the array is built.

Tait Station consists of seven gas- and/or oil-fired combustion turbines and four diesel generators, with a summer generating capacity of 586 megawatts, according to DPL.

Phil Herrington, DP&L president and chief executive, said in a statement: “Having served within PJM for many years, we are pleased to expand AES’ standing in the market with energy storage assets across West Virginia, Pennsylvania and now Ohio. We look forward to working with the city of Moraine on this project.”

Tait Station’s generating turbines are used during times of peak electricity demand, especially in the summer. The new project will connect through the station but will have an independent agreement with PJM, AES Energy Storage said.

In early May, AES said in a filing with the U.S. Securities and Exchange Commission that it plans to close the O.H. Hutchings Generating Station in Miamisburg at the end of May. This new construction is unrelated to that, Shelton said. He declined further comment on those plans.

AES Corp. acquired DPL Inc. in 2011.


Hagel discusses ‘State of DOD’ in Nebraska speech

by Karen Parrish

American Forces Press Service


6/20/2013 – WASHINGTON (AFNS) — In a wide-ranging speech given today at the University of Nebraska at Omaha, his alma mater, Defense Secretary Chuck Hagel talked about the necessity of adapting the nation’s military to operate in a world that’s undergoing far-reaching geopolitical, technological and economic change.

“The world is changing, and America’s national security structure — including our military — must change with it,” he said. “How America responds to the challenges of this new world will direct our future.”

Even as U.S. forces transition to a noncombat mission in Afghanistan by the end of 2014, he said, Syria, Iran and North Korea all remain uncertain security challenges whose volatility requires a range of rapid and flexible responses. With defense budgets over the foreseeable future also uncertain, the secretary said, vital strategic choices are unavoidable.

Hagel listed forces he said are “reshaping our world” as:


— The rising importance of Asia;

— The outbreak of revolution and sectarian conflict across the Middle East and North Africa;

— The continuing impact of financial crises and recessions in Europe;

— The “astounding diffusion” of global economic power as seen in the rise of China, India, Brazil and other countries; and

— The role of technology in closely linking the world’s people and their aspirations and economies.


In the face of rapidly developing and interconnected new threats such as cyber that fundamentally change the face of future conflicts, Hagel said, the military must reset from a defense enterprise structure that still reflects its Cold War design.

“To respond to this necessary effort our military is undertaking a series of important shifts that reflect changing geopolitical dynamics, new threats, new technologies, and new fiscal realities,” he said.

The first such major shift is a renewed emphasis on the Asia-Pacific region, he noted.

“We are undertaking this rebalancing because of the region’s growing importance to America’s future security and prosperity and because of the essential role the United States has played, and continues to play, in helping ensure peace and stability in this part of the world,” Hagel said.

The presence in the region of rising powers such as China, India and Indonesia, Hagel said, illustrates that all nations “have an interest in building a world order based on strong economic ties, mutual security interests, and respect for rules, norms, and the institutions that underpin them,” as well as human rights.

The U.S. rebalance to the Asia-Pacific is not a retreat from other regions, he added, but an acknowledgment of “changing strategic realities that direct increased engagement in Asia and [the] Pacific.”

In Asia and beyond, another shift in U.S. strategy is to build multilateral capabilities, he said.

“Our approach to security in the 21st century is to strengthen alliances, build new partnerships, and forge coalitions of common interest that help resolve problems and, hopefully, prevent conflict,” Hagel said. “We are doing this in Europe through our renewed commitment to the NATO Alliance, and in the Middle East and Latin America. All of these regions will help define the world’s future.”

He said the partnered approach, emphasizing joint exercises and other training activities between both regular and special operations forces, helps further another strategic shift, toward a lighter “footprint.”

A smaller footprint, as the military refers to the forward presence of troops, buildings and equipment, can be much less costly than a larger force and, Hagel noted, “also enables us to respond to crises more quickly and effectively.”

He emphasized the United States will maintain its capacity to meet its commitments and deter aggression, including by maintaining its nuclear “triad” of bombers, land-based intercontinental ballistic missiles and ballistic missile submarines, and by continuing to develop new cyber capabilities.

“At the same time, the most sustainable and wisest approach to our security in the 21st century will be to help allies do more to contribute to their own security and our common interests,” Hagel said.

Turning to defense spending, the secretary noted DOD has for several years been preparing for “an inevitable downturn,” but “a combination of fiscal pressures and a gridlocked political process has led to far more abrupt, deeper and steeper reductions than expected or planned for.”

The department is now grappling with both known and unknown budget factors, he noted, which include:

— Under sequester, a $37 billion cut this fiscal year, which ends Sept. 30;

— An additional sequester cut of another $52 billion next year, and $500 billion over the next decade; and

— The $487 billion, 10-year defense spending reduction agreed to in the Budget Control Act of 2011, which DOD is currently implementing.

“This has produced unprecedented uncertainty,” Hagel said, noting the strategic choices and management review he directed earlier this year is intended to develop choices, options and priorities to deal with and plan for further reductions.

“The department must understand the challenges and uncertainties, plan for the risks — and, yes — recognize there are opportunities inherent in budget constraints,” the secretary said.

He added that Pentagon leaders are studying the review’s findings now and “evaluating the options that have emerged.” More hard work and difficult decisions remain in the weeks and months ahead, he said.

The military must cut back on infrastructure and personnel costs, he said, echoing his own and other defense leaders’ testimony to Congress in recent weeks.

“As we do so,” Hagel added, “we must reassure the bright and patriotic young men and women who join the military that they will be fairly compensated, trained, and regarded as the professionals they are, and they will be given opportunities for career and personal enhancement. And they must be assured that their families will be taken care of.”

Hagel said his ultimate goal is “to ensure that the United States military remains in balance and America remains secure and strong.”

That balance involves strategic priorities, alliances, military capabilities and defense spending, he said.

“But there is one final area where balance must be achieved,” Hagel added, “and that is between America’s military and its other instruments of national power.”

Most of the nation’s pressing security challenges have political, economic, and cultural components, he said, and do not necessarily lend themselves to a military resolution.

Yet, Hagel added, the nation’s leaders “cannot let our military strength atrophy, either, in this very dangerous world. But that will require wise leadership capable of making tough decisions.”

What distinguishes the United States’ military is not its power, he said, but its purpose and commitment to making a better life for all people.

“America is a just, thoughtful and steady nation, worthy of its power, generous of its spirit, and still committed to the profession of peace in a complex yet hopeful 21st century,” Hagel said.

Hagel’s visit to the university is the first stop on a trip to his home state of Nebraska that will also include time at U.S. Strategic Command, headquartered at Offutt Air Force Base near Omaha.


Farewell Ceremony for Secretary of the Air Force Michael Donley

As Delivered by Secretary of Defense Chuck Hagel, Joint Base Andrews, MD, Friday, June 21, 2013


Ash, thank you, and to you, Mr. Secretary, General Welsh, and honored guests, departing secretary, America’s Mr. Fix-it, to the family, Gail, friends, past leaders, present leaders of this enterprise, and people who just stumbled in for the afternoon: we welcome you all. And in particular, these men and women in uniform today, who are scattered around this hangar, we welcome you, and we thank you, and we thank your families for what you’re doing, for what you will continue to do.

I have a prepared speech, Mike, but I will set it aside and attempt what all public officials should never attempt, and that is to go off-script, which my public affairs people are scared to death when that happens. But I want to make a couple of comments about Mike Donley, and I cover some of the same things that were so beautifully covered with Mark Welsh’s comments and Ash Carter’s.

So let me begin this way, first, to thank you, Gail and to your family, as Mark and Ash have appropriately done. Everyone here recognizes the family’s role in anyone’s successful life. And so we recognize you today, just as we recognize Mike.

Teddy Roosevelt once said — and I believe it’s inscribed in one of those magnificent stone slabs out on Teddy Roosevelt Island — that the two indispensables in life are character and courage. I begin my remarks this afternoon about Mike Donley because I think that’s where you begin with Mike Donley, character and courage. And a person’s life then flows from that and, as has been noted by General Welsh and Secretary Carter, some of those accomplishments that Mike Donley is responsible for.

But one additional piece to all those accomplishments, which Mike Donley always noted, recognized, which is also an indispensable part of a successful life, personal and professional, and that is, it’s a team business. It isn’t one secretary, it is not one person. It’s not one service. It’s all of us.

And the test of leadership is, how do you bring that together? How do you bring that value and that character and that courage and that talent and capability behind one common purpose to make the world a better place? And that essentially is what the profession of arms is. It’s a profession of peace.

And it takes constant leadership. It takes constant attention. But it takes something else, and that is we all work from that special coin of the realm and it’s confidence and trust, and you earn that. You’re not born into anything with trust or confidence. Our leaders here today know that. And everything I have just noted, Mike Donley has lived that. Those words are words that he has lived by and what he is known for.

Here’s an individual who served five presidents, Republican, Democrat presidents. Mike Donley has always understood, the one, again, Teddy Roosevelt indispensable in life, is that national security, our future, our country, is beyond and above politics. And I suspect he was asked more than occasionally, how could you both work in the Bush White House and work as — continued to work as — Secretary of the Air Force under President Obama?

I don’t know what his response would be or was, but I suspect it was very simple. It’s bigger than presidents. It’s bigger than politics. It’s about our country. And if we had more of the Mike Donley attitude and sense of purpose in our country today, we’d probably all be a little better off, if we had that fundamental as the guiding purpose of responsible leadership in this nation. That has been an element of Mike’s personal life, as well as professional life, that I have admired.

I have not had the opportunity to work with Mike Donley that long, but the four brief months that I have had the privilege of serving this country as Secretary of Defense, there is one thing that I am absolutely sure about, when the books of history are written about Mike Donley’s tenure. The four months that I have worked with Mike have not been easy four months for the Air Force or for the Pentagon. I have been impressed, inspired by — he has never shied away from taking the big issues on, straight up, how do we fix it, I’ll take responsibility. That’s a pretty special element of anyone’s life. And that goes back to how I began, the first element of Teddy Roosevelt’s indispensable parts of a life are character and then the courage to live that character and implement that character.

I will miss him. As Mark and Ash have noted, and everyone in this hangar knows, the Air Force will miss him. The Pentagon will miss him. Our country will miss him.

But when Mike and I spent some time alone in my office this week, I gave him notice that he would not be completely untethered from us reaching him — not at all hours, Gail — I don’t think — unless it really gets bad, but — that’s a resource he, like so many of you sitting here today, are resources not only that this country has invested in, but we don’t want to lose. You will have a different life. You deserve a different life, I know. But it is not a life separated from the future of this country. And we all recognize that. And we will find you wherever you are at that apartment complex out in California on the beach — and I know you have big plans for him, doing that — but, Mike, it has been for me personally a great privilege to be part of your team.

And I also note that the Air Force, even though I was probably never smart enough to be in the Air Force — and I don’t want any of my Army friends to be upset with that remark — I didn’t have any choice, actually. The Air Force has been a very big part of my family, starting with the fact that my father served in the 13th Army Air Force in World War II and was a radio operator, tail gunner on a B-25 in the South Pacific for two-and-a-half years. And I don’t know of anything he was more proud of during his lifetime than his service in the Army Air Corps and his association with then what became the United States Air Force.

My brother, Mike, one of my brothers, is a commercial artist, illustrator, and led the Air Force Secretary’s Artist Guild for many, many years. And as Mike and all of our Air Force leaders know, the Pentagon and locations around the world are filled with the Air Force Artist Guild’s paintings of various scenes that reflect on great historical, defining moments in our history that were brought about by the actions of some of America’s greatest men and women and leaders, and many of those capture Air Force, Marines, Army, Coast Guardsmen, Navy, and their greatest accomplishments.

And I can tell you — I think as I have Mike — that my brother, Mike, did that for 14 years. Each of these artists contribute a painting once a year, gratis. Nothing that he’s been more proud of over his career, than that association with the Air Force.

So, to you, Mike, to your family, we wish you well. We know that your transition to a life that will be fulfilled because of not Mike’s title or who he represents, but who he is. And in the end, that’s all we have, is who we are. And that is much shaped by, at the end, your family, and your friends, and your past accomplishments.

And so, we will miss you. You know that. But as I have already noted, you won’t go far. Mike Donley, thank you. Thank you.


What They Told Us: Reviewing Last Week’s Key Polls

Rasmussen Reports

Saturday, June 22, 2013

Webster’s Dictionary defines “trust” as “assured reliance on the character, ability, strength or truth of someone or something.” Americans don’t have a lot of it these days as far as the federal government is concerned.

Just 35% of Americans now have a favorable impression of the federal government. That compares to 46% who view their state government favorably and 54% who feel that way about their local government.

Distrust is growing when it comes to the Internal Revenue Service’s targeting of Tea Party and other conservative groups. Sixty-one percent (61%) of voters now believe the IRS actions were politically motivated, and 70% think the orders came from Washington, D.C. Sixty-seven percent (67%) believe it is likely that other government agencies targeted conservative groups as well. All these levels of skepticism are up several points from a month ago.

President Obama and other senior government officials say the National Security Agency’s recently disclosed secret surveillance program has deterred several terrorist attacks, but only 35% of Americans believe they are telling the truth. Forty-five percent (45%) think instead that they are just trying to justify the government’s spying on the phone and e-mail records of millions of innocent Americans.

Not that Americans are rallying to the man who leaked information about the surveillance program to the media. Just 12% view Edward Snowden as a hero, while 21% consider him a traitor. The majority of Americans think he’s something in between the two or feel it’s too early to decide.

After all, 57% think it is at least somewhat likely that public disclosure of the surveillance program has hurt this country’s national security. Still, only 15% think a reporter who gets leaked information from a government whistleblower should be prosecuted for publishing the information.

“None of the public players comes off looking great in the NSA story,” Scott Rasmussen says in his latest weekly newspaper column. “But there is now an opportunity for a healthy debate on the issue. It’s not partisan. President Obama and former Vice President Dick Cheney are on the same side. The nation needs a serious discussion about what kind of oversight and protection is needed to honor the Constitution while still effectively fighting the terrorists who want to end our way of life.”

Neither political party seems to have a trust advantage as voters increasingly sour on official Washington. Democrats and Republicans are now tied on the Generic Congressional Ballot at 39% apiece. This is the first time since June 2009 that both parties are below 40%.

Americans also are fed up with the so-called “revolving door” through which government officials frequently pass to take jobs with the very industries they regulated. Seventy-five percent (75%) favor at least a five-year ban on regulators working for companies they regulate, including 48% who think regulators should be banned for life from working for the companies they oversee. That’s a 16-point jump from two years ago.

Seventy-three percent (73%) don’t think it’s a good idea to let government regulators pass rules without approval from Congress. At the same time, 75% believe members of Congress should be required to publicly disclose all meetings and contacts with regulators and government officials. After all, just six percent (6%) of voters think Congress is doing a good or excellent job these days.

Meanwhile, the president’s overall job approval ratings fell this past week to their lowest level since last August. This is despite the fact that the Rasmussen Consumer and Investor Indexes, which measure daily confidence in both groups, have been at or near six-year highs for several days.

Obama’s national health care law has been a point of contention among voters for much of his presidency, and they continue to give the president lukewarm marks for his handling of health care. Just 36% now rate the president’s handling of health care issues as good or excellent, down from 44% in February and the lowest finding this year.

With the health care law scheduled for full implementation by next year, voters are increasingly pessimistic about the short-term prospects for health care in America. Fifty-seven percent (57%) believe the U.S. health care system is likely to get worse in the next couple years, up from 50% last November.

So other than their local government, what do most Americans have faith in these days?

For one thing, 57% say it is not possible to have a healthy community without churches. Eighty-six percent (86%) think it’s good for a community to have a church provide services such as food banks, disaster relief, schools and hospitals.

At the same time (here comes the federal government again), 41% believe that the U.S. Supreme Court is too hostile towards religion.

Most working Americans (71%) would recommend their company as a good place to work, but then 66% feel their company values them as an employee. By comparison, just 16% of voters think most members of Congress care what they think.

In other news this week:

– Twenty-nine percent (29%) of Likely U.S. Voters now say the country is heading in the right direction. That’s virtually unchanged from a year ago and consistent with voter attitudes for more than three months now.

– Peace talks between the United States and the Taliban to end the war in Afghanistan have stalled, but just one-in-three voters now consider America’s longest-running war a Very Important voting issue.

– Fifty-three percent (53%) of Americans are at least somewhat confident in the stability of the U.S. banking industry today. In July 2008, shortly before the Wall Street meltdown, 68% were confident in U.S. banks.

– Most Americans say they are paying about the same interest rates as last year, but half now expect those rates to rise over the next year.

– Most voters believe tax-exempt groups should publicly disclose all of their donors, even though most recognize some will be harassed by political opponents.

– With the U.S. Supreme Court poised to issue several major end-of-the-term rulings, more voters than ever (40%) think the justices are too liberal.

– With two months to go until the Democratic primary and four months until Election Day, Newark Mayor Cory Booker is in a strong position to become New Jersey’s next U.S. senator.

– Thirty-seven percent (37%) of New Jersey voters would like to see Governor Chris Christie run for president in 2016, and 44% of voters in his home state would vote for him.

Jun 15 2013




Nine Companies Tied to PRISM, Obama Will Be Smacked With Class-Action Lawsuit Wednesday

AOL, Apple, Facebook, Google, Microsoft, PalTalk, Skype, Yahoo! and YouTube will be named in the suit, attorney says

US News

By Steven Nelson

June 11, 2013


Former Justice Department prosecutor Larry Klayman amended an existing lawsuit against Verizon and a slew of Obama administration officials Monday to make it the first class-action lawsuit in response to the publication of a secret court order instructing Verizon to hand over the phone records of millions of American customers on an “ongoing, daily basis.”

Klayman told U.S. News he will file a second class-action lawsuit Wednesday in the U.S. District Court for the District of Columbia targeting government officials and each of the nine companies listed in a leaked National Security Agency slideshow as participants in the government’s PRISM program.

According to the slideshow, the PRISM program allows government agents direct, real-time access to the servers of nine major tech companies, including AOL, Apple, Facebook, Google, Microsoft, PalTalk, Skype, Yahoo! and YouTube.

U.S. News did not seek comment from the companies, all of which have denied any knowledge of or participation in the PRISM program.

Klayman said he hopes the two lawsuits will be considered jointly as companion cases.

The class-action lawsuit against Verizon says the defendants violated customers’ “reasonable expectation of privacy, free speech and association, right to be free of unreasonable searches and seizures and due process rights.”

“This case challenges the legality of Defendants’ participation and conduct in a secret and illegal government scheme to intercept and analyze vast quantities of domestic telephone communications,” says the lawsuit against Verizon, which also names as defendants President Barack Obama, Attorney General Eric Holder, NSA director Keith Alexander and federal judge Roger Vinson, the FISA court judge who approved the leaked April order.

Klayman told U.S. News the Verizon lawsuit will be served to the named defendants. “Either they will file an answer or move to dismiss” the complaint, Klayman said, “[but] we’re confident the case will proceed.”

Within a few months, Klayman said, the court will likely define the “class” the suit seeks to represent by ruling that “everyone’s in” or by allowing Verizon customers to either opt in or opt out of the class. Currently the suit represents only a Pennsylvania couple.

Sen. Rand Paul, R-Ky., said during a Sunday appearance on Fox News that he intends to file a class-action lawsuit against Verizon and the Internet companies over the surveillance programs, but it’s unclear how exactly he will proceed with the plans.

“I’m going to be asking all the Internet providers and all of the phone companies: Ask your customers to join me in a class action lawsuit,” Paul said Sunday. “If we get 10 million Americans saying we don’t want our phone records looked at, then maybe someone will wake up and something will change in Washington.”

Paul “effectively endorsed our case,” Klayman said, speculating that the senator “must have known about it.” Although a copy of the Verizon lawsuit was sent to Paul’s office, the attorney said he received no response. Klayman founded the pro-transparency legal group Judicial Watch in 1994 and currently leads an organization called Freedom Watch.

Documents exposing the Verizon phone-record collection order – which Sen. Dianne Feinstein, D-Calif., said was a routine reauthorization of a seven-year-old practice – and the NSA’s PRISM program were handed over to the Guardian and Washington Post by former Booz Allen Hamilton employee Edward Snowden. He traveled to Hong Kong before leaking the documents and voluntarily revealed his identity Sunday.



Defending NSA Prism’s Big Data Tools

The more you know about NSA’s Accumulo system and graph analysis, the less likely you are to suspect Prism is a privacy-invading fishing expedition.

By Doug Henschen

Executive Editor, InformationWeek

June 11, 2013 02:26 PM


It’s understandable that democracy-loving citizens everywhere are outraged by the idea that the U.S. Government has back-door access to digital details surrounding email messages, phone conversations, video chats, social networks and more on the servers of mainstream service providers including Microsoft, Google, Yahoo, Facebook, YouTube, Skype and Apple.

But the more you know about the technologies being used by the National Security Agency (NSA), the agency behind the controversial Prism program revealed last week by whistleblower Edward Snowden, the less likely you are to view the project as a ham-fisted effort that’s “trading a cherished American value for an unproven theory,” as one opinion piece contrasted personal privacy with big data analysis.

The centerpiece of the NSA’s data-processing capability is Accumulo, a highly distributed, massively parallel processing key/value store capable of analyzing structured and unstructured data. Accumulo is based on Google’s BigTable data model, but NSA added a cell-level security feature that makes it possible to set access controls on individual bits of data. Without that capability, valuable information might remain out of reach to intelligence analysts who would otherwise have to wait for sanitized data sets scrubbed of personally identifiable information.
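Accumulo’s real API is Java (cells are written with visibility expressions and scanned with a reader’s authorizations); the Python sketch below only illustrates the cell-level security idea described above, using made-up data and simplified expressions (`&` for AND, `|` for OR; Accumulo itself supports nested boolean expressions).

```python
# Illustrative sketch of cell-level visibility, NOT Accumulo's actual API.
# Each cell carries a visibility expression; a scan returns only the cells
# whose expression is satisfied by the reader's authorizations.

def visible(expression, authorizations):
    """Evaluate a flat visibility expression like 'analyst&pii' or 'pii|audit'."""
    if "&" in expression:
        return all(term in authorizations for term in expression.split("&"))
    if "|" in expression:
        return any(term in authorizations for term in expression.split("|"))
    return expression in authorizations

def scan(table, authorizations):
    """Return only the cells this reader is cleared to see."""
    return {key: value for key, (value, vis) in table.items()
            if visible(vis, authorizations)}

# Toy table: key -> (value, visibility label)
table = {
    "record1:phone":    ("555-0100", "analyst&pii"),  # PII cell: needs both labels
    "record1:metadata": ("3 calls", "analyst"),       # metadata cell
}

print(scan(table, {"analyst"}))         # metadata only; the phone number is filtered out
print(scan(table, {"analyst", "pii"}))  # both cells
```

This is the point made above: the same table can serve analysts with different clearances, with filtering applied per cell rather than by producing separate sanitized copies of the data.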

As InformationWeek reported last September, the NSA has shared Accumulo with the Apache Foundation, and the technology has since been commercialized by Sqrrl, a startup launched by six former NSA employees together with former White House cybersecurity strategy director (and now Sqrrl CEO) Ely Kahn.


“The reason NSA built Accumulo and didn’t go with another open source project, like HBase or Cassandra, is that they needed a platform where they could tag every single piece of data with a security label that dictates how people can access that data and who can access that data,” said Kahn in an interview with InformationWeek.

Having left government employment in 2010, Kahn says he has no knowledge of the Prism program and what information the NSA might be collecting, but he notes that Accumulo makes it possible to interrogate certain details while blocking access to personally identifiable information. This capability is likely among the things James R. Clapper, the U.S. director of National Intelligence, was referring to in a statement on the Prism disclosure that mentioned “numerous safeguards that protect privacy and civil liberties.”


Are They Catching Bad Guys?

So the NSA can investigate data with limits, but what good is partial information? One of Accumulo’s strengths is finding connections among seemingly unrelated information. “By bringing data sets together, [Accumulo] allowed us to see things in the data that we didn’t necessarily see from looking at the data from one point or another,” Dave Hurry, head of NSA’s computer science research section, told InformationWeek last fall. Accumulo gives NSA the ability “to take data and to stretch it in new ways so that you can find out how to associate it with another piece of data and find those threats.”

The power of this capability is finding patterns in seemingly innocuous public network data — which is how one might describe the data accessed through the Prism program — yet those patterns might somehow correlate with, say, a database of known terrorists or data on known cyber warfare initiatives.

Sqrrl has supplemented the Accumulo technology with analytical tools including SQL interfaces, statistical analytics interfaces, text search and graph search engines, and there’s little doubt the NSA has done the same, according to Kahn. Graph search, in particular, is a powerful tool for investigation, as the NSA itself revealed last month when it shared at a Carnegie Mellon technical conference an in-depth presentation on the 4.4-trillion-node graph database it’s running on top of Accumulo.

Nodes are essentially bits of information — phone numbers, numbers called, locations — and the relationships between those nodes are edges. NSA’s graph uncovered 70.4 trillion edges among those 4.4 trillion nodes. That’s truly an ocean of information, but just as Facebook’s graph database can help you track down a long-lost high school classmate within seconds, security-oriented graph databases can quickly spot threats.
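The node-and-edge idea can be made concrete with a toy example. The Python sketch below (entirely hypothetical data, not any real system) uses breadth-first search, a standard graph-query building block, to find the shortest chain of contacts linking two identifiers:

```python
from collections import deque

# Hypothetical toy graph: nodes are identifiers (here, phone numbers),
# edges are observed contacts between them.
edges = {
    "555-0100": ["555-0101", "555-0102"],
    "555-0101": ["555-0100", "555-0199"],
    "555-0102": ["555-0100"],
    "555-0199": ["555-0101"],  # a watch-listed number in this toy scenario
}

def shortest_chain(graph, start, target):
    """Breadth-first search: return the shortest path of nodes from start
    to target, or None if they are not connected."""
    queue, seen = deque([[start]]), {start}
    while queue:
        path = queue.popleft()
        if path[-1] == target:
            return path
        for nxt in graph.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

print(shortest_chain(edges, "555-0102", "555-0199"))
# ['555-0102', '555-0100', '555-0101', '555-0199']
```

At trillion-node scale the engineering is vastly harder, but the query shape is the same: start from a known node and walk edges outward to find who is connected to whom, and how closely.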

Kahn says a Sqrrl partner company that does graph analysis of internal network activity for security purposes recently identified suspicious activity using a graph algorithm. “Five days later, they got a knock on the door from the FBI letting them know that data was being exfiltrated from their network, likely by a foreign entity,” Kahn reports.

As we’ve reported, graph database technology dates back to the 1950s, but only recently has it advanced to truly big data scale, with Facebook exposing its Graph Search capabilities in January and NSA sharing details of its graph search capabilities last month.

Where prior intelligence techniques have largely been based on knowing patterns and then alerting authorities when those patterns are detected, security and intelligence analysts now rely on big data to provide more powerful capabilities than analytics alone.

“Graph analysis is just one really good technique for finding unknown patterns in data,” Kahn explains.


Do You Trust The Government?

In the end, assurances from Clapper, a former White House employee like Kahn, or even President Barack Obama may do little to assuage the concerns of privacy hawks, critics inside government or large swaths of American citizens. But those who know the technology used by the NSA know that Prism is not a simplistic, “collect first, ask questions later” expedition, and it’s not based on an “unproven theory.”

It’s likely no coincidence that suppliers of data to Prism such as Google have also been able to uncover espionage directed by foreign governments. In fact, a bipartisan Commission on the Theft of American Intellectual Property last month recommended increasing the budget of the FBI and Department of Justice to investigate trade theft, amending U.S. counter-espionage laws, and encouraging U.S. businesses to take the information security threat more seriously.

One government insider informs InformationWeek that he knows with certainty that “semantic and visual analytics tools have prevented multiple acts of terrorism.” That insight predates recent advances in graph analysis that are undoubtedly giving the U.S. Government even more powerful tools. Privacy concerns and the desire for checks on government access to private information must be considered, but we can’t naively turn a blind eye to very real threats by failing to make the most of the advanced big data intelligence tools now at our disposal.


Snowden’s Security Breach: Is He Really the Only One?

By DAVID FRANCIS, The Fiscal Times

June 12, 2013


Three days after National Security Agency leaker Edward Snowden went public, the mystery around him and the information he illegally gave to reporters shows no signs of abating.

Snowden has simply vanished. He checked out of his Hong Kong hotel Monday and no one has seen him since. Whether he’s been nabbed by authorities or is on the run is anyone’s guess.

But the bigger mystery centers on what else Snowden knows. It’s not clear if he has more top-secret documents in his possession on top of the slideshow that revealed the PRISM program and a copy of an order from the Foreign Intelligence Surveillance Court.

Are there other documents that he’s in possession of as well? And if he does have other documents, what do they contain? The information in the slide show he leaked may be so sensitive that The Washington Post decided to run only four of 41 slides. What was on the other 37?

According to reports, despite top-level security clearance, Snowden has documents that he should not have had access to. For instance, administration officials questioned how he could have accessed the FISA court order, a document that should have been off-limits to someone in his position as an NSA contractor. Former NSA inspector general Joel Brenner suggested that Snowden might have used his IT position to give himself administrator access across multiple databases of classified materials.

“So many things are classified and so many things are top secret,” William Hartung, director of the Arms and Security Project at the Center for International Policy, told The Fiscal Times. “It can be some amazingly compromising information.”

This information includes an 18-page order from President Obama ordering the NSA and CIA to draw up a list of cyber warfare targets. Snowden said he leaked the cyber document as well.

Cyber directives and an NSA surveillance PowerPoint are likely the tip of the classified iceberg. Snowden might have had access to nuclear weapons plans or infrastructure vulnerability reports. He might have documents revealing the location of missile sites, or lists of CIA officers. He’s claimed as much, telling the Guardian’s Glenn Greenwald he had “full access to the rosters of everyone working at the NSA, the entire intelligence community, and undercover assets all around the world, the locations of every station we have, what their missions are and so forth.”



Administration and intelligence officials are downplaying these claims, telling The Washington Post that they doubt he had such widespread access. But if Snowden does have more sensitive information he has yet to leak, he could join other defense contractors who stole documents and changed history.

While working as a contractor in the Netherlands for the British nuclear power company the URENCO Group in the 1970s, Pakistani nuclear scientist A.Q. Khan began stealing designs for nuclear centrifuges. He subsequently sent them to Pakistan, giving the country the ability to build nuclear weapons.

When Dutch intelligence began to investigate, Khan fled to Pakistan and oversaw the construction of Pakistan’s military arsenal. Pakistan subsequently sold nuclear weapons plans to North Korea, Libya and Iran, drastically shifting the balance of world power (the Libyan nuclear weapons program was dismantled in 2003).



Even if Snowden did not have access to nuclear plans, it’s likely that someone like him – a civilian with security clearance – does. There are 500,000 private contractors with top-secret access. When government employees are included, that number swells to 1.4 million with security clearance, according to The Washington Post.

Steven Aftergood, director of the Federation of American Scientists’ Project on Government Secrecy and an expert on security clearance, says that gaining the necessary clearance is not as difficult as one might think.

“It’s a process of vetting either government employees or contractors to ensure that they are trustworthy and loyal to the United States and therefore eligible to have access to classified info,” he told The Fiscal Times. “There are disqualifying factors: heavily in debt, an alcohol or drug problem, or a history of sexual misconduct or criminal behavior. Otherwise, chances are you can get it.”

But as Snowden illustrates, this does not always work. According to reports, the rush to fill security clearance jobs has diluted the hiring and vetting process, making it possible for people to slip through the cracks.



Right now, Snowden has few good options. It’s possible he’s already been captured. It’s possible that he’s on the run. Without a fake passport, he can’t leave China: The electronic dragnet he felt obligated to expose would identify him immediately.

This backs him into a corner. He could ask for asylum at an embassy of a country that does not have an extradition treaty with the United States. He could take refuge at the Russian embassy, a country that has already offered him asylum and is very interested in the documents he has. He could also approach the Chinese government to try to trade the secrets he has for a safe home there.


Colluding with the Russians or the Chinese would end the debate about his status as a whistleblower. It would make him a spy. But maybe Snowden has a fake identity and is already off the grid.

At this point in the mystery, it wouldn’t be a surprise if the high school dropout were smarter than us all.




Opening Summary — SAC-D (Budget Request)

06/11/2013 02:29 PM CDT

As Delivered by Secretary of Defense Chuck Hagel, Washington D.C., Tuesday, June 11, 2013


Chairman Durbin, Ranking Member Cochran, distinguished members of the subcommittee, thank you for the opportunity to discuss the President’s F.Y. 2014 budget request for the Department of Defense.

I also appreciate this subcommittee’s continued support of our men and women in uniform and our civilian work force and their families. As we discuss numbers, budgets, and strategic priorities, we will not lose sight of these men and women serving across the globe. As you all know, their well-being depends on the decisions we make here in Washington.

The President has requested $526.6 billion for the Department of Defense’s F.Y. 2014 base budget and $79.4 billion for overseas contingency operations (OCO).

My written statement, Mr. Chairman, as you know, contains details on both budget requests. This morning, allow me to, very briefly, focus on three areas before I take your questions:

• first, the continued budget challenges facing the Department in F.Y. 2013 as a result of sequestration, as you have noted as well as Senator Cochran;

• second, the Department’s F.Y. 2014 budget request;

• and third, how the Department is preparing for the future budget uncertainty and the prospect of further reduced resources as a result of sequestration.

As you all know, the Department has been forced to implement deep and abrupt cuts in the current fiscal year because of sequestration. According to the latest guidance from the Office of Management and Budget, the Department must cut $37 billion in spending through the remainder of this fiscal year.

With our internal decision to shift the impact of sequestration away from those serving in harm’s way and force readiness, the cuts fall heavily on DOD’s accounts that train and equip those who will deploy in the future. The Department is also experiencing higher wartime costs than expected. As a result of these factors, the Department is facing a shortfall of more than $30 billion in our operation and maintenance (O&M) account for F.Y. 2013.

To deal with the shortfall, the Department has cut back sharply on facilities maintenance, instituted hiring freezes, cut overhead spending, reduced important, but lower priority programs, directed furloughs of nearly 700,000 civilian employees, and submitted, as you have noted, a $9.6 billion reprogramming request to Congress. And we ask this subcommittee for your assistance in providing rapid review and approval of this critical reprogramming request.

Given the scale of this shortfall, the reprogramming and other steps we’ve taken to cut nonessential spending are not enough. While we have protected spending to sustain the war effort and defend America’s vital strategic interests, the Department’s day-to-day activities will be significantly disrupted for the remainder of this fiscal year.

Each of the military services has begun to significantly reduce training and maintenance of non-deployable operating forces. As you have both noted, you have had the Chiefs before this committee, and they have made some very significant detailed presentations and accounted for these cuts. For example, the Army has stopped rotations at its key combat training centers for all but deploying units. More than a dozen combat coded Air Force squadrons either, already have, or will soon stop flying, and the Navy has curtailed many deployments.

To avoid even more significant reductions to military readiness, I directed furloughs of up to 11 days for most of the Department’s 800,000 civilian personnel. Mr. Chairman, I made this decision very reluctantly, because I recognize the significant hardship it places on civilian personnel across our country, and especially on their families. But the current budget environment requires difficult decisions and options.

Now, let me turn to F.Y. 2014. The President’s F.Y. 2014 budget continues to implement the $487 billion in spending reductions over the next 10 years agreed to in the Budget Control Act of 2011. If the sequester-related provisions of the Budget Control Act are not changed, F.Y. 2014 funding for national defense programs will be subject to an additional $52 billion reduction in DOD funding. And if there are no changes, continued sequestrations will result in roughly $500 billion in additional reductions to defense spending over the next 10 years.

The President’s F.Y. 2014 budget replaces sequestration and gives the Department the time and flexibility to plan and implement spending reductions wisely and responsibly.

In particular, this budget enables the Department to support troops still at war in Afghanistan, protect readiness, modernize the military’s aging weapons inventory in keeping with the President’s strategic guidance and sustain the high quality of the all-volunteer force that you noted that was in General Dempsey’s speech at the National Press Club.

This budget also continues the Department’s approach of the last couple of years of targeting growing costs in support areas like overhead, acquisition and pay and benefits.

Over the next five years, DOD has identified $34 billion in new savings across these categories. This includes weapons program restructuring and terminations that will achieve $8.2 billion in savings, slowdowns in military construction and reductions in other lower priority programs. Our military compensation package preserves DOD’s world-class pay and benefits while putting our military on a more sustainable path to the future. It includes changes to the TRICARE program to bring the beneficiaries’ cost share closer to the levels envisioned when the program was first implemented.

The Department of Defense also must be able to eliminate excess infrastructure. The President’s F.Y. 2014 budget requests authorization for one round of Base Realignment and Closure (BRAC) in 2015. BRAC, as we all recognize, is an imperfect process. And there are upfront costs. But in the long term, there are significant savings. The previous rounds of BRAC are saving $12 billion annually. We cannot justify continuing to fund unnecessary infrastructure while we are reducing our force structure. Since 2003, DOD has divested more than 100 foreign bases and operations, and we are on schedule to close or consolidate over 20 more overseas operations.


Although there are clearly opportunities to achieve significant savings by improving efficiency, consolidations and reducing overhead, the scale of the current spending reductions will also require cuts and changes to military operations.

The fiscal 2014 budget request seeks to further align defense programs with the President’s defense strategic guidance. While continuing to reduce the size of the ground forces and retire aging aircraft and ships, this budget invests in key elements of our defense strategy, including implementing our rebalance to the Asia-Pacific region, maintaining a safe, secure and effective nuclear stockpile, increasing investment in cyber-capabilities and sustaining the growth of special operations forces.

Finally, this budget seeks to preserve a combat-ready force and sustain the high-quality, all-volunteer force.

Last point, the F.Y. 2014 budget reflects DOD’s best efforts to match ends, ways and means during a period of very intense fiscal uncertainty. It is obvious that significant changes to the Department’s top line spending would require changes to this budget plan. Consequently, I directed, as you have noted, a Strategic Choices and Management Review in order to assess the potential impact of further reductions and plan for those continued reductions. I have received the initial internal results of this review and I am currently reviewing those options and those choices.

The Defense Department will continue to find new ways to operate more affordably, efficiently and effectively. However, as I’ve stated, continued cuts on the scale and the timeline of sequestration will require significant reductions in military capabilities and the scope of our activities around the world.

The President’s F.Y. 2014 budget sustains our military strength in an environment of constrained resources, giving DOD the time and the flexibility to make the necessary reductions and adjustments over a ten year period.

Hard choices, Mr. Chairman, will have to be made over these next few years. In the past, many modest reforms to personnel and benefits, along with efforts to reduce infrastructure and restructure acquisition programs were met with fierce political resistance, and they were never implemented. We are now in a different fiscal environment. New realities are forcing us to more fully confront these tough and painful choices. And to make the reforms necessary to put this Department on a path to sustain our military strength for the 21st century and meet these new and complicated threats we will have to do things differently. This will require the continued partnership of Congress.

Thank you.




Springfield to Get Drone Headquarters

Andrew McGinn

Source: Dayton Daily News, Ohio

Created: June 12, 2013


June 12–SPRINGFIELD — An office on Springfield’s eastern edge will serve as the base of operations for Ohio and Indiana’s joint effort to become a test site for unmanned aerial systems, an industry that’s predicted to create thousands of new jobs.

The Ohio/Indiana Unmanned Aerial Systems Center and Test Complex will be housed in leased office space at the Nextedge Applied Research and Technology Park along U.S. 40 in a building owned by Advanced Virtual Engine Test Cell, better known as Avetec.

The Ohio Department of Transportation, which will manage the complex, picked the space from among 15 possible sites, 12 of which were located in Montgomery and Greene counties.

The state controlling board Monday approved ODOT’s two-year lease at a total cost of $70,000 that will run from July 1 until June 30, 2015.

It’s hoped the creation of the test complex will bolster the two-state effort to win one of six UAS test sites being created nationally later this year by the Federal Aviation Administration. The FAA will use the sites to gather safety and privacy data needed before drones are allowed into manned airspace in 2015.

An industry report in March predicted that unmanned aircraft will create more than 2,700 new Ohio jobs by 2025 and generate $2.1 billion in development. That same report by the Association for Unmanned Vehicle Systems International stated the selection of the test sites will help determine where jobs flow.

Two dozen applicants representing 23 states are in the running for the six test sites, said Joe Zeis, executive vice president and chief strategic officer for the Dayton Development Coalition.

Other states have upped their efforts.

North Dakota recently created the Northern Plains UAS Test Site, which will be led by a former Air National Guard wing commander, with $1 million to pursue an FAA test-site designation. The governor there last month committed an additional $4 million to be spent if the FAA picks his state.


“The stakes are pretty high,” Rob Nichols, press secretary for Ohio Gov. John Kasich, said Tuesday, preferring not to say what, exactly, the test complex here will do for the Ohio-Indiana application.

The 2,060 square feet of leased space would serve not only as a base of operations for the flight testing of unmanned aircraft in Ohio and Indiana but also as a site for data analysis, modeling, simulations and mission planning, according to a news release from state Sen. Chris Widener, R-Springfield.

The not-for-profit Avetec already has modeling and simulation capabilities. The organization strives to reduce the cost and time it might take the military to develop and test jet engines by doing it virtually.

“It’s a coup on one hand, but it more reflects the positive things Springfield is bringing to the table for UAVs in this region,” Tom Franzen, assistant city manager and director of economic development, said.

Even if the FAA doesn’t pick Ohio and Indiana, drone testing will be done in the region, Nichols said, because of a NASA challenge grant for sense-and-avoid technology.

With the presence of Wright-Patterson Air Force Base, the Dayton-Springfield region has long been viewed as the state’s bastion of UAS research and development. The availability of restricted airspace in neighboring Indiana makes for a “wonderfully complementary partnership,” Zeis said.

Aside from a smattering of businesses locally already involved in UAS, Springfield’s other chief contribution to that partnership is the Springfield-Beckley Municipal Airport, where drones will be launched and recovered on their way to and from restricted airspace. Unmanned aircraft also will be flown from the Wilmington Air Park.

Avetec — and Nextedge in general — has been connected via a fiber-optic ring since 2010 to the Springfield airport and to Wright-Patterson, a feature that factored into the site selection.

“That site adds a tremendous amount of value to the state’s efforts, not just ours,” Franzen said.

The selection also renews hope for the 205-acre local technology park that had stagnated so much during the recession that the bank eventually auctioned it last year for a fraction of the nearly $15 million invested in it. That investment included $6.3 million in public funding for the main infrastructure.

Developed in 2004, when “nobody even knew what a UAV was,” Franzen said, Nextedge might finally get to live up to its name by attracting drone developers.

“This will be a nice reminder to folks that we have this amenity in the region,” Franzen said.




IEEE Spectrum

Farming by the Numbers

Precision agriculture brings computing’s accuracy to the ancient art of cultivation

By Ariel Bleicher

Posted 30 May 2013 | 20:44 GMT


One bone-chilling September morning, Clay Mitchell drives his combine harvester through a field of amber Iowa corn, elsewhere known as maize. Beneath him, the monstrous machine reaps and threshes the plants, winnows the grain from the chaff, and then blows the yellow kernels into a holding tank, where they pile up in mounds as lustrous as a leprechaun’s gold.

Thanks to this mechanization as well as to advances in seed genetics and chemical fertilizers, Mitchell and his neighbors can now wrest higher yields than any generation before them. They are some of the most successful corn producers in the United States, which grows about 40 percent of the world’s corn supply. But the big machines that extract this bounty have taken a toll on the land. Decades of tillage and compaction have weakened the soil, making it prone to heavy erosion from wind and rain.

Agronomists estimate that since large-scale farming began here about 150 years ago, Iowa has lost roughly half its topsoil. In the mid-19th century, the average farmer could dig a hole in his field and stick his arm in as deep as the elbow before he felt something other than black, nutrient-rich earth. Today, he’s lucky if he gets as far as his wrist. Recent measurements show that in many parts of the state, topsoil is washing away 10 to 50 times as fast as it can re-form. There is evidence that such rapid degradation is already depressing yields.

What’s happening in Iowa isn’t unique. Studies suggest that every year, one hectare of farmland anywhere on Earth loses on average 30 metric tons of soil—enough to fill a large shipping container. In some places, erosion has degraded farmland so severely that it has been abandoned. As the world’s population rises, farmers are faced with having to grow more crops on less soil. “Globally, we are on the fringe of not being able to meet the food demand that exists today,” says Richard Cruse, a professor of agronomy at Iowa State University.

Yet there may be reason for hope. A new tech-driven trend—some even say revolution—in agriculture is beginning to take hold in Iowa and in many parts of the world. Known as precision agriculture, the practice promises to reconcile mass food production with responsible land stewardship. It is perhaps best summed up by the oft-cited mantra “Doing the right thing, at the right place, at the right time, in the right way.”

The concept harks back to agriculture’s early days, when a farmer, working by hand, could give each plant the exact amount of nutrients, water, weeding, and pest killing it needed without disturbing the soil very much. That era is long gone. But now information and communications technologies are bringing it back. By adopting smart sensors and sophisticated mapping tools, farmers like Mitchell are aiming for the same level of meticulousness as their pick-and-hoe-wielding predecessors—on scales larger than was ever possible before.

Mitchell was one of the first commercial farmers to seriously pursue modern precision farming. Now 40 years old, he shares an engineer’s compulsion to continually improve the less-than-perfect. “When it comes to striving for absolute accuracy, Clay’s as dead-on as they get,” says Paul Schrimpf, a group editor at the trade publisher PrecisionAg.

Mitchell grew up on the same land he still farms with his father and uncle in eastern Iowa, where five generations of Mitchells have made their livelihoods since the 1870s. One of his earliest memories is of helping his father build a system of sensors and hardwired circuits for automating the flow of grain through a series of bins that dry and store the harvest. “As soon as I could turn a screwdriver, my dad would give me 10 cents for every relay I would wire up,” Clay recalls. Those old switches have since been replaced with programmable logic. Today, father and son control the bins through their smartphones.

As a young man, Mitchell took a break from farming to study biomedical engineering at Harvard University. But his heart was in Iowa. When he returned in 2000, he immediately began looking for ways to streamline his family’s operation.

The first challenge he took on was fine-tuning the process of planting and fertilizing seeds. Most of his neighbors would simply broadcast fertilizer over an entire field, till it into the ground, and then drop seeds in neat rows. But the Mitchells wanted to avoid both broadcasting, which wastes fertilizer on unseeded rows, and tilling, which intensifies erosion. So they bought a tractor attachment that cut grooves in the soil and injected fertilizer into the grooves. They figured they could then set seeds directly on top of each nutrient band.

They were wrong. They found it impossible to guide the planter along the same path as the fertilizer applicator. “You just couldn’t visually see where you drove before,” Mitchell says. He pondered solutions involving lasers and cameras. Then one day, he read that civil engineers were starting to use specialized navigation equipment to control earthwork machinery. “And I thought, ‘That’s perfect!’”

The product he discovered relies on signals from GPS satellites. But it is orders of magnitude more accurate than an off-the-shelf GPS unit. Using a signal-processing scheme known as real-time kinematics, or RTK, it can boost the accuracy of a GPS reading from about 3 meters to within just a few centimeters.

Unlike traditional GPS navigation, RTK requires the installation of a local base station, which Mitchell affixed to his parents’ old ham radio tower. He mounted a second receiver on his tractor’s roof. When he’s using the system, the fixed receiver measures the distances to several satellites using GPS signals, compares those to the real distances calculated from its known position, and then reports the errors to the tractor over a radio link. (It sends an update about every 50 milliseconds.) These correction codes enable the tractor to compute its location with greater certainty.
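The differencing scheme described above can be sketched in a few lines. This is a toy 2-D setup with invented satellite positions and error biases; the point is only that errors shared by base and rover cancel when the base’s measured-minus-true corrections are subtracted at the rover:

```python
import math

def ranges(pos, sats):
    """Geometric distances from a position to each satellite."""
    return [math.dist(pos, s) for s in sats]

# Hypothetical geometry: the rover's true position is unknown to it,
# but the base station's surveyed position is known exactly.
sats = [(0.0, 20200.0), (15000.0, 18000.0), (-12000.0, 19000.0)]
base_true = (0.0, 0.0)
rover_true = (500.0, 0.0)

# Both receivers see the same correlated errors (satellite clocks,
# atmosphere), modeled here as one bias per satellite (meters).
bias = [3.2, -1.7, 2.5]

base_measured = [r + b for r, b in zip(ranges(base_true, sats), bias)]
rover_measured = [r + b for r, b in zip(ranges(rover_true, sats), bias)]

# The base compares measured ranges to the true ones computed from its
# known position and broadcasts the differences as corrections.
corrections = [m - r for m, r in zip(base_measured, ranges(base_true, sats))]

# The rover subtracts the corrections; the shared biases cancel.
rover_corrected = [m - c for m, c in zip(rover_measured, corrections)]
```

In this idealized model the cancellation is exact; in practice the residual error grows with the distance between base and rover, which is why the base station sits on the farm itself.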

And RTK can refine this calculation even further. A typical GPS receiver determines distance by internally generating the same pseudorandom code that a satellite transmits. Then, by comparing the time shift between the two identical sequences, it calculates the satellite’s range. But because tens of microseconds pass between each bit, the receiver can resolve timing errors with only limited fidelity. RTK solves this problem by also tracking the waveform that carries the code, which cycles at a much higher frequency. This enables the receiver to estimate the start of each cycle with finer precision. Correction signals from the base station then help the receiver determine the exact number of wavelengths between it and the satellite.
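The code-versus-carrier precision gap comes directly from the two signals’ rates, and the ambiguity bookkeeping is simple arithmetic. A minimal sketch, with a made-up satellite range and the integer ambiguity assumed to be already resolved:

```python
C = 299_792_458.0            # speed of light, m/s
L1_FREQ = 1_575_420_000.0    # GPS L1 carrier frequency, Hz
CODE_RATE = 1_023_000.0      # C/A pseudorandom code rate, chips/s

wavelength = C / L1_FREQ     # one carrier cycle: about 0.19 m
chip_length = C / CODE_RATE  # one code chip: about 293 m

# Resolving timing to roughly 1% of a code chip gives metre-level
# ranging; 1% of a carrier cycle gives millimetre-level ranging.
code_resolution = 0.01 * chip_length       # ~2.9 m
carrier_resolution = 0.01 * wavelength     # ~0.002 m

true_range = 20_200_123.456  # hypothetical satellite range, m

# The receiver observes only the fractional carrier phase; the
# whole-cycle count N is the "integer ambiguity" that base-station
# corrections help pin down. Here we assume N was resolved correctly.
cycles = true_range / wavelength
N = int(cycles)
fraction = cycles - N
reconstructed = (N + fraction) * wavelength
```

With the ambiguity fixed, the range reconstructs to sub-millimetre level in this idealized example; real receivers must also contend with noise and cycle slips.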

Today, RTK and similar systems are most commonly used in agriculture for autosteering, which keeps a vehicle moving autonomously along a straight course. (Drivers must still make turns manually.) Autosteering has become so popular among farmers that tractor manufacturers now build it into all new models. But back then, few growers had heard of RTK. “People out here thought it was absurd,” Mitchell says. “Nobody saw an application for it.”

To Mitchell, RTK was the gateway to ever more sophisticated precision tools. He installed autosteering in his tractor and, with his father’s help, rigged up hydraulic steering and autocontrol in the planter and fertilizer applicator so that the attachments wouldn’t sway off course. He wired up a laptop computer inside the tractor’s cab, and he loaded it with software that maps his fields and keeps track of where he has been and what he has done. For instance, he can use the program to record where he has injected fertilizer and direct the planter to drop seeds in the same spots. The setup is so precise that it allows him to space each seed exactly between last season’s stalks, so that the new shoots won’t struggle to push through the debris.

Of all the Mitchells’ various cost-cutting, yield-boosting inventions, the capstone is their sprayer. Most farmers want to spritz their plants with precisely the amount of pesticides and herbicides that manufacturers recommend. They know that overtreating costs money and can pollute watersheds and damage crops. Undertreating, on the other hand, can breed chemical-resistant superweeds and pests by killing off all but the most hardy invaders. In practice, though, it’s easy to pass over parts of a field twice or to speed through turns, thereby double-dosing some plants while lightly dosing others.

Autosteering helps, but only some. Farmers still must manually steer their sprayers’ giant winglike booms—the Mitchells’ is 18 meters long—around field boundaries and waterways. These are places where spray rates vary wildly, simply because the boom moves faster on the outside of a turn than on the inside. “The errors are so great because the machinery doesn’t compensate,” Mitchell says.

The Mitchells’ sprayer is a John Deere model from 1998. Over the years, they have beefed it up to do things that even today’s commercial sprayers can’t do. “You can’t just go out and buy this,” Mitchell says. One of the earliest modifications they made was to connect a programmable logic controller to valves that turn the spray nozzles on and off. They linked the controller to a display inside the cab and loaded the computer with digital maps of their fields. Now when Mitchell sprays a field, the RTK-equipped machine automatically clicks off any nozzles that pass over areas he has already sprayed or doesn’t want to hit.
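The map-driven nozzle shutoff can be modeled as a coverage grid. The Mitchells’ actual controller logic isn’t published, so this is only a sketch; the cell size and the exclusion zone are purely illustrative:

```python
# Each grid cell records whether it has been sprayed; a nozzle passing
# over a covered or excluded cell is switched off.
CELL = 0.5  # metres per grid cell (illustrative)

sprayed = set()
excluded = {(10, 4), (10, 5)}  # e.g. cells over a waterway

def cell(x, y):
    """Map a field position (metres) to its grid cell."""
    return (int(x // CELL), int(y // CELL))

def nozzle_should_spray(x, y):
    c = cell(x, y)
    return c not in sprayed and c not in excluded

def record_spray(x, y):
    sprayed.add(cell(x, y))

# First pass covers a cell; a second pass over it shuts the nozzle off.
assert nozzle_should_spray(3.1, 2.0)
record_spray(3.1, 2.0)
assert not nozzle_should_spray(3.2, 2.1)  # falls in the same 0.5 m cell
```

A production controller would also account for nozzle swath width and GPS latency, but the core decision is this lookup.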

The next challenge was making dose rates consistent along the radius of a turn. First the Mitchells replaced each nozzle with a set of three spouts of different sizes. This way, the machine could toggle among the three rates without having to increase pressure, which could cause droplets to drift. Then they stuck a potentiometer on the machine’s front axle, enabling the control system to calculate the relative speed of each section of the boom and vary its spray rates accordingly.
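The geometry behind the turn compensation is straightforward: during a turn with yaw rate ω, a boom section at signed lateral offset d from the vehicle’s centerline moves over the ground at v + ωd, so holding the application rate constant means scaling each section’s flow with its own speed. A sketch with illustrative numbers (boom layout and rates are not the Mitchells’ actual figures):

```python
def section_speeds(v_center, yaw_rate, offsets):
    """Ground speed (m/s) of each boom section during a turn.

    v_center: vehicle speed in m/s; yaw_rate: rad/s (positive toward
    the positive-offset side); offsets: signed lateral distances (m)
    of each section from the centreline.
    """
    return [v_center + yaw_rate * d for d in offsets]

def required_flow(app_rate, swath, speed):
    """Flow (L/s) for a section of width `swath` (m) moving at `speed`
    (m/s) to deposit `app_rate` litres per square metre."""
    return app_rate * swath * speed

# An 18 m boom split into nine 2 m sections, centre offsets -8..+8 m.
offsets = [-8, -6, -4, -2, 0, 2, 4, 6, 8]
speeds = section_speeds(4.0, 0.25, offsets)  # inside 2 m/s, outside 6 m/s
flows = [required_flow(0.02, 2.0, s) for s in speeds]
```

The outside sections need three times the flow of the inside ones in this example, which is why the Mitchells fitted each position with three spout sizes: the controller can step the rate without raising pressure and causing drift.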

Yet Mitchell and his father felt the system could be even more exact. Because it relied on RTK readings to determine acceleration, it would switch the spray rate about one second after the sprayer’s speed had actually changed. “There was a lag,” Mitchell says. “So when you’d be speeding up or slowing down rapidly, like you do in small fields, you’d always be putting on the wrong rate.”

The Mitchells solved this problem by feeding data from the driver’s joystick accelerator to the logic controller. Then they programmed the system to anticipate speed changes based on the driver’s commands. So by the time the signals reached the drivetrain, the nozzles had already adjusted. “We’re now getting to the point where every square meter of the field gets the target rate,” Mitchell says.
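The feed-forward idea can be shown with a toy delay model: the drivetrain realizes a joystick command after a known lag, while the RTK-derived speed estimate arrives later still. Ticks, lags, and speeds here are all invented for illustration:

```python
DRIVE_LAG = 2  # ticks for the drivetrain to realize a command
MEAS_LAG = 3   # additional ticks of delay in the RTK speed estimate

def delayed(seq, lag):
    """Return seq shifted right by `lag` ticks, holding the first value."""
    return [seq[max(0, t - lag)] for t in range(len(seq))]

commands = [2, 2, 2, 4, 4, 4, 4, 4]          # joystick speed, m/s
actual = delayed(commands, DRIVE_LAG)        # what the wheels do
measured = delayed(actual, MEAS_LAG)         # what RTK reports

# Reactive control sets the spray rate from the stale measurement, so
# it is wrong during every transition. Feed-forward delays the command
# by the same known drivetrain lag, so the rate changes exactly when
# the wheels do.
reactive_rate = measured
feedforward_rate = delayed(commands, DRIVE_LAG)
```

In this model the feed-forward rate tracks the actual speed tick for tick, while the reactive rate is still commanding the old speed after the machine has accelerated, which is the lag Mitchell describes.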

Many of the precision innovations that the Mitchells have adopted or built themselves are no longer unique. “There was a time it seemed we were dragging everyone along,” Mitchell’s father, Wade, says. “Now we can’t keep up.”

Manufacturers of automated agricultural equipment now offer all sorts of bells and whistles, including various rate controllers and yield monitors. More and more they are taking advantage of wireless technology. John Deere, for example, is developing a suite of products under a strategy it calls FarmSight that let large-scale operators remotely monitor fleets of machines over a cellular network. “The human-technology interface hasn’t yet been mastered in agriculture,” says Tony Thelen, operations director of the company’s Intelligent Solutions Group. “Our efforts are to make data collection and decision support ‘walk-up’ easy.”

Yet worldwide, farmers like the Mitchells are still a minority. And persuading others to embrace precision management, particularly in the developing world, will take more than smarter tools, says Jerry Hatfield, laboratory director of the National Laboratory for Agriculture and the Environment. “I don’t think we’re technologically limited,” he says. “I think we’re human limited—not from an intellectual incapacity but from the will to adopt these things.”

“He is saying precision agriculture is just a theory,” Ezrin Mohd Husin translates for a Malaysian rice farmer who goes by the name Jailani. They stand on the bank of an irrigation canal overlooking Jailani’s paddy fields, where workers are busy “transplanting” little carpets of nursery-grown seedlings into the soaked earth. It is a steamy day in February, and Jailani wears knee-high rubber boots and a brimmed cotton hat. Ezrin, whose collared shirt is embroidered with the words Technology for Precision Agriculture, adds, “He is saying he is a farmer and he knows better than theory.”

Ezrin works for the Smart Farming Technology Research Centre at Universiti Putra Malaysia, a short drive south of the country’s capital of Kuala Lumpur. Its modest staff includes some of the few engineers in the world who have had success bringing precision practices to community farmers. In Malaysia, as in many developing nations, small family farms produce the bulk of the country’s staple crop. But for decades, Malaysian paddy farmers have grown only 70 percent of the rice Malaysians eat. “If grain prices rise globally, or if for any reason Thailand refused to export their rice, we are in trouble,” Ezrin says.

His fears are not merely hypothetical. In January 2008, a sudden spike in food and fuel prices triggered a worldwide cascade of rice hoarding. While exporters such as Egypt, India, and Thailand stockpiled their surpluses, Malaysia and other importers found themselves lacking. Not long after the “great rice crisis” ended that May, the Malaysian government set an ambitious goal to become wholly self-sufficient in rice by 2015.

In five years, though, the country has made virtually no progress toward reducing its dependency on foreign grain. Ezrin and his colleagues believe the reason is simply that the government hasn’t applied the right solution. “If you implement precision farming, it is possible to have full food security,” says Mohd Amin Mohd Soom, the center’s director.

Amin’s approach to precision agriculture is what’s often called variable rate. It differs from the kind of tactics the Mitchells use in that it does not assume every part of a field should be treated the same. Instead, farmers who practice this method take into account variations in soil composition and plant growth to determine exactly how much of something—water or fertilizer, say—each tiny piece needs.

Amin has so far focused on fertility because he believes it can make the biggest impact. Malaysian farmers typically apply chemical nutrients uniformly based on rates the government recommends, he says. His team has been trying to convince them they can get higher yields and save costs if they can figure out which areas should get more fertilizer and which ones should get less.

This isn’t an easy calculation. Farmers first need to be able to measure soil fertility quickly and cheaply, which means they can’t rely on laboratory tests. A simpler solution is to deploy electrical-conductivity sensors. For more than a century, geologists have used these sensors to chart Earth’s interior and to sniff out oil and gold. Around the mid-1990s, growers in the United States and Europe began borrowing the technique to survey their soils.

The most popular mapping system came from a Kansas company called Veris Technologies. It consists of a wheeled frame, which supports an array of platter-size steel disks. As a tractor pulls the cart through a field, the disks penetrate the soil. One pair of disks creates a voltage while a second pair measures how much the current drops as it flows through the ground. By using pairs of disks spaced at different intervals, farmers can gauge conductivity at various depths. And by equipping the system with satellite navigation, they can construct a detailed picture of soil variability—including differences in texture, water-holding capacity, and, to some extent, nutrient contents.
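For the four-electrode geometry described above, the textbook (Wenner-array) conversion from injected current and measured voltage to apparent conductivity is a one-liner. Commercial rigs such as the Veris apply their own calibrations, so treat this as a sketch with made-up readings:

```python
import math

def apparent_conductivity(spacing_m, voltage_v, current_a):
    """Apparent soil conductivity (S/m) for an equally spaced
    four-electrode array: resistivity rho = 2*pi*a*V/I (the standard
    Wenner geometry factor), conductivity = 1/rho."""
    rho = 2 * math.pi * spacing_m * voltage_v / current_a
    return 1.0 / rho

# Wider disk spacing drives current deeper, so the reading samples
# deeper soil. Readings below are hypothetical.
shallow = apparent_conductivity(0.3, 1.2, 0.05)  # ~0.022 S/m
deep = apparent_conductivity(0.9, 0.9, 0.05)     # ~0.010 S/m
```

Pairing readings at two spacings, as the Veris cart does, is what lets a single pass sketch conductivity at two depths.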

In 2004, Amin bought Malaysia’s first Veris machine with money from a government grant. The pilot study included only eight farmers, whose paddies surrounded the coastal community known as Tanjung Karang. Amin’s team mapped the electrical conductivity of the farmers’ fields and took several hundred soil samples over many seasons. Each season, the researchers used the results to model the relationship between conductivity and the amount of essential nutrients in the soil. Then they used the model to create color-coded maps showing farmers how best to distribute fertilizer. In some seasons, the model represented a dependence that was statistically significant. In other seasons it didn’t, but they used it anyway. “It’s not 100 percent accurate,” admits Aimrun Wayayok, a soil expert at the research center.
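At its simplest, the seasonal conductivity-to-nutrient model the researchers describe is an ordinary least-squares fit. A sketch with synthetic numbers, not field data:

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit y = a + b*x, closed form."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b

# Invented season: conductivity readings (mS/m) paired with lab-measured
# soil potassium (mg/kg) at the sampled spots.
ec = [12.0, 18.0, 25.0, 31.0, 40.0]
k = [55.0, 70.0, 88.0, 101.0, 126.0]

a, b = fit_line(ec, k)
predicted_k = a + b * 22.0  # nutrient estimate for an unsampled spot
```

Whether such a fit is worth acting on depends on its significance in a given season, which is exactly the caveat Aimrun raises: some seasons the relationship holds, some it doesn’t.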

This unreliability is why some farmers, including the Mitchells, are skeptical of variable-rate methods. “Sprinkling a little bit less phosphorus on one part of the field and a little more on another doesn’t have a predictable benefit,” Mitchell argues. “It’s a very small Band-Aid.”


Aimrun disagrees. Electrical conductivity may not be the perfect predictor of soil fertility, he concedes, but it’s a start. And the maps have encouraged farmers to be more conscientious of how much fertilizer they use. In some cases, Aimrun says, they have cut their use by as much as 70 percent. “We are showing them that reducing fertilizer doesn’t necessarily reduce yield and sometimes increases it,” he says.

“Agriculture systems are very complex,” says John Schueller, an expert in precision farming at the University of Florida. “It is impossible to optimize completely. Even in Iowa, where farmers have access to the best technologies, they’re really not hitting absolute perfect conditions. So the question is: How close can you get? Usually doing something is better than doing nothing.”

Today, about 30 farmers in Tanjung Karang are enrolled in the center’s precision program. And in the nine years since the project began, the area has upped production from 4 metric tons per hectare (the national average) to more than 6 metric tons per hectare.

It is arguable that a big driver of this success has been broadband access. In 2008, the center oversaw the construction of an Internet café in Tanjung Karang. Here, Amin and his staff teach farmers in the community how to use computers and browse the Web. They’ve also created a custom website where the farmers can access fertility maps for their fields and track their activities throughout the growing season.

On this February day, the café is quiet and airy. Its picture windows frame undulating landscapes of tall green rice grass. A few teenagers sit at terminals Facebooking and playing video games. On a wall, someone has tacked a poster-size copy of a check for 50 000 Malaysian ringgits (about US $16 000). Printed on the bottom in Malay is the declaration “Best Paddy Field Management, Selangor AgroFest 2011.”

Eventually, a farmer named Aziz wanders into the café. He points to the check. “He says he is proud,” Ezrin translates. “He says previously, he protested precision farming. He could get 7 [metric] tons per hectare, which is very good. But then he tried precision farming and now he gets 9 tons—2 tons extra! And with half the fertilizer bags!”



Hands-free tech not safe for driving, study shows

Dayton Daily News

Posted: 12:05 a.m. Thursday, June 13, 2013

By Lou Grieco

Staff Writer


Hands-free technologies for making phone calls or sending texts are not any safer for drivers, whose abilities decrease as distractions increase, according to a study sponsored by the AAA Foundation for Traffic Safety.

“People go into what we like to call ‘tunnel vision’ as they become more distracted,” said Cindy Antrican, public affairs manager for Dayton’s AAA office. “Their brains stopped thinking about what they were doing.”

The study, released Wednesday and conducted by cognitive distraction experts at the University of Utah, measured brainwaves, eye movement and other metrics to gauge the impact on drivers as they attempted different tasks. The study found that listening to the radio was a minimal risk. Talking on a cellular phone, whether hand-held or hands-free, was rated a moderate risk.


But listening to and responding to voice-activated email or texting features was rated an extensive risk. Citing a predicted fivefold increase in infotainment systems in new vehicles by 2018, AAA has called on auto manufacturers to add safety features, such as disabling voice-to-text technologies while the vehicle is in motion.

Already, people are dictating texts, emails or social media posts while driving, using voice-activated programs installed in newer cars. IHS Automotive reported that, in 2012, 80 percent of the new cars sold in North America and Europe included some type of voice-activated controls. But the study found that, the more distracted drivers became, the less they checked mirrors, looked around or processed objects they encountered.

“They couldn’t see things right in front of them,” Antrican said. “Things like pedestrians. Things like stop signs.”

Texting has already been a concern. As of January, 39 states and the District of Columbia have enacted texting bans. Ohio’s ban went into effect in August. But only 10 states, plus Washington, D.C., have banned hand-held cellular phone use. AAA states that those bans should go even further.

“We have long thought, and this is a commonly held perception, that hands free is safer,” Antrican said. “I think we just proved that that perception is not accurate.”

Ohio State Highway Patrol spokeswoman Lt. Anne Ralston said that operating a vehicle is a complex behavior, requiring cognitive and physical abilities to be in sync with each other. Any distraction, “whether it’s daydreaming, texting, using your hands-free version or talking to someone in the car” can affect those abilities, particularly when there are several of them, Ralston said.

“Limit those distractions, put the phone down, keep your mind focused on operating that vehicle safely,” Ralston said. “Make that phone call or send that text message when you get to your destination safely.”


As Nellis grounds aircraft, training goes virtual

by Staff Sgt. Gregory Brook

99th Air Base Wing Public Affairs


6/10/2013 – NELLIS AIR FORCE BASE, Nev. — The skies over southern Nevada are quieter than they have been in quite some time due to the June 1 Air Combat Command-directed stand-down of flying operations.

Despite the stand-down, the 64th Aggressor Squadron remains committed to accomplishing its mission, said Lt. Col. Michael Shepherd, 64th AGRS academic assistant director of operations.

“Our motto is ‘know, teach and replicate,'” Shepherd said. “As Aggressors, we are subject matter experts in a field of adversary tactics or systems anywhere from airplanes to missiles to actual tactics to electronic attacks.”

The 64th AGRS is assigned to the 57th Adversary Tactics Group at Nellis Air Force Base, Nev. Their primary mission is to provide support to the U.S. Air Force Weapons School, Red Flag exercises, the various test and evaluation squadrons and to provide training to units in the Combat Air Force on adversary tactics.

The 64th AGRS plans to focus on learning as much as possible about adversary tactics and providing training to the rest of the CAF due to the reduction in flight hours.

“We will continue to seek out the most current intelligence and update our briefs so we can disseminate that information to the CAF as much as we can,” Shepherd said. “By no stretch of the imagination are we just shutting down our squadron.”

There are plans in place for the 64th AGRS to work together as a team with other Nellis units to gain valuable insight and experience, Shepherd said. The U.S. Air Force Weapons School’s 16th Weapons Squadron, which teaches the F-16 Fighting Falcon weapons course, will work together with the 64th AGRS.

There is an academic agenda in place for the summer, Shepherd said. The 16th WPS will refresh the 64th AGRS on the tactics currently employed by the CAF and the 64th AGRS will reciprocate by teaching the 16th WPS the Aggressors’ academics. Every week a member of the 64th AGRS will certify as a subject matter expert in one of their assigned adversary equipment and tactics categories.

“We have a robust schedule throughout the summer. We will still be busy; it will just be a different kind of busy than we have become used to,” Shepherd said.


The pilots of the 64th AGRS will also try to stay current in their flight ratings using flight simulators and limited flying in support of the 422nd Operational Test and Evaluation Squadron, according to Shepherd. Many pilots will lose most of their currencies and will have to regain them in October. Simulators will be used to try and mitigate the lack of practice.

The 64th AGRS pilots will concentrate on practicing emergency procedures in the simulators.

“They cannot afford to lose proficiency in carrying out those procedures because they need to be able to fly and land safely,” Shepherd said.

There are few flight currencies that can be updated in simulators.

“While we can’t officially update them, we can still use the simulators to stay proficient,” Shepherd said.

“In order to keep ourselves sharp and keep our mindset, we have adopted a kind of back-to-basics mentality,” said Capt. Paul Anderson, 64th Aggressor Squadron B-flight commander. “We just went over and did the first run in the simulators today to establish how we want to use them. It was really good; we got to see some of the challenges others are facing and learn how to better challenge our customers and improve their learning and training.”

64th AGRS pilots are flying against recent adversary tactics in simulators to gain a greater perspective of what CAF units experience in simulators as well.

“It validates our credibility as Aggressors,” Shepherd said. “We have to stay as current in the tactics and knowledge of things as we possibly can and of executing our mission in new ways.”

“The ideal way to train is to combine academics with flying, but since we are not flying we are offering as much academics as we can,” Anderson said. “You are able to talk about a threat and then go and see how it is actually employed. It drives home the point.”

In addition to staying professionally proficient in their own fields and teaching others through traditional in-person briefings, the 64th AGRS will use innovation and technology to accomplish their mission.

“The technology gives us a greater ability to get the information out there,” Shepherd said. “We have something called Virtual Flag. It’s like Red Flag, but in simulators. Everyone taps in, and we can be Aggressors in our simulators here. We can fly against the guys in [Royal Air Force] Lakenheath, England. I don’t think simulator training will ever be able to fully encompass what you get in real life, but it is good training.”

The goal is to continue to give the CAF the training needed to go out and fly, fight and win without sacrificing safety or airmanship, according to 64th AGRS leadership.

“One of my best directors of operations, when we were getting ready to deploy to Iraq, his mantra was embrace the pain,” Shepherd said. “I think that’s very relevant. It’s hard when you deploy, and it’s hard when you suddenly have all of your flying hours taken from you. Embrace it and make a positive out of it whatever way you can.”

“Know, teach and replicate,” Anderson said. “We are still trying to carry out our mission.”





Wave of federal retirees to hit government

By Jennifer Liberto @CNNMoney June 13, 2013: 6:29 AM ET



The U.S. government could soon be facing a shortage of workers.

Some 30% of the federal workforce will be eligible to retire in the next three years, according to a Government Accountability Office report. That means the government could be hit by a wave of retirements at a time when it is already dealing with inadequate staffing, because large numbers of employees are on furlough and hiring freezes are in place to save money and dramatically shrink the federal budget deficit.

It could lead to disruptions in some key areas. Some 46% of air traffic controllers can retire in the next three years, creating a challenge for the government to replace them with similarly experienced workers. It’s a worry, especially at a time when the public has fresh memories of the long delays at the nation’s airports after the sequester kept controllers at home on furlough.

Also, more than a third of all federal veterinarians are eligible to retire. An impending national veterinarian shortage is already fueling fears that the U.S. Department of Agriculture will not be able to tamp down the spread of a severe disease outbreak in animals, according to several federal reports.

Congressional watchdogs have been warning for years about this “tsunami” of baby boomers hitting retirement age at federal agencies. So far, it hasn’t happened because more workers hung on to their jobs far longer than expected, in part due to the recession.

That may be changing this year. In the first four months of 2013, some 60,000 employees filed for federal retirement benefits, a 43% increase from last year, according to the Office of Personnel Management. A major part of the spike comes from postal workers taking early retirement buyouts, OPM says.

But union groups and employee advocates say the uptick in retirements is being fueled by the $85 billion in spending cuts, along with the accompanying furloughs, pay freezes, slashed overtime and increased workload from the lack of new hires.

“We’re seeing this massive brain drain as thousands of workers leave the federal workforce, and it’s leading to a huge loss of knowledge and expertise,” said William Dougan, president of the National Federation of Federal Employees. “A lot of these jobs are not being filled once they’re vacated.”

It’s already leading to questionable outcomes. In the past two years, the beleaguered Internal Revenue Service has lost 10,000 employees to retirement and attrition. Some have suggested that the lack of senior level employees may have played a role in the current scandal plaguing that agency.

More than a third of IRS employees will soon be eligible to retire, which could also lead to longer waits for things like taxpayer refunds and audits.

Federal employees can retire at the age of 62 if they have worked five years, or even younger with 20 years of service. The average federal worker is 47 years old with about 14 years on the job, compared to the median age of 42 for all American workers, according to federal data.
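The two eligibility cases the article cites can be sketched as a simple check. This is only an illustration of the figures as stated; actual federal retirement rules (FERS/CSRS) have more categories than are modeled here.

```python
def retirement_eligible(age: int, years_of_service: int) -> bool:
    """Simplified eligibility check based on the two cases cited:
    age 62 with at least five years of service, or a younger age
    with at least 20 years of service. Real federal rules have
    additional categories not modeled here."""
    if age >= 62 and years_of_service >= 5:
        return True
    if years_of_service >= 20:
        return True
    return False

# The article's average federal worker: 47 years old, about 14 years on the job.
print(retirement_eligible(47, 14))  # prints False: not yet eligible
```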

Over at the USDA, the labor crunch poses a risk to the nation’s food supply as early as this year, according to Michael Gilsdorf, executive vice president for the National Association of Federal Veterinarians.

At least three USDA agencies are struggling to fill positions left by outgoing vets since federal budget cuts took hold, Gilsdorf said. That has shrunk the number of “deployable” veterinarians, those who can respond to an animal disease outbreak, from about 1,000 last year to about 250 now.

A federal advisory panel has warned there aren’t enough federal veterinarians to contain a massive multi-state outbreak of foot and mouth disease, a highly contagious animal disease which strikes cows, goats and pigs.

“Our capacity to respond to a disease outbreak has hit a critical stage,” said Gilsdorf, who runs the panel tasked with helping craft the plan to recruit thousands of private and public sector veterinarians to fill in gaps if disease hits the nation’s food supply.

The USDA responded by saying it has recruited 955 veterinarians and 3,100 animal health technicians to join a volunteer group of emergency responders called the “National Animal Health Emergency Response Corps.” The agency also has a special program that trains vets, including those at nonprofits and in private practice, on how to respond to a national emergency.

A USDA spokesperson said the agency prioritizes “working with staff and accredited veterinarians to ensure that we can respond quickly and in a manner that is highly protective of our animal agriculture resources.”

For some, however, a shrinking federal workforce is not something to worry about. In fact, it is a good thing — it’s a key part of Republican budget chief Rep. Paul Ryan’s plan to shrink deficits. This year, Ryan proposed cutting the 2.1 million-person federal workforce by 10% through a two-year hiring freeze.

An OPM official said it’s prepared for the wave of retirements by “constantly evaluating workforce preparedness programs.”

While sequester has been an “added challenge,” the official stressed “the decision to retire is a very personal one, as each person looks at their individual situation and decides what is best for them.”


Obamacare: Is a $2,000 deductible ‘affordable?’

By Tami Luhby @Luhby June 13, 2013: 6:23 AM ET


Until now, much of the debate swirling around Obamacare has focused on the cost of premiums in the state-based health insurance exchanges.

But what will enrollees actually get for that monthly charge?





States are starting to roll out details about the exchanges, providing a look at just how affordable coverage under the Affordable Care Act will be. Some potential participants may be surprised at the figures: $2,000 deductibles, $45 primary care visit co-pays, and $250 emergency room tabs.

Those are just some of the charges enrollees will incur in a silver-level plan in California, which recently unveiled an overview of the benefits and charges associated with its exchange. That’s on top of the $321 average monthly premium.

For some, this will be great news since it will allow them to see the doctor without breaking the bank. But others may not want to shell out a few thousand bucks in addition to a monthly premium.

“The hardest question is will it be a good deal and will consumers be able to afford it,” said Marian Mulkey, director of the health reform initiative at the California Healthcare Foundation. “The jury is still out. It depends on their circumstances.”

A quick refresher on Obamacare: People who don’t have affordable health insurance through their employers will be able to sign up for coverage through state-based exchanges. Enrollment is set to begin in October, with coverage taking effect in January. You must have some form of coverage next year, or you will face annual penalties of $95 or 1% of family income (whichever is greater) initially and more in subsequent years.
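The initial penalty described above, the greater of $95 or 1% of family income, is easy to sketch. Note that this models only the first-year figures quoted in the article; later years are higher, and the actual law has filing-threshold details not captured here.

```python
def first_year_penalty(family_income: float) -> float:
    """Greater of a $95 flat amount or 1% of family income,
    per the initial-year figures cited in the article. Later
    years and filing-threshold rules are not modeled."""
    return max(95.0, 0.01 * family_income)

print(first_year_penalty(8000))   # 1% would be $80, so the $95 floor applies
print(first_year_penalty(50000))  # 1% of $50,000 = $500
```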


Each state will offer four levels of coverage: platinum, gold, silver and bronze. Platinum plans come with the highest premiums, but lowest out-of-pocket expenses, while bronze plans carry lower monthly charges but require more cost-sharing. Gold and silver fall in the middle.


The federal government will offer premium subsidies to those with incomes of up to four times the federal poverty level. This year, that’s $45,960 for an individual or $94,200 for a family of four. There will be additional help to cover out-of-pocket expenses for those earning less than 250% of the poverty line: $28,725 for a single person and $58,875 for a family of four. The subsidies are tied to the cost of the state’s silver level plans.


Related: I’m signing up for Obamacare


California offers insight into how much participants will actually have to pay under Obamacare. The state, unlike most others, is requiring insurers to offer a standard set of benefits and charges in each plan level. The only variables are monthly premiums, doctor networks and carriers in your area.


For those in need of frequent medical care, the platinum or gold plans would reduce out-of-pocket costs for treatment. These plans have no deductible, and doctors’ visits and medication are cheaper. But the trade-off is that they have higher monthly premiums. California has not yet released the premium range for these tiers.


On the flip side, a young man who never visits the doctor and wants to minimize his monthly charge could opt for a bronze plan. A 40-year-old enrolling in this plan could pay as little as $219 a month. But, if he did get sick, he’d get socked with a $5,000 deductible, $60 co-pays for primary care visits and a $300 emergency room charge.



Obamacare provides protection for those who need a lot of care by placing a cap on out-of-pocket expenses. The maximum a person in an individual platinum plan will spend a year is $4,000, while those in the other tiers will shell out no more than $6,400.

“Insurance is expensive. It’s hard for anyone who isn’t well off to afford it,” said Gary Claxton, director of the health care marketplace project at the Kaiser Family Foundation. “But it is good enough that you can afford to get sick without bankrupting yourself.”

Whether potential enrollees find these plans affordable will depend on how healthy they are and whether they are currently insured.

Many individual insurance offerings currently on the market come with much higher deductibles, cover fewer expenses and cap how much they’ll pay out in a year. Plans on the exchange, on the other hand, are required to cover a variety of “essential benefits,” including maternity care, mental health services and medication.

“In many cases, depending on the plan, the coverage will be more comprehensive than what the enrollee currently has,” said Anne Gonzalez, a spokeswoman with Covered California, which is running the state’s exchange.




How the NSA Could Get So Smart So Fast

Modern Computing Is Helping Companies and Governments Accurately Parse Vast Amounts of Data in a Matter of Minutes.

Updated June 12, 2013, 7:51 p.m. ET




Five years ago it would have been unimaginable for a government agency such as the National Security Agency to efficiently parse millions of phone, text and online conversations for keywords that could have warned of an impending terrorist attack. Today, a set of new technologies makes it relatively affordable and manageable for it to do so.


These technologies can store vastly different types of data in a single database and process it rapidly on inexpensive hardware, without an analyst having to formulate a hypothesis in advance. “They’ve substantially reduced the cost and greatly increased the [government's] ability to analyze this type of data,” says Tom Davenport, an expert on analytics and a visiting professor at Harvard Business School. The technology needed to outfit data centers to perform these tasks has become “orders of magnitude” less expensive than in the past, he said.

It is unclear exactly what type of computing the NSA is using in its data-center facilities around the U.S., or in a $1.2 billion facility in Utah that will open this fall.

But broadly speaking, the technology can be broken down into three categories:


Database systems

Traditional relational databases, usually queried with a language known as SQL (pronounced “sequel”), store data in tables, columns and rows but are limited when it comes to storing free-form text such as the contents of an email or text message. They also can’t handle pictures or video.

New types of databases that began emerging in late 2009, known collectively as NoSQL (for “not only SQL”), such as MongoDB, Cassandra and SimpleDB, don’t have these limitations, and allow analysts to run queries against all of these types of data.
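The practical difference is that a document store lets records with different shapes sit in one collection and still be queried together. Here is a toy, in-memory illustration in plain Python; the data and the `find` helper are invented for this sketch and are not any vendor's actual API.

```python
# Toy document store: each record is a free-form dict, so an email,
# a text message, and an image's metadata can share one collection
# without a fixed table schema.
collection = [
    {"type": "email", "body": "meeting at noon", "attachments": 2},
    {"type": "sms", "body": "running late"},
    {"type": "image", "caption": "whiteboard photo", "width": 1024},
]

def find(coll, **criteria):
    """Return documents whose fields match all given criteria;
    documents missing a field simply don't match it."""
    return [doc for doc in coll
            if all(doc.get(k) == v for k, v in criteria.items())]

print(find(collection, type="sms"))          # matches only the text message
print(len(find(collection, attachments=2)))  # 1: only the email has that field
```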

NoSQL databases can make a huge difference to companies analyzing very large data sets, even if they’re fairly conventional. For example, analysts at risk consultancy Verisk Analytics Inc. are “constantly running different models and analytics” against billions of customer records in order to help identify fraudulent insurance claims.

Perry Rotella, vice president and chief information officer at Verisk, says using a traditional DB2 database from International Business Machines Corp., “would be a six-hour job” that had to run overnight. Analysts would pore over the results and generate new queries that would again have to run overnight. He said it took weeks every time analysts needed to create a new statistical model. The company recently changed to a NoSQL database that allows analysts to run the same types of queries in 30 seconds.


“So all of a sudden your model-building becomes iterative in real-time instead of over days. [Using NoSQL], you can run analytics on your data multiple times a day, and it compresses your ability to get results from weeks into days. It’s extremely powerful,” he said.

For online businesses like photography marketplace Shutterstock Inc., which store a great variety of file types, it is difficult to imagine life without this technology. Shutterstock has a library of more than 24 million images and adds an additional 10,000 each day, each of which has associated data to help narrow search results. Its databases also record everything that users do on the site—not just decisive actions such as what images they license, but also minute details such as where they place their cursor and how long they hover there.


Machine learning

Traditional analysis requires analysts to understand the data well enough to form a hypothesis and then create complex queries to run against the database. More recent techniques, known as machine learning and natural language processing, let the programs themselves find patterns and even infer the meaning of ambiguous words from context. “You can turn a machine-learning program loose on a lot of data and you can see what they are able to be predictive of,” said Mr. Davenport. With natural language processing, “you could figure out whether a term like ‘bomb’ is being used to describe a Broadway play versus something a terrorist would use,” he said.
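The “bomb” example, a Broadway flop versus a threat, comes down to scoring a word's surrounding context. A crude bag-of-words sketch makes the idea concrete; the sense vocabularies here are made up for illustration and bear no resemblance to a production NLP pipeline.

```python
# Crude word-sense scoring: count context words associated with each sense.
# The vocabularies are invented for illustration only.
SENSES = {
    "theater": {"broadway", "play", "review", "audience", "critics"},
    "threat": {"attack", "device", "detonate", "target", "plot"},
}

def disambiguate(sentence: str) -> str:
    """Pick the sense whose vocabulary overlaps the sentence most."""
    words = set(sentence.lower().split())
    scores = {sense: len(words & vocab) for sense, vocab in SENSES.items()}
    return max(scores, key=scores.get)

print(disambiguate("critics said the play was a bomb on broadway"))  # theater
print(disambiguate("the plot was to detonate a bomb"))               # threat
```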

Machine learning, also known as cognitive analytics, allows queries to continually “tune themselves,” Gartner Inc. analyst Douglas Laney explains. For example, retailers use this technology to automatically update pricing algorithms in real time as new information, such as weather, time of day and even details gleaned from video of customers browsing in their stores, becomes available. “It used to take more than a day to update pricing, but these retailers can reprice every hour and use trending information to do real-time product pricing,” says Mr. Laney. “I’m not sure they could do that even a year ago,” he said.
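The self-tuning idea can be pictured as a pricing rule that folds in each new signal as it arrives, rather than waiting for an overnight batch run. A minimal sketch follows; the signal names and weights are invented, and a real retail system would learn its multipliers from data rather than hard-code them.

```python
def reprice(base_price: float, signals: dict) -> float:
    """Adjust a price multiplicatively as fresh signals arrive.
    Signal names and weights are invented for illustration; a real
    system would fit these from historical sales data."""
    multipliers = {
        "rainy": 0.97,             # foot traffic drops, discount slightly
        "peak_hours": 1.05,        # demand spike
        "shelf_dwell_high": 1.02,  # shoppers lingering (e.g., seen on video)
    }
    price = base_price
    for name, active in signals.items():
        if active and name in multipliers:
            price *= multipliers[name]
    return round(price, 2)

print(reprice(10.00, {"rainy": True, "peak_hours": True}))
```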



Until recently, complex computer programs needed to run on expensive hardware, such as enormous mainframe computers. Today, an open-source software framework called Hadoop — developed at Yahoo Inc., building on techniques published by Google Inc., and named after a child’s toy elephant — allows a query to be split up, with the analytic work distributed among scads of inexpensive servers, each of which solves a part of the puzzle before the results are reassembled when the work is completed. “It’s really cheap and really fast,” said Mr. Davenport.
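The split-work-reassemble pattern described above is the classic MapReduce flow. A single-process Python sketch shows the shape of it; real Hadoop runs the map step on many machines in parallel and shuffles the partial results across the network before the reduce step.

```python
from collections import Counter
from functools import reduce

def map_chunk(chunk: str) -> Counter:
    """'Map' step: each worker counts words in its own slice of the data."""
    return Counter(chunk.split())

def reduce_counts(a: Counter, b: Counter) -> Counter:
    """'Reduce' step: merge partial results back into one answer."""
    return a + b

# Each string stands in for a chunk of data handed to a separate server.
documents = ["big data big servers", "cheap servers", "big cheap"]
partials = [map_chunk(d) for d in documents]  # would run in parallel
totals = reduce(reduce_counts, partials, Counter())
print(totals["big"])  # 3
```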


The ability to distribute complex queries to a large number of inexpensive computers helps people get very quick responses to complicated questions with a large number of variables. For example, one online automotive marketplace can help auto dealers predict how long a given car will remain on their lots by comparing its make, model and trim against how long similar cars at that price point have averaged on lots in the dealer’s region. The predictions help minimize the number of days a car remains unsold, “one of the most important sales metrics for dealers,” said Philip Potloff, the company’s chief information officer.

Video-streaming company Netflix Inc. uses Hadoop to graph traffic for every type of device people are using to access video across multiple markets, allowing the company to improve the reliability of video feeds on mobile devices, laptops and TVs, and plan for future growth of streaming movies and TV shows. It also helps Netflix to better analyze customer preferences so that it can make improved recommendations.


Pentagon Preparing Two Budget Plans for Fiscal 2014

By Kedar Pavgi

June 12, 2013


The Pentagon is planning for two scenarios in fiscal 2014 because the outcome of budget negotiations between the White House and Congress remains uncertain, a senior Defense Department official said on Wednesday.

Deputy Defense Secretary Ashton Carter said the department was “prepared for a wide range of budget contingencies,” adding that the department’s approach next year would depend on how Washington sorted out its fiscal situation.

One plan will be based on the Obama administration’s fiscal 2014 budget request and the other assumes Washington continues to be mired in “fiscal gridlock,” Carter said during an annual conference hosted by the Center for a New American Security.

Carter said the Pentagon’s Strategic Choices and Management Review, the budget evaluation ordered by Defense Secretary Chuck Hagel in March, had been completed and awaited review by Hagel’s office. The SCMR will be the “basis” for the fiscal 2015 budget request, and will also help with fiscal 2014 budget decisions, Carter said. A recent memo from his office directed Defense managers to plan for a possible 10 percent budget cut in fiscal 2014.

Still, Carter lamented the impact that sequestration was having on Defense’s budget and planning efforts.

“Despite our best efforts to minimize this damage, it is at a minimum, embarrassing, to be doing this in the eyes of friends and foes alike,” Carter said.

Defense must complete and submit a plan to make good on a promise to the leadership of the Senate Armed Services Committee. In May, Sens. Carl Levin, D-Mich., and Jim Inhofe, R-Okla., wrote to Hagel demanding a “package of reductions” by early July to help plan for possible sequestration cuts in fiscal 2014.

Carter also said he “aligned” himself with the recent shadow budget reviews conducted by CNAS and a host of other Washington D.C. think tanks. The organizations said that sharp reductions to the department’s civilian personnel would be necessary to help realign resources in areas such as the Asia Pacific and cybersecurity.


USAF awards Logos Technologies a $49.7 million contract to ‘tame Big Data’


Wed, 2013-06-12 08:39 AM





Logos Technologies LLC has been awarded a $49,750,000 Small Business Innovative Research (SBIR) Phase III indefinite delivery/indefinite quantity (IDIQ) contract from the U.S. Air Force Research Laboratory Information Directorate (AFRL/RI) in Rome, NY.


The award has a five-year period of performance, says a Logos news release issued on June 11.


The contract will facilitate multi-sensor tracking and the rapid extraction of intelligence from large data sets, allowing operators to predict target set behavior.


The Processing Exploitation and Analysis of Large Data Sets (PEALDS) program will leverage the work of Logos Technologies in Intelligent Persistent Surveillance (IPS) to better collect, catalogue, process, exploit and disseminate different streams of data from different sensors, such as signals intelligence (SIGINT), full motion video cameras and wide-area surveillance systems.


“Intelligent Persistent Surveillance moves beyond sensors and focuses on getting useful information to the warfighter in real time,” said Greg Poe, CEO of Logos Technologies. “IPS relies on sophisticated tools and methodologies, including Activity-Based Intelligence, which looks at how events interact with each other.”


PEALDS is a Small Business Innovation Research Phase III program. The contract will build on earlier efforts, such as Vast Imagery Streaming Technology (VIST). The VIST software is used in the Logos-developed Kestrel system. Kestrel is the only operational wide-area surveillance system for tethered airships, says the release.


“We are constantly looking at how to improve the intelligence product of sensors,” said Frank Purdy, director of the national security services division at Logos Technologies. “Working with multiple collection systems and data sets provides the knowledge to move us from reactive to predictive analytics.”


In January, Logos Technologies formally entered into a Cooperative Research and Development Agreement (CRADA) with AFRL/RI entitled “Exploitation and Analysis of Various Large Data Sets.” Like the PEALDS program, the CRADA will help advance U.S. Air Force data exploitation and tracking capabilities.



Turkish police shoot down unmanned aircraft above protest

13 June 2013

By Matthew Schroyer


Over the past two weeks, thousands of protesters have rallied at Gezi Park in Istanbul against the government’s plans to raze the public park and turn it into a commercial zone and military barracks. Police have responded with tear gas, water hoses, and batons. The sudden explosion of political dissatisfaction has shocked the usually peaceful country, along with much of the world.

One Turkish citizen happened to be equipped with a small RC helicopter with video-recording capability, which he flew above the heads of protesters and police to gather close-up aerial footage. The man’s video, posted on Vimeo, shows groups of police attempting to disperse crowds with gas and pressurized water.

Protesters can be seen scrambling to find cover at Taksim Square in Istanbul. A fire in the distance sends smoke into the air.

According to the unmanned helicopter’s pilot, police shot the aircraft out of the sky, and it was destroyed on impact with the ground. The pilot was unable to recover video of the crash itself, but did retrieve the footage leading up to that point.

Unmanned aircraft systems, or UAS, more commonly called “drones” in the media, are increasingly being deployed by activists. Over the past three years, they have appeared over protests in Russia, Poland, Estonia, and Argentina. In February 2012, a group of animal rights activists was recording a pigeon shoot in South Carolina with its own unmanned aircraft when hunters shot the craft down.


IEEE Spectrum

The Better Meat Substitute

Can new and improved meat analogues help us control our damaging desire for animal flesh?

by Glenn Zorpette

Posted 3 Jun 2013 | 20:18 GMT

In one typical minute on this earth, people kill at least 109 339 chickens, 2630 pigs, 922 sheep, 781 goats, and 557 cows, plus a very large number of ducks, horses, turkeys, and rabbits, as well as three camels. And one singularly unfortunate mule. Add it all up and every week well over a billion animals are slaughtered for food, according to the Food and Agriculture Organization of the United Nations.

You may or may not find the death of so many animals unsettling, but it’s hard for anyone to ignore the undisputed environmental and health consequences [PDF] of this massive meat consumption. They include higher rates of greenhouse-gas emissions, water pollution, deforestation, and biodiversity loss. And possibly colorectal cancer as well.

No rational person can doubt that the world would be a better place if a significant fraction of carnivores simply gave up meat. But nothing in the history of human forbearance suggests that’s ever going to happen.


So let’s consider a slightly less obvious countermeasure. That would be to replace some proportion of meat with something that looks, tastes, and feels like meat but whose production doesn’t require tens of billions of animals, many of them hormonally augmented, that are 1) occupying more than 26 percent of the planet’s land area; 2) eating valuable crops; 3) drinking extremely large quantities of water [PDF]; and 4) producing dizzying volumes [PDF] of the greenhouse gas methane. Although the retail costs of meat aren’t high, the environmental ones certainly are.

Vegetable substitutes textured to resemble cooked animal flesh have been around for decades. But they’ve never been very enticing—a fact borne out by the limp sales figures. In the Netherlands, sales of such replacements hit a peak of around 1 percent by volume in comparison with meat, according to a 2004 Dutch study. That bump in popularity occurred after one of a series of gruesome meat-safety crises hit Europe. In the United States, sales of meat substitutes in 2011 totaled US $277 million—which was around 0.2 percent of what was spent on meat, according to a report in The Wall Street Journal and data from the American Meat Institute.


Now, though, a new crop of greatly improved meat substitutes is becoming available in Europe and the United States. Some of the new products are from small start-ups that are taking on huge food conglomerates whose subsidiaries have dominated the market (albeit small) for meat substitutes.


The new products include Beyond Meat and Match in the United States and Plenti in the Netherlands. All are produced with technology or advice from university laboratories: Beyond Meat from the University of Missouri; Plenti from Wageningen University & Research Centre in the Netherlands; and Match Meat from the University of Illinois at Urbana-Champaign. That technology is shaking up this long-placid backwater of the food industry is underscored by the fact that another highly touted new company now quietly readying its first faux-meat offering is a Silicon Valley start-up, complete with a Stanford professor, Patrick O. Brown, at the helm.


The new generation of meat substitutes has a texture and mouthfeel that match the real thing remarkably well; the older offerings aren’t quite as convincing. All of the new substitutes come with anecdotes about diners, and even professional food critics, believing they were eating real meat.


While researching this article I sampled Plenti, Match, and Beyond Meat. I found the three to be quite different from one another, and yet all were impressively meatlike [see “A Consumer Guide to Fake Meat”]. They’re a big improvement over more traditional meat stand-ins, such as those available in the United States and Canada under the brand names Yves, Tofurky, and Boca. It’s possible I would have been fooled had they been sprung on me unawares (especially if accompanied by a couple of glasses of wine).


Every now and then technology and social trends align to offer something sublime as well as important—like making possible a kinder and more enlightened paradigm for the food-processing industry. The food conglomerates have taken a beating lately as news articles have described their long-standing propensity to churn out nutritionally bankrupt foodstuffs loaded with the unholy trinity of salt, sugar, and fat. And yet, it’s just possible that the technology these companies wield could be put to uses that are not only profitable but also beneficial to people’s health.

Historically, healthy food has been a niche that the big food companies were content to leave to small subsidiaries or scrappy independents. But times are changing. Hundreds of millions of aging baby boomers are reevaluating their diets after being treated for cancer, heart disease, or other illnesses. And meat is a big business—$160 billion in the United States in 2010, according to the American Meat Institute. If emerging techniques allowed a company to make meat substitutes that were good enough to capture even a few percent of that market, it would still mean the kind of business that a conglomerate could love.


“Say you swapped 5 percent of meat for vegetable protein,” says Vaclav Smil, an environmental scientist at the University of Manitoba, in Canada. “Nobody would even notice. But it would make a difference on this planet.”

Eat a hamburger and you’ve just used about 2500 liters [PDF] of water, or enough for 16 baths in an average tub. And for each kilogram of beef consumed, as many as 35 kilograms of carbon-dioxide-equivalent greenhouse gases are released—about the same as what comes from a reasonably efficient car driving almost 275 kilometers.

Think of livestock as a means of converting vegetable protein into animal protein. To make livestock grow fast, producers feed them a diet rich in legumes and grain, usually soy, corn, and wheat—all of which contain protein that people could otherwise be eating. For cows, the conversion process is particularly inefficient. In Europe, where beef cows are fed a diet high in soy, it takes about 8 kilograms of soy protein to produce 1 kilogram of beef protein. So it is eight times as efficient to get your protein from soy as from beef. And that’s before you account for the enormous water and energy use, greenhouse-gas emissions, deforestation, and the rest that come from raising so many animals.
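The 8-to-1 conversion figure lends itself to a quick back-of-the-envelope calculation for any proposed substitution. The sketch below uses only the ratio quoted in the text and ignores real-world complications such as pasture feeding and byproduct use.

```python
# Rough arithmetic from the figure in the text: in Europe it takes
# about 8 kg of soy protein to yield 1 kg of beef protein.
SOY_PER_BEEF_PROTEIN = 8.0

def soy_saved(beef_protein_replaced_kg: float) -> float:
    """Soy protein freed up (kg) when beef protein is replaced directly
    by soy protein: the 8 kg grown as feed minus the 1 kg now eaten."""
    return beef_protein_replaced_kg * (SOY_PER_BEEF_PROTEIN - 1)

print(soy_saved(1.0))  # replacing 1 kg of beef protein frees 7 kg of soy
```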


Today, meat eating varies among developed nations, ranging from about 45 kg per person per year in Japan (not including seafood, which is particularly popular there) to about 125 kg per person per year in the United States. Much of that meat is produced in “factory farms,” in which tens of thousands of animals are raised. The confined feedlots you find there are sometimes awash in animal waste that often runs off into streams and rivers, although it is not supposed to.


Factory farms have driven meat prices down to historically low levels. In the United States, families spend an average of 9.8 percent of their disposable income on food, according to the U.S. Department of Agriculture. “People spend more on entertainment,” Smil says. “No country in human history has paid so little for food as a percentage of disposable income,” he adds.


Though it has plateaued in developed countries, per capita consumption of meat is on the rise globally and is expected to keep increasing for a couple of decades more. Worldwide, meat production was 110 million metric tons in 1975. By the end of 2012, the Food and Agriculture Organization estimates, it had risen to 302 million. Meat consumption will keep increasing for the foreseeable future, driven by growth in both population and disposable incomes in countries like China. Thus meat production could reach 465 million metric tons in 2050, the FAO estimates, when world population is predicted to exceed 9 billion [PDF] (it is a little over 7 billion today).
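The FAO figures above imply that growth in meat output is expected to slow. A quick compound-annual-growth-rate check, using only the totals and years quoted in the text:

```python
def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate between two totals."""
    return (end / start) ** (1 / years) - 1

# 110 million metric tons (1975) -> 302 (2012): roughly 2.8% per year.
past = cagr(110, 302, 2012 - 1975)
# 302 (2012) -> 465 projected (2050): roughly 1.1% per year.
future = cagr(302, 465, 2050 - 2012)
print(f"{past:.1%} historical vs {future:.1%} projected")
```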


Smil marshals a barrage of statistics to support his contention that the world could indeed produce 465 million metric tons of meat per year. But the strain on the world’s freshwater supply, environment, climate, and health-care systems would be noticeable, to put it mildly.


A satisfying meat alternative could do enormous good. “We need sophisticated technology to achieve this,” says M.A.J.S. “Tiny” van Boekel, dean of education at Wageningen UR. “If you really want to achieve sustainability, you have to work on technology. People don’t expect that. To some, sustainability is a cow in a meadow and chickens running around. But that’s not how we are going to feed 9 billion people.”


Photo: Beyond Meat. All-Plant Kebab: State-of-the-art extrusion technology gives Beyond Meat the look and feel of chicken.


How do you make vegetable protein taste like animal protein? By using an industrial-age technology called extrusion.


You may have had your first experience with extrusion before you were out of diapers, as a toddler playing with a toy called the Play-Doh Fun Factory. You pushed Play-Doh through a die to make Play-Doh strips that had the cross section of a star or a circle. The basic technique was first used more than 200 years ago to make lead pipe, and then wire, and still later, spaghetti. Extruders now produce breakfast cereal, puffed snacks, and pet foods.


The use of vegetables, nuts, and grains to make meatlike foodstuffs goes back to the 19th century, at least. In Asia, some meat substitutes grew out of Buddhist traditions, which eschew the consumption of meat, particularly in a temple. But the enterprise took a quantum leap in the late 1950s and early 1960s, when the earliest U.S. patents were issued for the use of extrusion to make “meat analogues” from vegetable proteins. Many companies were involved, including Archer Daniels Midland, the extrusion-machine maker Wenger, A.E. Staley, Cargill, Miles Laboratories, and Worthington Foods.


There are dozens of meat replacements on the market in Europe and a similar number in North America. Different products make use of different kinds of proteins or combinations of proteins, including wheat gluten and a protein derived from a fungus. But the majority are based on soy protein, sometimes in combination with other vegetable proteins, which has been extruded to give it a fibrous, meatlike texture.


Jeroen Willemsen, a cofounder of the Dutch company Ojah, in Ochten, which makes the meat alternative Plenti, points to three generations of meat substitutes. The first was a “tofu generation,” based on the bean curd. These products didn’t taste much like meat but had reasonably high levels of protein. The second generation features ready-to-use items that resemble sausages, meatballs, hamburger patties, and the like. The key ingredient is texturized vegetable protein. It is produced by extrusion of a low-moisture precursor and then dried into flakes or granules. These are rehydrated and incorporated into the final, ready-to-use product, which is sold frozen or chilled.


The third generation, which includes Plenti and Beyond Meat, is a special category known in the food industry as high-moisture meat analogues, or HMMAs. They are designed to have the taste and mouthfeel of genuine muscle meats—chunks of chicken breast, a piece of smoked eel, or shredded pork. The new analogues are meatlike enough to be sold in much the same way as meat—refrigerated and intended for use in soups, sandwiches, chilies, burritos, and other dishes where sauces and seasonings will help enhance the meaty illusion.


Being a relatively recent development, HMMAs occupy just a thin slice of today’s meat-substitute market. Researchers extruded the first HMMAs in the 1980s, following pioneering work at the Protial research and development laboratory in Angers, France. But a flurry of patents in just the past seven years confirms that tinkering over the last decade has greatly improved the appeal of these foods.


Here’s the challenge, according to Ethan Brown, the founder of Beyond Meat: “The difficulty is in matching perfectly the tension between biting cleanly through the product, with just the right amount of push-back, and the fiber structure.”


Fortunately, to meet that challenge you have a marvel of modern industry, the twin-screw thermoplastic food extruder [PDF]. It precisely applies heat, pressure, and mechanical shear forces to a foodstuff, typically a floury mix rich in protein, to transform it in some way. The mixture moves through the machine in a flow, being altered continuously as it does so.


For an HMMA, the processing begins with a mix of protein powders derived from legumes and possibly some grains or seeds, along with starch, binding agents, flavors, and colors. They all go into a single hopper. From there, they move into the preconditioner or feed zone, where the protein powders are mixed with finely sprayed water and oil. The oil brings the mixture up to a fat proportion resembling that of meat, at least 4 or 5 percent. The gooey mixture is moved along by a pair of corotating, intermeshing steel augers. These mix and knead the powder, water, oil, and other ingredients into a dough while pushing it from the preconditioner into the extruder’s main barrel, a tube about 2 or 3 meters long, where the dough begins its transformation into a fibrous, meatlike substance.
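The preconditioner step is, at bottom, a mass-balance problem. As a toy illustration (the function name and the numbers are assumptions for this sketch, not Wenger's actual process parameters), here is the arithmetic for how much oil must be sprayed into a dry mix to bring the blend up to a meatlike fat fraction:

```python
def oil_to_add(dry_mass_kg, dry_fat_frac, target_fat_frac):
    """Mass of oil (kg) to blend into a dry mix so the result reaches
    target_fat_frac fat by weight. Solves for oil in:
    (dry_mass * dry_fat_frac + oil) / (dry_mass + oil) = target_fat_frac
    """
    if not 0 <= dry_fat_frac < target_fat_frac < 1:
        raise ValueError("target fat fraction must exceed the dry mix's")
    return dry_mass_kg * (target_fat_frac - dry_fat_frac) / (1 - target_fat_frac)

# 100 kg of protein powder at 1 percent fat, brought up to 5 percent fat:
print(round(oil_to_add(100, 0.01, 0.05), 2))  # -> 4.21 (kg of oil)
```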

As the augers pull and push the dough through the extruder barrel, heat, pressure, and shear forces alter the dough’s proteins, says Brian Plattner, a process engineer at Wenger Manufacturing. The heat comes from two sources: Some is generated mechanically by the spinning augers, and some is applied externally. For example, heat and moisture, in the form of steam, are often injected directly into the dough through the jackets surrounding the barrel.

The transformation to ersatz meat begins with the breaking of a certain type of chemical bond in the proteins, called a disulfide bond. These bonds link sulfur atoms in the protein molecules. It is largely the breaking and reassembling of those disulfide bonds that texturizes the vegetable protein to resemble meat, according to Fu-hung Hsieh, a professor of biological engineering at the University of Missouri. Hsieh developed the extrusion process—specifying the ingredients, temperatures, and so forth—on which Beyond Meat is based.

After their disulfide bonds are broken, the protein molecules are then realigned and linked into longer chains, based on new disulfide bonds, in a phenomenon similar to polymerization. Chemists refer to this mass linking as “plasticization,” which is why the technical name for the manufacturing process is “thermoplastic extrusion.”

For a high-moisture meat analogue, the final, or die, stage is all-important and requires a special die [PDF]. At 1 to 2 meters, it is longer than a conventional die, explains Mian N. Riaz, director of the Food Protein Research & Development Center at Texas A&M University. Coming from the barrel, the dough moves into the die at 125 to 130 °C, according to Plattner. The part of the dough that’s in direct contact with the die surface moves more slowly than the part in the center, giving rise to shear forces, which encourage the realignment of protein molecules. Meanwhile, coolant circulating through the die rapidly reduces the temperature of the meat-analogue-to-be. This shearing and fast cooling in the die chamber promotes lamination, in which adjacent long protein chains cross-link with each other. That cross-linking forms layers that greatly enhance the food’s fibrous texture. The cooling also prevents it from expanding and becoming puffed and porous, like a big meaty mega-Cheeto.

At the very end, just as it exits the die, the HMMA is shaped and cut into pieces. The relative proportions of moisture, fat, and protein in the final product can be set almost arbitrarily, but manufacturers generally make them close to those of lean meat: 60 to 70 percent moisture, 2 to 5 percent fat, and 10 to 15 percent protein. As with meat, the final product must be refrigerated or frozen to prevent spoilage.
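The lean-meat-like target composition quoted above can be captured in a small quality-check sketch (the function and the batch values are illustrative, not any manufacturer's actual specification):

```python
# Target mass-fraction ranges for the final HMMA, per the article:
TARGETS = {"moisture": (0.60, 0.70), "fat": (0.02, 0.05), "protein": (0.10, 0.15)}

def out_of_spec(composition):
    """Return the components of a final-product composition (mass
    fractions) that fall outside the lean-meat-like ranges above."""
    return {name: frac for name, frac in composition.items()
            if name in TARGETS
            and not (TARGETS[name][0] <= frac <= TARGETS[name][1])}

batch = {"moisture": 0.65, "fat": 0.04, "protein": 0.12}
print(out_of_spec(batch))  # -> {} (every component is inside its range)
```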


Workers can flavor the HMMA while it is being made in the extruder or after it comes out of the die, or both. The Dutch product, Plenti, is sold unflavored and in bulk to retailers who flavor it and sell it to consumers.


The best modern extruder machines, with electronic sensors and actuators, offer precise control of the temperature, pressure, and shear forces, which all affect the characteristics of the HMMA. But the start-ups involved are, understandably, reluctant to give away their secret recipes. It seems they want to make as much progress as they can before they have to go head-to-head with the bigger players. Patrick Brown, the would-be mock-meat magnate, wrote in an e-mail: “Given the resources and political clout of the meat and dairy and Big Ag industries, I’m sure you can understand that we have more to lose than to gain by drawing further attention to our activities and plans.”


Still, details about some products have inevitably seeped out. Hsieh, whose breakthroughs led to Beyond Meat, was listed as coinventor on a recent U.S. patent that runs to 23 pages, including myriad details about temperature, pressure, pH, and lipid content. You can also find a video on YouTube that shows the “Missouri faux chicken” being made, in all its mock-meaty glory.


“Chicken” From an Extruder: To make a high-moisture meat analogue, start by mixing plant protein powders with water and oil. Then knead the mix in an extruder barrel, and finally shear and cut it in a die.


Given fake meat’s long history, why is it only now starting to rival the real thing? According to Gerry Hertzel, research coordinator for food products at Wenger, part of the answer can be found in improvements in the raw ingredients. The suitability, consistency, and variety of the soy and other vegetable protein powders from the Big Ag manufacturers, such as Archer Daniels Midland, Cargill, and CHS, have improved greatly, asserts Hertzel. “If you look at the soy isolates from 10 years ago and the ones today, there’s a night-and-day difference,” he says.


Hertzel’s colleague Plattner also points to recent improvements in the design of extruder dies that let operators very precisely control the rate of coolant flow, which in turn allows for consistent and uniform cooling. He believes that the pace of improvements will hold steady and bring even better products in coming years.


And extrusion isn’t the end of the line, technology-wise. At Wageningen UR, engineering professor Atze Jan van der Groot has invented a different kind of machine to process vegetable protein powders into a fibrous, meatlike product. Van der Groot began by studying the protein structure he wanted in the final product and then set about designing something that could reproduce it. The ingenious contraption that resulted uses a conical steel rotor that spins within a complementary funnel-shaped steel vessel. The rotor crushes, shears, and heats vegetable-protein powders. “We can create a wider range of textures,” van der Groot says, while demonstrating his novel food processor. “I think it will eventually be cheaper than extrusion. It’s quite simple and robust.” He is now working with engineers at Delft University of Technology, in the Netherlands, to scale up the system.

Today a lot of meat-analogue research in Europe is focused on foods other than soy. One of Germany’s renowned Fraunhofer Institutes is concentrating on lupine seeds and has produced sausages and “ice cream” that have received good reviews. In France, the food companies Sotexpro and Roquette have jointly introduced a pea-protein-based substance meant to be added to meat to extend it. Studies have indicated that meat can be supplemented by up to 30 percent with extenders before people begin to notice any difference. Already, in the United States, soy protein makes up about 30 percent by weight of the “meat” served in the National School Lunch Program. Vaclav Smil says that extenders are the surest way to start displacing meat on a large scale.


But the more important challenge now is to make a cheaper meat alternative, not a better one. Plenti and Beyond Meat generally cost about twice as much as real chicken. With greater economies of scale, the costs would undoubtedly go down.


Ethan Brown, the founder of Beyond Meat, wants his company to be huge, and not just for the obvious reasons. “If all we end up doing is serving the wealthy, we’d be upset,” he says. “Here’s how I’ll know when we’ve succeeded: when you can go into a Burger King or a McDonald’s and buy a plant-based nugget.”

This article originally appeared in print as “Muscling Out Meat.”


Adventures in Printing Food

3-D kitchen printers produce hits (a deep-fried scallop space shuttle) and misses (square milk)

IEEE Spectrum

By Jeffrey Lipton & Hod Lipson

Posted 31 May 2013 | 13:11 GMT

You want to bake a special cake for your mom, so you boot up the 3-D printer in your kitchen. Loaded with a dozen cartridges filled with pastes of chocolate, marzipan, and other ingredients, the machine downloads instructions from the Internet. You key in a specific texture, size, and flavor, and then you insert a 3-D message in the center—Happy Birthday, Mom!—to be revealed only after she takes a bite. The machine does the rest, assembling and baking a pastry so scrumptious it rivals a virtuoso chef’s in richness and complexity. Your mother loves it so much that she insists you send a slice of the same cake—in the form of a digital recipe—to your Uncle Norman’s printer as well. Your 3-D cake recipe gets so many raves, in fact, that you decide to sell the recipe online for 99 cents a download.


Science fiction? Hardly. The technology exists, and over the last eight years people have cooked up all sorts of comestibles with it, some a lot stranger than a cake with printing inside.

Let’s start with the printer. Versions of these machines, which follow an electronic blueprint to create 3-D objects out of layers of different material, have been around for nearly three decades. In the late 1980s, they were van-sized behemoths used in industrial settings for prototyping or for producing small batches of aerospace and medical parts. Today’s consumer models, by contrast, are about the size of a microwave oven and may sell for about US $1000. Between then and now, a whole community of do-it-yourselfers has emerged, eager to exploit the amazing capabilities of these versatile gadgets.

A high school student was among the first to use a Fab@Home 3-D printer to print edible objects, like these M-shaped chocolate pieces.

The Fab@Home project began in 2005 to help make the technology accessible to regular folks, and the following year came out with the first open-source DIY 3-D printer. Like most 3-D printers, the Fab@Home system uses a robotic arm with an extrusion head to deposit soft or liquid materials that later harden. While many home 3-D printers use spools of plastic as feedstock, our printer relies on syringelike cartridges that can hold a variety of different pastes and fluids. Researchers have used our machine to print artificial ears from living cells and to build working batteries and actuators from a variety of conductive and nonconductive materials. But what has really fired up the imaginations of DIYers is an application we never foresaw: printing food.

Noy Schaal, a high school student in Louisville, Ky., was one of the first people to use the Fab@Home at home rather than in a research lab. After getting the machine in 2006 she immediately modified it to work with her material of choice: chocolate. Getting the temperature just right took a while, but her printer ended up winning first prize in a local science fair, where Schaal printed chocolate letters, textured bars, and other shapes directly from a computer-aided-design (CAD) model and then handed them to the judges.

Other groups caught on to the printer’s culinary potential, and by the end of that year we at Fab@Home had begun experimenting, too. For starters, we printed hummus and peanut butter in every shape we could render on a CAD system. When the Fab@Home project won the 2007 Popular Mechanics Breakthrough award, the team printed out hors d’oeuvres, made of Brie and apricot comfiture, at the award reception. But our prized creation was—and still is—a space shuttle made from Cheez Whiz. For more than two years, it’s been sitting on a shelf in our lab, unrefrigerated. (Frankly, we’re afraid to eat it now.)

Those early attempts were all made from simple pastes that hardened when dried or cooled. But while a paste-based diet may have sufficed for the early astronauts, it’s too limited for most people. For digital cooking to really catch on, we concluded, the printers needed to accommodate a larger range of recipes, ingredients, and cooking temperatures.

Getting the printers to operate at the right temperatures for different types of food is not easy. Food, unlike plastic, can change dramatically over a relatively short period of time: A batch of frosting made in the morning may work fine at one temperature, but the same batch later in the day may not. Now consider the huge array of possible ingredients and the different settings that each would need, and you can see why creating a truly useful home food printer seemed at first impossible. Then Cornell University graduate student Daniel Cohen had an idea.

What was needed, he thought, was the equivalent of an RGB standard for food. RGB stands for red, green, and blue, the basic color elements used in televisions to reproduce a rainbow of colors; a similar set of basic colors—cyan, magenta, and yellow—are used in inkjet printers. Cohen’s idea was to create a similarly standard set of elements for the food printer that would make it simpler to produce a variety of foods—and also allow you to share your designs, so that you could “send” a piece of cake to your uncle’s printer.

With Cohen and undergraduate students from Cornell’s school of hotel administration, we began to look for these few printable ingredients that could be used to build many different food types. We didn’t have to look far. A huge industry already exists to devise food flavors and colors that can make just about anything look and taste like something else. Supplements like vitamins, minerals, and fibers are also widely available.

The only problem, then, was getting the right texture. For that we turned to hydrocolloids—materials like carrageenan, xanthan gum, and gum arabic—that today appear on many food labels. They’re the thickeners in McDonald’s milkshakes, for instance. We brought in other gelling agents like those used in Jell-O desserts. We were already familiar with some of these substances, having used them to help print living cells. This time, we mixed the gels and gumming agents with other ingredients and then put them through our printer to create edible constructs like cubes of milk, raspberry domes, and mushroom-shaped bananas.
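Cohen's "RGB for food" idea can be sketched as a tiny data model. In this hedged illustration, the cartridge names and the recipe are invented; the point is only that a dish reduces to proportions over one standard cartridge set, which is what makes a recipe shareable between printers:

```python
# Invented standard cartridge set for illustration. A real standard would
# be agreed upon industry-wide, the way RGB is for displays.
STANDARD_CARTRIDGES = {"protein_base", "starch_base", "sweetener",
                       "flavor_banana", "xanthan_gum", "carrageenan"}

def validate_recipe(recipe):
    """A shareable recipe maps standard cartridges to mass fractions
    that sum to 1; anything else can't be reproduced on another printer."""
    unknown = set(recipe) - STANDARD_CARTRIDGES
    if unknown:
        raise ValueError(f"non-standard cartridges: {unknown}")
    if abs(sum(recipe.values()) - 1.0) > 1e-6:
        raise ValueError("fractions must sum to 1")
    return True

# A hypothetical "banana mushroom" expressed over the standard set:
banana_mushroom = {"starch_base": 0.55, "flavor_banana": 0.05,
                   "sweetener": 0.10, "xanthan_gum": 0.30}
print(validate_recipe(banana_mushroom))  # -> True
```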

While these recipes demonstrated Cohen’s principle, they were also a little too weird. Offer someone a plate of banana mushrooms and milk cubes and you’ve entered the uncanny valley of food, where nothing feels quite right and everything screams “artificial.” Most home cooks aren’t ready to go there just yet.

Some researchers do see a future, however, in digitally designing food from basic flavors and supplements, in large part because it could be a more efficient way to produce nutritious but otherwise expensive food. Researchers at TNO (the Netherlands Organisation for Applied Scientific Research), are extracting basic carbohydrates, proteins, and nutrients from algae, insects, and the like and then using them to print something resembling steak and chicken. Eventually, this may allow them to print a filet mignon from a protein that requires far less water, energy, and labor than does a cow. TNO isn’t the only place exploring this realm. Susana Soares at London South Bank University has used a flour made from crushed bugs to print edible objects that look like butterfly wings and honeycombs.

While this approach could someday solve the Malthusian concerns of food production, it’s a hard idea to swallow. The trend these days is to back away from highly processed foods. Last year a number of huge meat-packing operations were shuttered after news about “pink slime,” a processed meat paste added to ground beef, hit the media. Who would want to risk their business on a pink-slime machine, especially if that slime comes from bugs?


Instead of designing foods from basic materials—from the bottom up—we’ve recently turned toward a top-down approach. That is, we’re taking existing foods and modifying them to make them printable. The idea came to one of us (Lipton) while flying back to New York after presenting a talk. While flipping through the in-flight magazine, he learned that David Arnold, a world-class chef, wanted to get his hands on a 3-D printer. We quickly arranged to send him a Fab@Home printer. In the first 24 hours of our collaboration with Arnold, we made deep-fried scallops shaped like space shuttles and sculptures made out of turkey with celery centers. Using the printer to creatively customize food shapes, we discovered, is a lot more appealing than crafting milk cubes out of hydrocolloids.

Inspired by Arnold, we then set out to push food printing even further. After all, anyone can use a mold or cookie cutter to shape food, but only a 3-D printer can easily create internal designs and intricate sculptures. We started with a recipe we got from Franz Nigl, a visiting scientist from the Pacific Northwest National Laboratory, whose Austrian grandmother’s Christmas cookies were notable for holding their shape when baked. We made batch after batch of the dough, cramming it into the printer’s cartridges and fine-tuning the recipes and the machine. We then programmed the printer to etch a message onto the top of each cookie, and eventually, we created a cookie that had writing inside it.

But message or no, a cookie is still a cookie. Our next stop was the International Culinary Center in New York City, where in January 2011 we began experimenting with food that could be made only by using a 3-D printer and that would be unlike any food we had ever eaten before, yet similar enough to avoid the “ick” factor. The result? A new form of fried corn dough. Now in a world in which state fairs offer up deep-fried Twinkies, you would think that there would be few frontiers left for fried or corn-based foods. But the 3-D printer has opened up entirely new ways of modifying textures. By printing meandering streams of our corn dough, we created a porous matrix that allowed the frying oil to penetrate much deeper into the food. The result was something delicately crispy and greasy, like a cross between a doughnut, a tortilla chip, and raw ramen noodles.

Our food explorations continue. Digital cooking is still a nascent field, but we’re amazed at how much progress has already been made: From those humble peanut butter, hummus, and chocolate objects, it has already morphed into a movement that could someday transform how we prepare and consume food. While some people believe the future of printed food will begin at the chemical level, others think it will become a common tool to augment the molds, knives, and ovens we already have. Regardless, both camps agree that the information age has begun to work its magic in the kitchen.

And once you get started, it’s hard to stop. To reward ourselves for finishing this article, we went back into the lab and printed ourselves a couple of cookies. To do that we ran software that considered our scheduled activities for the day, our food intake, and our individual heights and weights, and we then programmed our food printer to layer both sugar-free and sugar-rich dough to create a cookie that had just the right number of calories to fill out our calorie deficits for the day.
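The calorie-matching step in that last experiment comes down to solving one linear equation. As a sketch (the calorie densities and cookie mass here are invented placeholders, not the authors' actual values), given the day's remaining calorie budget and a fixed cookie mass, the split between sugar-rich and sugar-free dough falls out directly:

```python
def dough_split(cookie_mass_g, calorie_target, kcal_rich=4.5, kcal_free=2.0):
    """Grams of sugar-rich vs. sugar-free dough so a cookie of fixed mass
    hits calorie_target. Densities (kcal/g) are illustrative placeholders.

    Solves rich * kcal_rich + (mass - rich) * kcal_free = target for rich.
    """
    rich = (calorie_target - cookie_mass_g * kcal_free) / (kcal_rich - kcal_free)
    rich = min(max(rich, 0.0), cookie_mass_g)  # clamp to a feasible split
    return rich, cookie_mass_g - rich

# A 50 g cookie that should deliver exactly 150 kcal:
rich, free = dough_split(50, 150)
print(round(rich, 1), round(free, 1))  # -> 20.0 30.0
```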



About the Author

Jeffrey Lipton was the project lead of Fab@Home, one of the first fully open-source 3-D printers, which in 2006 helped move 3-D printers into the consumer market in general and into food printing in particular. Lipton is now chief technology officer of Seraph Robotics, a company that makes Fab@Home printers and parts; he’s also a Ph.D. student in Cornell University’s Creative Machines Lab. Hod Lipson is a professor at Cornell University, where he runs the Creative Machines Lab, and is coauthor of the book Fabricated: The New World of 3D Printing (John Wiley & Sons, 2013). When not making or eating printed food, Lipton and Lipson have a passion for advanced manufacturing and for as-yet-unprintable gastronomic miracles like doughnut burgers.


NSA Won’t Jettison Contractors, Yet

Agency to Scrutinize Its Security Mechanisms

By Eric Chabrow, June 13, 2013


The director of the National Security Agency declined to say that the electronic spy agency would stop using contractors in top secret IT positions to prevent a leak such as the one that exposed NSA programs to collect metadata on U.S. citizens.

Gen. Keith Alexander also announced at a Senate hearing on June 12 that the Defense Department will examine the oversight mechanisms in place that are designed to prevent all individuals with top secret clearances, not just contractors, from accessing information they’re not entitled to see.

Responding to a question on whether the NSA should stop using contractors in sensitive positions, Gen. Keith Alexander said, “I’m not prepared to make that statement, yet.”

The question was raised by Sen. John Boozman, R-Ark., because Edward Snowden, a 29-year-old systems administrator with top secret clearance working for government contractor Booz Allen Hamilton, leaked details about two top secret programs to collect metadata about telephone calls and Internet activities in an attempt to identify terrorists. “There are good contractors out there who are doing a good job,” Alexander said during the Senate Appropriations Committee hearing.

The Role of Contractors

The federal government relies heavily on contractors in many fields because it doesn’t have the expertise on staff to meet its needs; that’s especially true in information technology and information security. Contractors go through the same security clearance process as do federal employees, and they take the same oath not to disclose government and military secrets. “Their paychecks just come from two different sources,” Evan Lesser, managing director of the jobs website, tells Information Security Media Group.

Lesser points out that Army Pfc. Bradley Manning, on trial for disclosing a quarter-million sensitive and secret diplomatic cables to WikiLeaks, was a government employee, not a contractor. In vetting individuals for top secret security clearances, the government looks at their finances; foreign travel; and habits, such as drug and alcohol use and gambling, that could compromise them.

“From a clearance standpoint, there’s not a whole lot in their background that is going to likely trip them up,” Lesser says. “The younger you are, the easier it is to get a clearance because you have less baggage, less history. In the case of these two, it doesn’t sound like there was anything in their past that would raise a red flag during the clearance process.”

At the hearing, Alexander also dismissed Snowden’s contention that the contractor could tap into virtually any American’s phone calls or e-mail. “I know no way to do that,” he said.


Lengthy Investigation Expected

The NSA director also cautioned that it would take considerable time for the Defense Department to examine the security mechanisms in place designed to prevent individuals with top secret clearances from accessing information they’re not entitled to see. He said the investigation will look at procedures for when to encrypt data.

“I don’t want to mislead you; this is a significant effort for the Defense Department,” he said, adding that he has the backing of Defense Secretary Chuck Hagel and Joint Chiefs of Staff Chairman Gen. Martin Dempsey. “We’re pushing this; this is the right way to go. I wish we could go back in time.”

The federal government uses a variety of tools that could identify the activities of employees. Those include keylogging software and computer logs that pinpoint staff members’ whereabouts and actions within federal IT systems and networks, sources familiar with the federal government’s security clearance systems say. But having the tools in place – and not all tools are used by all agencies at all times – doesn’t mean that the proper authorities are alerted in a timely manner to activities that could jeopardize the nation’s security.

Alexander, who also serves as commander of the U.S. Cyber Command, said the leak has caused great harm. “The consequence of this is that our security has been jeopardized,” Alexander said. “There is no doubt in my mind that we will lose capabilities as a result of this, and not only the United States but those allies that we have helped will no longer be as safe as they were two weeks ago.”

‘Americans Will Die’

Alexander said the surveillance programs prevented dozens of possible terrorist events in the U.S. and abroad, but he declined to identify them for security reasons.

Because the programs have been disclosed, however, Alexander said it’s incumbent on political and military leaders to explain the benefits of the programs to the American people. And he said the NSA will be transparent, within limits, disclosing some information about the programs, but not everything. “Some of these are still going to be classified and should be, because if we tell the terrorists every way we’re going to track them, they will get through and Americans will die.”

The End of the Military Industrial Complex

By DAVID FRANCIS, The Fiscal Times June 14, 2013


Defense Secretary Chuck Hagel told lawmakers Wednesday that the sequester, along with continued budget uncertainty, was making it difficult for the Pentagon to properly plan and prepare to confront 21st Century threats.

“We are now in a different fiscal environment dealing with new realities that will force us to more fully confront these tough and painful choices, and to make the reforms we need to put this Department on a path to sustain our military strength,” Hagel told the Senate budget committee. “The onset of these resource constraints has already led to significant and ongoing belt-tightening in military modernization, force structure, personnel costs, and overhead expenditures.”

Hagel said the Pentagon would be forced to make cuts to personnel benefits, reduce force size, and delay training of new recruits. But some of the most dramatic cuts are expected to occur in the acquisitions budget, which pays contractors for the machinery they produce and the services they provide.

Earlier in the year, there was concern that the $43.2 billion removed from DOD’s budget this year, combined with an additional $600 billion in cuts over the next decade, would hamstring the broader economic recovery. As the chart below shows, DOD spending accounts for nearly 6 percent of gross domestic product.

Hubert van Tuyll, a professor of economics at Georgia Regents University and co-author of Castles, Battles, and Bombs: How Economics Explains Military History, said he was not concerned about DOD dragging down the broader economy.

“When you have a big defense drawdown, it doesn’t really hurt the overall economy that much. But when you have a drawdown, the military establishment has to make a decision to lose people or equipment,” van Tuyll told The Fiscal Times. “It’s the equipment that will go first.”

This is bad news for defense contractors. In the coming years, the massive post-9/11 military industrial complex will be forced to confront a series of challenges, from the drawdown to a poorly defined mission, that will eat away at their bottom lines. This will force defense contractors to make tough decisions about their futures as their research and development budgets shrivel.

Large and small defense contractors made money hand-over-fist after the September 2001 terrorist attacks. The chart below shows just how dramatic their revenue growth has been.

This sharp increase has led to massive profits. In 2002, the combined profits of the five largest U.S.-based DOD contractors – Lockheed Martin, Northrop Grumman, Boeing, Raytheon and General Dynamics – totaled an inflation-adjusted $2.4 billion. Ten years later, profits for these five companies had jumped 450 percent, to $13.4 billion.

According to retired Army Colonel Doug Macgregor, these enormous profits are quickly becoming a thing of the past. “The traditional way of doing business … is not going to last. It’s not going to last because we can’t afford it.”

The Pentagon has yet to specifically outline how it will make funding decisions. According to van Tuyll, history illustrates how the military makes tough spending choices. During the Great Depression, the Army was forced to cut costs that had risen during the First World War.

“The U.S. Army was faced with a similar problem to what they’re facing now,” he said. “Douglas MacArthur and George C. Marshall decided it takes much longer to train new officers than it does to make an airplane, a tank or a gun. So they cut equipment.”

This strategy paid off during World War II. The officers needed to quickly train a fighting force capable of defeating Hitler’s war machine were already in place. The subsequent manufacturing push helped to end the Great Depression.

The contractors are also working against a shift from a large war model to one that uses smaller forces and drones to conduct security operations, Macgregor said. This works against contractors’ preferred way of doing business: big orders paid for by big contracts for big items like planes, tanks, and weapons.

In the coming years, Macgregor said, DOD is likely to give out small initial contracts. If the Pentagon likes the result, it will order more. If not, it won’t award a second contract.

“Contractors don’t like the idea of creating smaller numbers before big orders because they’re not going to get the kind of profits you did in the past. Everyone got rich off the old strategy,” he said.

Macgregor added that DOD’s failure to articulate how it plans to draw down equipment allows these companies to continue to exploit the industrial-age model. As long as the military keeps ordering unnecessary vehicles like the Abrams tank, contractors will keep making money from an outdated war model.

“Industry will begin to do what it has always done,” he said. “Everyone is going to squeeze out profits from whatever they got before they can’t squeeze any more.”

It’s not just the lack of large orders that will take a toll on contractors. Research and development dollars coming out of the Pentagon are likely to dry up as well. These grants allow companies to develop dual-use technology that produces two independent revenue streams. For instance, the Pentagon could pay Boeing to develop an engine for a cargo plane that could also be sold to a commercial airline.

“The American aviation industry got a head start after World War II through what was essentially a military subsidy,” Georgia Regents’ van Tuyll said. “If you learn how to build a good military engine, you can stick it on a commercial airplane.”

For now, the stocks of major defense companies are holding steady. But Macgregor believes all of these factors will eventually lead to job losses once the real bloodletting begins.

Jurgen Brauer, professor of economics at Georgia Regents University and co-author of Castles, Battles, and Bombs, said these companies need to reinvent themselves. A prime example of how a defense contractor can evolve is General Electric. The company made enormous profits and technological advances during World War II. Once the war was won, it began to invest in and acquire other businesses. Now it offers everything from financial management to refrigerators along with military jet engines.

“Last time around, after the end of the Cold War, we saw a huge consolidation (mergers & acquisition) among the major contractors,” Brauer, who is teaching overseas, wrote in an email to The Fiscal Times. “Unlike Europe, there isn’t much more to consolidate among the U.S. ‘majors’ so they may seek even more overseas contracts or, for those with a civilian market portfolio, are likely to shift more deliberately and clearly into nonmilitary work.”



June 14, 2013

Snowden’s Leaks on China Could Affect Its Role in His Fate




HONG KONG — The decision by a former National Security Agency contractor to divulge classified data about the U.S. government’s surveillance of computers in mainland China and Hong Kong has complicated his legal position, but may also make China’s security apparatus more interested in helping him stay here, law and security experts said on Friday.

The South China Morning Post, a local newspaper, reported on Friday that Edward J. Snowden, the contractor, had shared detailed data showing the dates and Internet Protocol addresses of specific computers in mainland China and Hong Kong that the National Security Agency penetrated over the last four years. The data also showed whether the agency was still breaking into these computers, the success rates for hacking and other operational information.

Mr. Snowden told the newspaper that the computers were in the civilian sector. But Western experts have long said that the dividing line between the civilian sector and the government is very blurry in China. State-owned or state-controlled enterprises still control much of the economy, and virtually all are run by Communist Party cadres who tend to rotate back and forth between government and corporate jobs every few years as part of elaborate career development procedures.

Kevin Egan, a former prosecutor here who has represented people fighting extradition to the United States, said that Mr. Snowden’s latest disclosures would make it harder for him to fight an expected request by the United States for him to be turned over to American law enforcement. “He’s digging his own grave with a very large spade,” he said.

But a person with longstanding ties to mainland Chinese military and intelligence agencies said that Mr. Snowden’s latest disclosures showed that he and his accumulated documents could be valuable to China, particularly if Mr. Snowden chooses to cooperate with mainland authorities.

“The idea is very tempting, but how do you do that, unless he defects,” said the person, who insisted on anonymity because of the diplomatic sensitivities in the case. “It all depends on his attitude.”

The person declined to comment on whether Chinese intelligence agencies would obtain copies of all of Mr. Snowden’s computer files anyway if he is arrested by the Hong Kong police pursuant to a warrant from the United States, where the Justice Department has already been reviewing possible charges against him.

A Hong Kong Police Force spokeswoman said earlier this week that any arrest would have to be carried out by the Hong Kong police and not by foreign law enforcement. The Hong Kong police have a responsibility to share with mainland China anything of intelligence value that they find during raids or seizures of evidence, according to law enforcement experts.


Patricia Ho, a lawyer who specializes in political asylum at Daly and Associates, a Hong Kong law firm, said that if Beijing decides that it wants Mr. Snowden to stay in Hong Kong for a long time, the simplest way to do so would be for mainland officials to quietly tell Hong Kong’s government officials not to hurry the legal process.


The United States and China have long accused each other of monitoring each other’s computer networks for national security reasons. The United States has also accused China of hacking to harvest technological secrets and commercial data on a broad scale from American companies and transferring that information to Chinese companies to give them a competitive advantage.


Tom Billington, an independent cybersecurity specialist in Washington, said that mainland China could benefit by obtaining a copy of the data that Mr. Snowden gave to the South China Morning Post. The data, if independently verified, could help Chinese officials figure out which computers have been hacked, patch security holes, itemize compromised data, analyze the quality of computer security defenses and develop techniques for hardening other Chinese computers against future surveillance by the N.S.A.


According to The Guardian newspaper of Britain, Mr. Snowden showed up with four laptop computers for a meeting with its journalists in Hong Kong. But the Los Angeles Times has reported that Mr. Snowden originally smuggled electronic files out of the National Security Agency in Hawaii using a USB thumb drive.


Simon Young, the director of the Centre for Comparative and Public Law at the University of Hong Kong, said in a statement that it would be a violation of Hong Kong law to disclose any information that had been shared confidentially by the Hong Kong or mainland Chinese governments with the United States.


“These recent developments underline the importance of Mr. Snowden obtaining immediate legal advice in Hong Kong, especially before any further disclosures are made,” Mr. Young said.


Mr. Young did not suggest whether any of the data shared by Mr. Snowden would fall into this category. But the Hong Kong government has a history of close law enforcement cooperation with the United States, particularly in the area of counterterrorism. The Hong Kong police have long focused on trying to prevent the territory’s freewheeling financial system from becoming a base for Al Qaeda-related money laundering.


The South China Morning Post said that one target of N.S.A. hacking identified by Mr. Snowden was the Chinese University of Hong Kong, which hosts the city’s main hub for Internet connections to the rest of the world. “The University has not detected any form of hacking to the network, which has been running normally,” the university said in a statement.


The newspaper said that it had not independently verified the accuracy of the data that Mr. Snowden provided. But the United States government has not questioned the authenticity of any of the documents he has released.


The Global Times, a nationalistic mainland Chinese newspaper under the direct control of the Communist Party, published an editorial on Friday calling for China to glean as much information as possible from Mr. Snowden.

“Snowden is a ‘card’ that China never expected,” the commentary said. “But China is neither adept at nor used to playing it.”

The commentary also called for China and Hong Kong to treat Mr. Snowden kindly enough so that others with national security secrets will not be discouraged from fleeing here. “China should make sure that Hong Kong is not the last place where other ‘Snowdens’ want to go,” it said.

The Associated Press reported on Friday that Britain had issued an alert to airlines around the world warning them not to bring Mr. Snowden to its soil, and threatening them with a fine of 2,000 pounds, or $3,125. Geoffrey Robertson, of London, who was an initial lawyer for Julian Assange during the WikiLeaks dispute, criticized the alert as unusual because it was being applied to someone who has denounced government policies.

“This is a power hitherto used only against those who incite terrorism, race hatred and homophobia — never before against whistle-blowers,” Mr. Robertson wrote in an e-mail. “The British government is simply afraid that its judges, who are fiercely independent, and the European court would embarrass its closest ally by ruling that Snowden could not be extradited because, even if his ‘revelations’ prove to be mistaken, he would be subjected to oppressive treatment akin to that being meted out to Bradley Manning,” the American Army private accused of having leaked secrets in the WikiLeaks case.


Rasmussen Reports

What They Told Us: Reviewing Last Week’s Key Polls

Bottom of Form

Saturday, June 15, 2013


News that the National Security Agency is monitoring everybody’s phone records and reading emails dominated the headlines last week. Most voters (57%) fear other government agencies will use the NSA data to harass political opponents. Just 26% support the government monitoring effort.

President Obama, whose deeply held faith in government is unwavering, dismissed the concerns as “hype.” He added, “If people can’t trust not only the executive branch but also don’t trust Congress and don’t trust federal judges to make sure that we’re abiding by the Constitution with due process and rule of law, then we’re going to have some problems here.”

We have a problem.

Just 30 percent of voters nationwide have that much trust in government officials when it comes to these surveillance efforts.

Only 24 percent now are confident that the federal government does the right thing most of the time.

Scott Rasmussen’s weekly newspaper column suggests that this popular distrust of government is the theme that ties all the recent news stories together. “It’s a driving force in the current policy debates over immigration, gun control, health care and more.” Scott adds, “Many in Washington are frustrated by the public distrust. They dream of public relations programs to overcome it. What is needed, though, is for the government to change its behavior, so that it can earn the trust of the people it serves.”

Right now, though, there are enough controversies to rank them in order. The NSA spying is seen as the top concern by 35%, while 22% say the same about the IRS targeting of conservative political groups. Lower on the list are questions about Benghazi and the Justice Department’s snooping on reporters. Among those following the story most closely, the IRS scandal is seen as the most serious.

The distrust factor is playing a huge role in the debate over immigration reform. While most voters support the concept of immigration reform, they do not trust the government to honor and enforce the border security provisions of any new law. That’s why most demand that border security come first.

On health care, the president’s law is progressing along an uncertain path to implementation. However, just 39% of voters have a favorable opinion of it. The distrust factor is on display here as well. Despite government promises that the plan will reduce the cost of care, just 13% believe it will actually lead to lower costs.  Seventy-five percent (75%) think the health care law is likely to cost more than official estimates and 66% believe it will increase the federal budget deficit.

Only 19% believe the new law will improve the quality of care. Forty-eight percent (48%) believe it will hurt. With these concerns in mind, most Americans want the option of less health insurance coverage and more take home pay.  “In that environment, employers will compete to find the best mix of pay and benefits needed to recruit good employees,” Scott Rasmussen said in a recent newspaper column. “As consumers opt for less coverage and more take-home pay, they will effectively repeal a major portion of the president’s health care law.”

Similarly with gun control, Americans overwhelmingly like the idea of requiring background checks for those who want to purchase a gun, but they are very suspicious of where the president and New York Mayor Michael Bloomberg want to go from there. If voters were convinced their plan was for background checks and nothing more, it would have enjoyed broad popular support.

Voters still have more faith in state and local government than they do in the federal government. Not surprisingly, they want to see a number of issues currently handled by the feds shifted to government closer to home.

But while local governments are trusted more, there are concerns about their overreach, too. In New York, city officials have filed discrimination complaints against a deli owner who requires customers to dress modestly. Sixty-eight percent (68%) of Americans think business owners should be allowed to have such a modesty requirement. Just 22% disagree.

The Obama administration now says the Syrian government has crossed a red line by using chemical weapons on its population. As a result, the administration will provide military assistance to Syrian rebels. Earlier polling showed that, even with proof of a chemical weapons attack, just 31% of voters think the United States should provide military assistance to protect the Syrian rebels.

Deputy National Security Adviser for Strategic Communication Ben Rhodes said “Any future action we take will be consistent with our national interest, and must advance our objectives.” However, only 23% of all voters regard Syria as a vital national security interest of the United States. Rhodes said the U.S. interests “include achieving a negotiated political settlement to establish an authority that can provide basic stability and administer state institutions; protecting the rights of all Syrians; securing unconventional and advanced conventional weapons; and countering terrorist activity.”

In other news this week:




June 8 2013




The 3 Biggest Losers in a Slashed Defense Budget

By DAVID FRANCIS, The Fiscal Times

June 3, 2013


Imagine a military force that, for the first time since World War II, cannot fight and win a two-front war.

This version of the U.S. military would also fall behind on maintaining its nuclear deterrence. According to American Enterprise Institute’s Thomas Donnelly, this American military would not be able to accomplish U.S. strategic objectives in the coming years.

“All I can offer you are roads to failure, roads to defeat,” Donnelly said at a Center for Strategic and Budgetary Assessments event held last week. “The current program does not maintain a two-theater force.”

This event brought together leading think tanks in Washington to present competing visions of the Pentagon’s future, in light of sequestration and defense cuts that are expected to remove $500 billion from DOD’s budget over the next decade. Each think tank assumed scenarios in which DOD is operating with $100 billion, $300 billion or $500 billion less than it is now.

The analysis was meant to mirror an ongoing Pentagon process known as a Strategic Choice Exercise, in which the DOD is contemplating spending cuts that Pentagon brass have said will decimate the American military. For years, DOD officials acted as if these cuts would somehow be avoided.

The White House’s long-term spending plans released earlier this year did not account for the full impact of sequestration. But as President Obama made clear in a recent speech, the age of endless Pentagon spending is coming to an end. This realization has the massive Pentagon bureaucracy split. Last week, Army Chief of Staff Ray Odierno warned that shrinking the size of the Army could leave America vulnerable.

“The thing I worry about is that in everybody’s declaration that there’s going to be no more ground wars, we need no more ground forces, that we’re going to make the Army too small,” he said before a meeting of the Atlantic Council. “I see nothing on the horizon yet that tells me that we don’t need ground forces.”

Others within the Pentagon have begun to prepare for the inevitable. Defense Chief Chuck Hagel, along with a number of undersecretaries and high-level civilians, continue to tell the military to prepare for cuts.

“We’ll have to get smaller and we’ll have to look at some areas where we can take some more risk, get rid of more overhead and make a lot of other tough decisions,” Pentagon Comptroller Robert Hale said recently.

A Pentagon spokesperson said the review had not yet been completed and delivered to Hagel, and that DOD had not yet determined whether it would be made public once it’s done.

But it’s clear that the drawdown won’t hit everyone equally. There are three areas where the cuts will be especially severe and will have an impact far beyond the ability of the United States to wage war.



The Pentagon currently has a civilian workforce of some 800,000. Earlier this year, Hagel said that one of the most important parts of DOD budget reduction was to drastically reduce the size of this workforce.

“Despite good efforts and intentions, it is still not clear that every option has been exercised or considered to pare back the world’s largest back office,” Hagel said in April. He has argued that DOD would save as much as $34 billion per year by shrinking the civilian workforce.

The think tanks involved in the strategic review agree with Hagel’s assessment. Their cuts in the civilian workforce range from a loss of 82,000 to 263,000 workers.



Despite Odierno’s protests, nearly everyone expects the Army to shrink. This will occur through base closures and troop reduction.

The think tanks estimated that a round of base closings, known as BRAC, could save between $5 billion and $30 billion in the coming years. This process is likely to affect the Army disproportionately, as it has the largest number of bases.

The think tanks’ analyses also recommend reductions of Army ground forces of between 70,000 and 163,000 troops. Army reserves would be pared down by up to 58,000 reservists.



The think tank review did not address the impact of these cuts on the wider economy. And they did not address how the downsizing of the Pentagon contracting process, an inevitable result of a cut in DOD spending and one that has yet to be fully understood, would impact national GDP growth.

Because defense spending has accounted for such a large portion of government spending in recent years – nearly 20 percent in 2012 – the changes outlined by the think tanks would be felt not just by those connected to the Pentagon, but by all Americans.

And everyone in the large defense policy community is in agreement that these cuts must occur in some way and at some level, meaning that DOD and Congress are likely to draw the same conclusions soon.

“Given the range of strategies you’ve heard here, there are some things that everyone agrees on … regardless of which strategy you end up pursuing, these are things that are likely to happen,” CSBA senior fellow Todd Harrison said last week. “It’s pretty remarkable, [that] people across the aisle, across a broad political spectrum, can agree on these things and yet Congress can’t.”



White House Unveils First-Ever Inventory of Federal Programs

By Charles S. Clark

May 31, 2013


Embarking on a project long-sought by lawmakers and government efficiency experts, the Office of Management and Budget on Friday released a unique inventory of federal programs in 24 departments.

Report at


As called for in the 2010 Government Performance and Results Modernization Act, the inventory of some 1,600 programs was prepared by departments using a Government Accountability Office definition that deemed a program an “organized set of activities directed toward a common purpose or goal that an agency undertakes or proposes to carry out its responsibilities.” Each program listed includes a title, description and link to the strategic goals and objectives of the agency.

OMB spokeswoman Ari Isaacman Astles said “each agency further defined programs in a way that reflects how the agency delivers and discusses its activities with Congress and other stakeholders.”

In meeting Friday’s deadline for the inventory’s first phase, agencies posted their lists on their own websites and as discrete units on After agencies receive feedback on the program definitions, an updated release next year will add detail to the entries, supply links to further information and centralize them within

For years, many in Congress have expressed frustration at the inability of agencies to precisely itemize all the programs that receive funding. Such information has been scattered across the president’s budget, congressional budget justifications, and the Catalog of Federal Domestic Assistance. An OMB backgrounder notes that “reporting about the government’s activities is often done in a siloed and decentralized way, which inhibits coordination across agencies and cross-cutting analysis across programs.”

Sen. Tom Coburn, R-Okla., in an essay earlier this year titled “Duplication Nation,” complained that “it is nearly impossible to fully comprehend the vast expanse of federal programs that exist today. Although various sources, including USA Spending and documents released by OMB and CBO produce partial lists of various government programs, there is not an exhaustive list of federal programs. Every federal department is now administering programs that address challenges tasked to be addressed by other agencies. The government has grown so large and unmanageable, that even the experts, and the departments themselves, cannot compile a list of all federal programs within their purview.”

In releasing the first inventory, which focused on permanent programs, the White House linked it to President Obama’s ongoing efforts to eliminate duplication and waste. “In each of his five budgets, the president identified, on average, more than 170 cuts, consolidations, and savings, totaling about $25 billion each year,” it said. “The 2014 Budget proposes 215 cuts, consolidations, and savings proposals, which are projected to save more than $25 billion in 2014.”

The Obama team also cited its progress in implementing GAO recommendations for curbing duplicative programs, and referenced its pursuit — again under the GPRA law — of cross-agency priority goals to improve government efficiency.

The release was hailed as “an important tool” by Sen. Tom Carper, D-Del., chairman of the Senate Homeland Security and Governmental Affairs Committee. “If implemented correctly, the Federal Program Inventory can also assist agencies and Congress by aligning programs with goals and desired outcomes so that these programs can be managed more efficiently.” But more work is needed, Carper said, promising to work with colleagues and OMB to “refine the list and to fully implement the Performance Act.”


Sen. Mark Warner, D-Va., also welcomed the release, saying, “We are finally getting the data we need to make a careful review of the structure of federal programs across the government to help inform decision-making in Congress. Identifying the programs at each agency is critical to making sure we reduce the amount of duplication, improve efficiencies, and ensure that programs with similar missions work together across agency silos. We will review these lists carefully, and I look forward to the addition of more program information that will help identify areas for collaboration and taxpayer savings.”

Robert Shea, an OMB official during the George W. Bush administration and now a principal with Grant Thornton LLP, called the release “a good start — almost like a to-do list. If we have a bird’s eye view of the programs out there, we can begin to catalogue what they cost and what we’re getting in terms of results, and start diligently deciding whether these are areas ripe for consolidation or improvement or elimination,” he told Government Executive.

“There’s probably a lot of confusion as to why it takes so long for government to get a handle on what’s going on in government,” Shea added. “These recent scandals have raised the specter of a government too large to be managed. This makes the case too that if we’ve got so many different ways of achieving similar objectives, is that really manageable?”

David Walker, the former Comptroller General now chairing an advocacy coalition called the Government Transformation Initiative, praised the first effort but said more needs to be done. “The creation of a governmentwide inventory of federal programs helps facilitate the coordination of activity across the federal government,” he said in a statement. “We already have numerous compilations of government programs with the Federal Activities Inventory Reform Act, GAO Reports and inspector general reviews, but they lack actionable recommendations to ensure the government is operating as efficiently and effectively as possible.”

The inventory, Walker said, demonstrates “the need for an independent task force or commission to make specific and actionable recommendations to improve the economy, efficiency, effectiveness, and credibility of the federal government.”

DoD Inventory report at


Unemployment Compensation for Furloughed Feds?

By Eric Katz

May 31, 2013


Some federal employees will take week-long furloughs in an attempt to collect unemployment insurance, according to a CNN report.

A union local in Philadelphia — an affiliate of the International Federation of Professional and Technical Engineers — has negotiated with Navy officials to allow the bunched furloughs, rather than the one-day per week schedule outlined by Defense Secretary Chuck Hagel. Navy employees would not qualify for unemployment in Pennsylvania if they still worked four days of the week, as recipients must earn less than $745 per seven-day period. “We will begin furloughs on July 8 at the rate of one furlough day per week for most personnel,” Hagel wrote in furlough notices that went out this week.

He did leave the door open for some flexibility, however.


“Consistent with this memo and with applicable laws and rules, commanders and managers will have the authority to develop the specifics of furlough procedures in order to minimize adverse mission effects and also limit the harm to morale and productivity,” he wrote. “Further bargaining with unions may also be required.”

A Pentagon spokeswoman said the approach was unusual, but not prohibited.

“Employees may request a specific furlough schedule,” Cmdr. Leslie Hull-Ryde told Government Executive. “However, the department’s approach has generally been one day a week, two days per pay period not to exceed a total of 88 hours.”

One employee, according to CNN, hoped to recoup about one quarter of the $4,400 he will lose due to unpaid leave through unemployment.

The Labor Department has issued several pieces of guidance on which federal employees may be eligible for unemployment and how to go about collecting it.

“While on furlough, federal employees may become eligible for unemployment benefits under the Unemployment Compensation for Federal Employees,” Labor wrote in one memo. “Some states may increase normal working hours in order to process the high volume of UCFE claims, if necessary.”

The department advised in another document: “In most cases, individuals on furlough status for only a couple of days during a week would not be eligible for a partial payment for that week.”

Labor bills agencies for UCFE benefits each quarter, which could in turn negate some of the savings obtained by furloughs. The Office of Management and Budget did not respond to a request for comment.

Not all employees plan to take the lumped-furlough approach to become eligible for the out-of-work benefit, however. Tim Kauffman, a spokesman for the American Federation of Government Employees — the largest federal employee union — said one unpaid day off per week has already been negotiated and finalized for Defense civilians.

“My understanding is that was what was agreed to in negotiations, which are already complete,” Kauffman said.

Hull-Ryde, the Defense spokeswoman, said unemployment compensation requirements differ by state and pointed to a Labor Department tool to help federal employees determine their eligibility.



Pentagon, Regional Staffs Growing Despite Orders to Trim Personnel

Data Show 15% Increase From 2010 to 2012


Jun. 2, 2013 – 08:57 PM



Staffs Grow

Overall, staff sizes of major US military commands grew by 15 percent from 2010 to 2012, despite then-Defense Secretary Robert Gates’ call to reduce staff sizes as a way of cutting redundancy and saving money.

Organization   Size 2010   Size 2012   Change    % Change
OSD                2,433       2,665     +232       +9.5%
Joint Staff        1,286       4,244   +2,958      +230%
AFRICOM            1,661       1,919     +258      +15.5%
CENTCOM            2,686       3,207     +521      +19.4%
EUCOM              2,494       2,286     -208       -8.3%
NORTHCOM           1,585       1,687     +102       +6.4%
PACOM              3,825       4,147     +322       +8.4%
SOUTHCOM           1,795       1,797       +2       +0.1%


WASHINGTON — The size of the Pentagon’s vast oversight organizations grew by more than 15 percent from 2010 to 2012, despite efforts to pare down the US Defense Department’s bureaucracy, a Defense News analysis has found.

On Aug. 9, 2010, then-Defense Secretary Robert Gates said the Pentagon needed to cut staff sizes. He made this task part of his efficiencies initiative — an effort to save hundreds of billions of dollars through better business practices. The military services’ incentive for accomplishing these tasks was that they would be able to get back some of that money to reinvest in other priorities.

“Constraining the personnel available is one way to force this painful but necessary process to take place,” Gates said then. “Therefore, I am directing a freeze on the number of OSD [Office of the Secretary of Defense], defense agency and combatant command [COCOM] positions, at the FY10 levels, for the next three years.”

But almost three years later, staff sizes within OSD, the Joint Staff and COCOMs have grown, prompting a new round of calls from senior Pentagon officials and defense observers to truncate the so-called “fourth estate.”

The Joint Staff, for example, grew from 1,286 people in 2010 to 4,244 people in 2012, a 230 percent increase.

“The problem is the bureaucracy is more resilient than even the most powerful secretary,” said Arnold Punaro, a retired Marine Corps major general, consultant and member of the Defense Business Board.

Unlike prior efforts to cut back staff positions, however, DoD officials have more of an incentive to do so now, experts say. Since DoD’s budget is capped and with more defense spending cuts on the horizon, maintaining staff size means reducing spending in other areas, such as training, research and weapon procurement.

Even though staff sizes grew over the past three years despite efforts to freeze or reduce them, Defense Secretary Chuck Hagel has made reducing this type of overhead a major priority.

Hagel signaled the rise in overhead costs during his first major policy speech in early April at National Defense University. He said DoD needs to “pare back the world’s largest back office.

“Prior efficiency campaigns yielded substantial savings from the services, and some from the DoD elements known as the ‘Fourth Estate,’ which consists … of the Office of the Secretary of Defense, the Joint Staff, the combatant commands and the defense agencies and field activities — the Missile Defense Agency as well as those that provide health care, intelligence and contracting support,” Hagel said. “We need to relook at funding for these activities, which won’t be easy.”


The Growth

Between 2010 and 2012, OSD, the Joint Staff and COCOMs added about 4,500 positions, according to a Defense News analysis of multiple DoD personnel documents and interviews with experts. More than 65 percent of the staff size growth was within the Joint Staff, the organization at the Pentagon that oversees the uniformed military and global operations.

The staff sizes do not include the thousands of contractors working within each organization.

The majority of the growth within the Joint Staff stems from the closure of US Joint Forces Command (JFCOM), which promoted and organized training between the military services. Gates ordered the closure of JFCOM in 2010.

That year, the Joint Staff had just shy of 1,300 military and civilian positions. In 2012, that number rose to more than 4,200.

About 2,500 of these positions were directly attributed to absorbing Joint Forces Command’s duties and responsibilities, Joint Staff spokesman Lt. Col. Larry Porter said.

“They say they closed JFCOM,” Punaro said. “They did not close … they added them to the Joint Staff.”

This year, the Joint Staff has cut more than 1,000 positions and has about 3,100 military and civilian billets, Porter said. And more cuts are on the way as another 100 positions — possibly more — are marked for elimination in 2014.

Collectively, the Joint Staff and COCOMs are planning to cut 400 headquarters staff positions over the next five years, which the Pentagon says is part of a nearly $900 million overhead savings plan, according to DoD data within a Government Accountability Office (GAO) report.

OSD is authorized to have 2,540 positions, said a Pentagon spokesman, Lt. Col. Tom Crosson, slightly down from 2,665 in 2012.

But when contractors are included, that number is much higher. When Punaro looked into Pentagon staff sizes for Gates, he found about 5,000 people — civilian, military and contractors — working at OSD.

“They need to bite the bullet in their own backyard if you are the secretary of defense and you want the rest of the Pentagon to tighten their belt,” Punaro said.


Trimming the Fat

DoD operates six geographic COCOMs that oversee military operations in different parts of the world. Each of those commands is supported by a subordinate, service-specific command. For example, Army Pacific, Marine Forces Pacific, Pacific Fleet and Pacific Air Forces are the service components of US Pacific Command.

Experts say DoD can trim more fat there by eliminating redundant positions across the COCOMs and the subordinate service commands.

In a May report, GAO — the nonpartisan, investigative arm of Congress — found significant overlap between the service-supporting commands and the COCOMs. Overlap is common in a variety of positions, such as collecting intelligence, coordinating operations, performing strategic planning and policy and supporting communications.

“Even though the combatant commands rely on the service component commands’ personnel to support their missions and operational requirements, they do not have oversight or visibility into the service component commands’ authorized manpower or how the components determine the size and structure of their staff to support the combatant commands’ missions,” GAO said. “Based on our analysis of data that we gathered, in fiscal year 2012, there were 7,795 authorized positions at the headquarters of the service component commands, which was more than double the 3,817 authorized positions at the headquarters of the combatant commands.”


Moreover, the COCOMs do not have clear information regarding personnel assigned to the supporting service commands.

Since Sept. 11, 2001, civilian positions at the COCOMs — not including US Central Command — almost doubled from 2,370 in 2004 to 4,450 in 2012, according to GAO. However, the number of authorized military positions decreased about 9 percent from 6,250 to 5,670 across that same period.

The headquarters support costs reviewed by GAO at the COCOMs — including civilian pay, contract services, travel and equipment — more than doubled from $500 million in 2007 to $1.1 billion in 2012. Contract services and civilian pay were the primary drivers of the increase, GAO said.

During a budget-cutting drill conducted by four Washington think tanks — the Center for Strategic and Budgetary Assessments (CSBA), Center for Strategic and International Studies, Center for New American Security and American Enterprise Institute — teams from each cut DoD’s nearly 800,000-person civilian workforce by between 82,000 and 263,000 people.

“I think that that’s remarkable, given the range in strategies, the range of view of people [participating] that they had so much agreement in this area in particular,” said Todd Harrison, a CSBA analyst who helped organize the drill.

The Pentagon also estimates it has more than 700,000 contractors working alongside its civilian and military workforce, but the exact number is unknown.

“They cannot really tell you, [and] Congress is frustrated,” Punaro said.

Those estimates would make the contractor force around the same size as the Pentagon’s entire civilian workforce.

“In business, if you cannot control your headcount, you are doomed,” Punaro said. “So, they do not control their headcount, and they do not have mechanisms to control the headcount. There ought to be someone that owns headcount, and you cannot increase it without higher authority.”


3-D printing goes from sci-fi fantasy to reality

By MARTHA MENDOZA | Associated Press – Sun, Jun 2, 2013


SAN MATEO, Calif. (AP) — Invisalign, a San Jose company, uses 3-D printing to make each mouthful of customized, transparent braces. Mackenzies Chocolates, a confectioner in Santa Cruz, uses a 3-D printer to pump out chocolate molds. And earlier this year, Cornell University researchers used a 3-D printer, along with injections of a special collagen gel, to create a human-shaped ear.

Once a science-fiction fantasy, three-dimensional printers are popping up everywhere from the desks of home hobbyists to Air Force drone research centers. The machines, generally the size of a microwave oven and costing $400 to more than $500,000, extrude layer upon layer of plastics or other materials, including metal, to create 3-D objects with moving parts.

Users are able to make just about anything they like: iPad stands, guitars, jewelry, even guns. But experts warn this cool innovation could soon turn controversial — because of safety concerns but also the potential for the technology to alter economies that rely on manufacturing.


“We believe that 3-D printing is fundamentally changing the manufacturing ecosystem in its entirety — how and where products are made and by whom,” said Peter Weijmarshausen, CEO of New York-based Shapeways, an online company that makes and sells 3-D printed products designed by individuals. Products include a delicate, twig-like egg cup (cost: $8.10) and a lamp that looks like a nuclear mushroom cloud (cost: $1,388.66).

“We’re on the verge of the next industrial revolution, no doubt about it,” added Dartmouth College business professor Richard D’Aveni. “In 25 years, entire industries are going to disappear. Countries relying on mass manufacturing are going to find themselves with no revenues and no jobs.”

On ground, sea or air, when parts break, new ones can be made on the spot, and even the tools to install them can be made, eliminating the need for staging parts in warehouses around the world, said Jeff DeGrange, vice president of Direct Digital Manufacturing at Stratasys Inc., currently the industry leader in a field of about 50 3-D printer companies.

“We’re going to see innovation happening at a much higher rate, introduction of products at a much higher rate,” said DeGrange. “We live in an on-demand world now, and we’ll see production schedules are going to be greatly compressed.”

Airplane mechanics could print a replacement part on the runway. A dishwasher repairman could make a new gasket in his service truck. A surgeon could print a knee implant custom-designed to fit a patient’s body.

But the military, D’Aveni said, is likely to be among the first major users of 3-D printers, because of the urgency of warfare.

“Imagine a soldier on a firebase in the mountains of Afghanistan. A squad is attacked by insurgents. The ammunition starts to run out. Is it worth waiting hours and risking the lives of helicopter pilots to drop it near you, or is it worth a more expensive system that can manufacture weapons and ammunition on the spot?” he said.

In the past two years, the U.S. Defense Department has spent more than $2 million on 3-D printers, supplies and upkeep, according to federal contract records. Their uses range from medical research to weapons development. In addition, the Obama administration has launched a $30 million pilot program that includes researching how to use 3-D printing to build weapons parts.

NASA is also wading into this arena, spending $500,000 in the past two years on 3-D printing. Its Lunar Science Institute has published descriptions of how it is exploring the use of the printers to build everything from spacecraft parts printed in orbit to a lunar base.

While the U.S. is pursuing the military advantages of 3-D printing, it’s also dealing with the potential dangers of the technology. On May 9, the State Department ordered a group to take down online blueprints for a 3-D printable handgun, and federal lawmakers and some state legislatures are contemplating proposals to restrict posting weapons plans in the future.

Since 2007, when these printers first entered the mainstream marketplace, sales have grown by 7.2 percent each year, according to IBIS World, a company that tracks the industry. Sales are projected to jump from about $1.7 billion in 2011 to $3.7 billion in 2015.

Cliff Waldman, a senior economist at the Manufacturers Alliance for Productivity and Innovation, a group that promotes the role of manufacturing in global economies, said it’s still too soon to know exactly what impact this 3-D technology could have on more traditional manufacturing. However, he doesn’t envision it changing the “fundamental shape” of manufacturing, as others suggest.

“I think 3-D has the capacity to impact both products and processes,” he said. “I am not ready to say that it is completely disruptive, however. It might be in a few narrow industries.”

Starting in June, office supply chain Staples plans to become the first major retailer to carry 3-D printers, offering “the Cube,” a plug-in device that uses 16 colors and costs $1,299. And in September the smallest and cheapest 3-D printer on the market — a printing pen priced from $50 — is due to start shipping. Similar to a glue gun, the 3Doodler plugs into the wall and is filled with cylinders of plastic that come out of a 518-degree Fahrenheit tip. Once the plastic leaves the pen, it cools and hardens.

Makers Peter Dilworth, an inventor at the Massachusetts Institute of Technology, and Maxwell Bogue, a toy maker, first pitched their pens earlier this year on a website for startup projects. They sought $30,000 and wound up collecting $2.3 million from more than 26,000 backers, who each got one of the 3-D pens. Four artists who teamed up with the men have used the pens to make a mini Eiffel Tower, earrings and butterfly pendants.




House panel passes $638B defense authorizations bill

The Hill

By Jeremy Herb – 06/06/13 02:31 AM ET


The House Armed Services Committee passed its sweeping Defense authorization early Thursday morning, authorizing $638 billion in defense spending.

The Pentagon policy bill includes a provision stripping commanders’ ability to overturn guilty verdicts, intended to address a rise in military sexual assaults; a prohibition on transferring Guantánamo detainees to the United States; and a rejection of new base closures.

The committee passed its authorization bill on a 59-2 vote after a 16-hour mark-up, which began Wednesday morning and lasted until 2:14 a.m. Thursday. The bill will be debated on the House floor next week.

Reps. Jackie Speier (D-Calif.) and John Garamendi (D-Calif.) were the only committee members to vote against the final legislation.

The bill provides an increase of $5.1 billion over the Pentagon request for the war in Afghanistan, and it sets base Pentagon spending at $526.6 billion, the same amount requested in President Obama’s budget.

That funding level is $52.2 billion over the budget caps set by sequestration, however, a topic that was debated only sparingly on Wednesday.

Because the panel’s bill was over the budget caps — as are the Senate and Obama administration budgets — the Pentagon could be facing another across-the-board cut in 2014 if sequester is not averted.

“I think in this committee there’s a growing awareness that sequestration is a fact of life, so whatever we do here today will wind up being reduced by a significant amount,” said Rep. Adam Smith (D-Wash.), the top Democrat on the committee.

After 1 a.m. Thursday, the committee debated a measure from Rep. Jim Cooper (D-Tenn.) to give the Pentagon $20 billion in flexibility to transfer funds between accounts, up from $3.5 billion granted in the authorization bill.

Pentagon officials have warned that the military is facing shortfalls in its operations and maintenance accounts and they’ve said flexibility could help alleviate the problem to some degree.

“This is insanity, and so far this committee has not done anything about it,” Cooper said. “I’m not saying this is a perfect solution, but it is a start… I am sorry it is one in the morning before we face the elephant in the room.”

The amendment failed, however, on a 16-45 vote, and was opposed by both Armed Services Chairman Buck McKeon (R-Calif.) and Smith.

“What you’re asking is to give full discretion for $20 billion. That really doesn’t solve the problem,” McKeon said. “What we really need to do is get rid of the sequestration.”

As the Pentagon faces tightened budgets, Smith criticized the committee for rejecting several cost-cutting measures that the Pentagon had requested, including a new round of base closures and new healthcare fees.

“I don’t think this committee has the luxury to be so parochial,” Smith said.

He was clearly in the minority, however, as the committee rejected, 44 to 18, his amendment to reverse a restriction on Pentagon planning for future base closures.

Smith also fought a losing battle to lift a restriction on transferring detainees from Guantánamo onto U.S. soil, which has been included in the past several Defense authorization bills. The prohibition is a key roadblock to President Obama’s new push to close the detention facility.

Just as the full House passed a restriction on building U.S. facilities in the military construction appropriations bill, the committee rejected Smith’s amendment on a 23-38 vote.

The panel did not get into a major debate on military sexual assault Wednesday, but the bill included significant changes to the military’s judicial code in order to deal with sexual assault.

The panel included legislation from Reps. Mike Turner (R-Ohio) and Niki Tsongas (D-Mass.) that stripped military commanders’ authority to overturn guilty verdicts in a post-trial review.

The adopted measure also established a mandatory sentence of dismissal or dishonorable discharge for service members who were convicted of sexual assault.

There has been a major push in Congress to tackle sexual assault in the military on the heels of a Pentagon report estimating 26,000 assaults last year, up from 19,000 in 2010.

The committee’s markup did not address proposals to make larger changes to the military’s judicial code, including one from Rep. Speier to take sexual assault cases outside the chain of command.

Speier told The Hill on Wednesday that she was working with Republicans to get a vote on her amendment when the authorization bill goes to the floor next week.

Top military leaders expressed uniform opposition to taking cases outside the chain of command during a Senate hearing on Tuesday.

The most partisan debate in the committee Wednesday focused on missile defense issues, in particular Republican plans to build a new East Coast missile site by 2018.

The committee passed an amendment from Turner to direct $140 million for construction on the site by a 33-27 vote.

“It is imperative that we move quickly to ensure that our missile-defense system is expanded and it is completed,” Turner said.

Democrats argued that the technology is not yet ready and it would be premature to begin building a new site.

“This is too much money, too early to be helpful to the security of the American people,” said Cooper, the ranking member of the Strategic Forces subcommittee.

The Republicans on the panel also tangled with Democrats over the Pentagon’s biofuels program, a frequent target of GOP lawmakers for cuts.

The committee also blocked the retirement of a number of weapons programs the Pentagon wanted to shed, including the Global Hawk Block 30 drone, seven cruisers and two amphibious warships.

Democrats also expressed concerns about the potential for discrimination against gay service members after the committee expanded a “conscience clause” for military chaplains in last year’s bill. The amendment from Rep. John Fleming (R-La.) stated that the military must accommodate service members’ actions and speech, in addition to the beliefs covered by the current language.

The Senate Armed Services Committee will be marking up its version of the Defense authorization bill in closed session next week.

— Carlo Muñoz contributed.


Pentagon officials ask Congress to shift $9.6B

The Hill

By Zack Colman – 05/18/13 10:53 AM ET


The Pentagon wants Congress to shift $9.6 billion within its fiscal 2013 budget toward expenses for the Afghanistan war, transportation and other items.

Moving $1.3 billion to the Army “to support funding shortfalls” in Afghanistan is the largest request, Bloomberg reports, citing budget documents. The Pentagon also wants to funnel $1 billion into the transportation budget to pay for higher-than-expected fuel costs.

The money would mostly come from smaller research programs and weapons-buying accounts, according to Bloomberg.

The Pentagon sent the request to Congress along with its $79.4 billion war-fighting budget proposal. The House Armed Services Committee will begin consideration of the defense bill next week.

The Defense Department is in the process of scaling back spending, as sequestration will shave $500 billion from its budget over the next 10 years.

The Pentagon said failing to honor its request “runs the risk of an interruption on the flow of supplies, subsistence and mail to deployed warfighters” in Afghanistan.

The Army also needs $770 million to pay for 8,400 more officers, the Pentagon said in its request.


Sixty Percent of Adults Can’t Digest Milk


By Elizabeth Weise, USA TODAY

June 6, 2013


Got milk? If you do, take a moment to ponder the true oddness of being able to drink milk after you’re a baby.

No other species but humans can. And most humans can’t either.

The long lists of food allergies some people claim to have can make it seem as if they’re just finicky eaters trying to rationalize likes and dislikes. Not so. Eggs, peanuts, tree nuts, fish, shellfish, soy and gluten all can wreak havoc on the immune system of allergic individuals, even causing a deadly reaction called anaphylaxis.

But those allergic reactions are relatively rare, affecting an estimated 4% of adults.

Milk’s different.

First off, most people who have bad reactions to milk aren’t actually allergic to it, in that it’s not their immune system that’s responding to the milk.

Instead, people who are lactose intolerant can’t digest the main sugar — lactose — found in milk. In most humans, the enzyme that digests it — lactase — stops being produced when a person is between two and five years old. The undigested sugars end up in the colon, where they begin to ferment, producing gas that can cause cramping, bloating, nausea, flatulence and diarrhea.

If you’re American or European it’s hard to realize this, but being able to digest milk as an adult is one weird genetic adaptation.

It’s not normal. Somewhat less than 40% of people in the world retain the ability to digest lactose after childhood. The numbers are often given as close to 0% of Native Americans, 5% of Asians, 25% of African and Caribbean peoples, 50% of Mediterranean peoples and 90% of northern Europeans. Sweden has one of the world’s highest percentages of lactase-persistent people.

Being able to digest milk is so strange that scientists say we shouldn’t really call lactose intolerance a disease, because that presumes it’s abnormal. Instead, they call it lactase persistence, indicating what’s really weird is the ability to continue to drink milk.

There’s been a lot of research over the past decade looking at the genetic mutation that allows this subset of humanity to stay milk drinkers into adulthood.

A long-held theory was that the mutation showed up first in Northern Europe, where people got less vitamin D from the sun and therefore did better if they could also get the crucial hormone (it’s not really a vitamin at all) from milk.

But now a group at University College London has shown that the mutation actually appeared about 7,500 years ago in dairy farmers who lived in a region between the central Balkans and central Europe, in what was known as the Funnel Beaker culture.

The paper was published this week in PLoS Computational Biology.

The researchers used a computer to model the spread of lactase persistence, dairy farming, other food gathering practices and genes in Europe.

Today, the highest proportion of people with lactase persistence live in Northwest Europe, especially the Netherlands, Ireland and Scandinavia. But the computer model suggests that dairy farmers carrying this gene variant probably originated in central Europe and then spread more widely and rapidly than non-dairying groups.

Author Mark Thomas of University College London’s Department of Genetics, Evolution and Environment says: “In Europe, a single genetic change…is strongly associated with lactase persistence and appears to have given people with it a big survival advantage.”

The European mutation is different from several lactase persistence genes associated with small populations of African peoples who historically have been cattle herders.

Researchers at the University of Maryland identified one such mutation among Nilo-Saharan-speaking peoples in Kenya and Tanzania. That mutation seems to have arisen between 2,700 and 6,800 years ago. Two other mutations have been found among the Beja people of northeastern Sudan and tribes of the same language family in northern Kenya.


How Gov’t Student Loans Ruined College Education

By LIZ PEEK, The Fiscal Times

June 5, 2013

President Obama and Congress are squabbling again – this time over the rates charged on federal college loans.


Surrounded by students nicely turned out in suits and dresses, looking more like the Mormon Youth Chorus than today’s undergraduates, Mr. Obama recently chastised Congress for not yet blocking a doubling of rates for new Stafford loans set to occur on July 1.

As the president well knows, the House has already passed a bill preventing the hike and tying new loan terms to market levels. The president’s solution is similar, but would lock in rates for the duration of the loan. The spat is like bickering over menu choices on the Titanic.

Between 2000 and 2010, the number of students enrolling in degree-conferring institutions increased 34 percent. The portion receiving federal aid skyrocketed from 31.6 percent to 47.8 percent, and the average award nearly doubled. In addition, the percentage taking out student loans climbed from 40.1 percent to 50.1 percent, and the average borrowing rose 76 percent.

The ramp-up in loans to students has not only driven up costs but has undermined the value of a college degree. Some 30 percent of people ages 25 to 29 are college graduates today, up from 12 percent in the 1970s. That is a notable achievement, unless the degrees awarded do not satisfy the needs of the job market. Richard Vedder, economics professor at Ohio University, has written that we have one million retail sales clerks and 115,000 janitors with college diplomas. At the same time, one fifth of the country’s managers say they can’t find skilled workers to fill job openings. Something is not right.

Rising student debt is a menace — not just to the families involved but also to the economic recovery. As with housing, the government’s well-intentioned effort to make advanced education available to all has led to crippling borrowing by millions of Americans. As with housing policy, it is time for a clear-headed review of how we can promote sensible spending on advanced education. That might start with challenging why Congress – or the president – should be responsible for fixing student loan rates in the first place.

Many of the problems challenging our higher education system could be resolved, or might have been prevented, by allowing greater input from the marketplace. The government’s 2010 takeover of student lending, and the prior 45 years of subsidized student borrowing, threw rational credit analysis out the window. A family’s earnings and debt profile were reviewed, but the applicant’s potential earning power was not part of the equation.

Lenders with skin in the game might have analyzed the income prospects of young people in different fields of study, and channeled more money to pre-med or software design than to philosophy or journalism majors, for instance. President Obama has long bemoaned our shortage of STEM grads. In a more rational world, higher income in engineering and tech would have attracted more students (and lenders) to those fields.

Many will argue that seeking a college degree is not just about financial returns. Lawmakers, however, have long used economics to persuade taxpayers to underwrite our colleges and universities. We are told we cannot compete in a global economy without a highly educated workforce. That is true, but it has become clear that not all training is the same.

To date, lawmakers concerned about rising student debt have focused scrutiny on for-profit schools, which have been plagued by high costs and failure rates. They should be looking at the whole system.


While household debt has been declining gradually during the recession, student borrowings have increased – rising another $20 billion in the latest quarter. Some 43 percent of twenty-five-year-olds owe student debt today, up from 25 percent in 2003. Between 2003 and 2012, the average student loan balance increased 91 percent, from $10,649 to $20,326.

The rise in student debt — now at $986 billion, triple the level outstanding in 2004 — has held back the recovery, especially in the housing and auto markets. Though student loans make up only 9 percent of total household borrowings, they are mainly held by young people, who are essential to new household formation. The average age of the first-time home buyer is 30, and most borrow to finance their purchase.

A study by the New York Fed shows that historically, most first-time home buyers have student debt – they are typically higher up the income scale and better able to finance a home. However, during the recovery, home buying by people holding student debt took a nosedive. Similarly, people saddled with student debt were less likely to borrow to buy a car. In short, rising student debt pushed out other consumer loans.

Partly, this trend is explained by a tightening of credit standards. The Fed study shows that while in the past credit scores were not much affected by student borrowings, during the recession that changed. By 2012, the average credit score for a 25-year-old without student debt was 15 points higher than that of a counterpart who took out college loans; for those aged 30, the gap was 24 points. The Fed study concludes that while “highly skilled young workers” have always been a boon to the economy, “unprecedented student debt may dampen their influence in today’s marketplace.”

Families are wising up to the “value proposition” of a college degree, checking out websites that compare future earnings and real costs. The Obama administration has also acknowledged the need for more discriminating choices, while still pushing broader college enrollment.

The biggest reform of all, and the only one that might restore sanity and discipline to our higher education system, is to return student lending to the private sector – this time without the carte blanche of federal guarantees. To give low-income families a shot at attending college, direct grants awarded by scholastic merit would be, in the end, a less distorting approach.

Most likely, Congress will extend today’s Stafford loan rate for another year, and that pile of cans down the road will continue to build.

President Obama and Congress are squabbling again – this time over the rates charged on federal college loans.

Surrounded by students nicely turned out in suits and dresses, looking more like the Mormon Youth Chorus than today’s undergraduates, Mr. Obama recently chastised Congress for not yet blocking a doubling of rates for new Stafford loans set to occur on July 1.

As the president well knows, the House has already passed a bill preventing the hike and tying new loan terms to market levels. The president’s solution is similar, but would lock in rates for the duration of the loan. The spat is like bickering over menu choices on the Titanic.

Between 2000 and 2010, the number of students enrolling in degree-conferring institutions increased 34 percent. The portion receiving federal aid skyrocketed from 31.6 percent to 47.8 percent, and the average award nearly doubled. In addition, the percentage taking out student loans climbed from 40.1 percent to 50.1 percent, and the average borrowing rose 76 percent.

The ramp-up in loans to students has not only driven up costs but has undermined the value of a college degree. Some 30 percent of people ages 25 to 29 are college graduates today, up from 12 percent in the 1970s. That is a notable achievement, unless the degrees awarded do not satisfy the needs of the job market. Richard Vedder, economics professor at Ohio University, has written that we have one million retail sales clerks and 115,000 janitors with college diplomas. At the same time, one fifth of the country’s managers say they can’t find skilled workers to fill job openings. Something is not right.

Rising student debt is a menace–not just to the families involved but also to the economic recovery. As with housing, the government’s well-intentioned effort to make advanced education available to all has led to crippling borrowing by millions of Americans. As with housing policy, it is time for a clear-headed review of how we can promote sensible spending on advanced education. That might start with challenging why Congress – or the president – should be responsible for fixing student loan rates in the first place.

Many of the problems challenging our higher education system could be resolved, or might have been prevented, by allowing greater input from the marketplace. The government’s 2010 take-over of student lending and prior 45 years of subsidizing student borrowings threw rational credit analysis out the window. A family’s earnings and debt profile were reviewed, but the applicant’s potential earning power was not part of the equation.

Lenders with skin in the game might have analyzed the income prospects of young people in different fields of study, and channeled more money to pre-med or software design than to philosophy or journalism majors, for instance. President Obama has long bemoaned our shortage of STEM grads. In a more rational world, higher income in engineering and tech would have attracted more students (and lenders) to those fields.

Many will argue that seeking a college degree is not just about financial returns. Lawmakers, however, have long used economics to persuade taxpayers to underwrite our colleges and universities. We are told we cannot compete in a global economy without a highly educated workforce. That is true, but it has become clear that not all training is the same.

To date, lawmakers concerned about rising student debt have focused scrutiny on for-profit schools, which have been plagued by high costs and failure rates. They should be looking at the whole system.

While household debt has been declining gradually during the recession, student borrowings have increased – rising another $20 billion in the latest quarter. Some 43 percent of twenty-five-year-olds owe student debt today, up from 25 percent in 2003. Between 2003 and 2012, the average student loan balance increased 91 percent, from $10,649 to $20,326.
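The growth figure in the paragraph above can be checked with simple percent-change arithmetic; here is a minimal sketch in Python using only the balances quoted in the article:

```python
def pct_increase(old, new):
    """Percent increase from old to new."""
    return (new - old) / old * 100

# Average student loan balance, 2003 vs. 2012 (figures from the article)
growth = pct_increase(10649, 20326)
print(round(growth))  # -> 91, matching the article's "91 percent"
```

The same formula confirms the other growth rates cited: a rise from 25 percent to 43 percent of twenty-five-year-olds owing student debt is an increase of roughly 72 percent in the share of borrowers.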

The rise in student debt–now at $986 billion, triple the level outstanding in 2004–has held back the recovery, and especially the housing and auto markets. Though student loans make up only 9 percent of total household borrowings, they are mainly held by young people–who are essential to new household formations. The average age of the first-time home buyer is 30, and most borrow to finance their purchase.

A study by the New York Fed shows that historically, most first-time home buyers have student debt – they are typically higher up the income scale and better able to finance a home. However, during the recovery, home buying by people holding student debt took a nosedive. Similarly, people saddled with student debt were less likely to borrow to buy a car. In short, rising student debt pushed out other consumer loans.

Partly, this trend is explained by a tightening of credit standards. The Fed study shows that while in the past credit scores were not much affected by student borrowing, during the recession that changed. By 2012, the average credit score for a 25-year-old without student debt was 15 points higher than that of a counterpart who took out college loans; for those aged 30, the gap was 24 points. The Fed study concludes that while “highly skilled young workers” have always been a boon to the economy, “unprecedented student debt may dampen their influence in today’s marketplace.”

Families are wising up to the “value proposition” of a college degree, checking out comparison websites to weigh future earnings against real costs. The Obama administration has also acknowledged the need for more discriminating choices, while still pushing broader college enrollment.


The biggest reform of all, and the only one that might restore sanity and discipline to our higher education system, is to return student lending to the private sector – this time without the carte blanche of federal guarantees. To give low-income families a shot at attending college, direct grants awarded by scholastic merit would be, in the end, a less distorting approach.

Most likely, Congress will extend today’s Stafford loan rate for another year, and that pile of cans down the road will continue to build.



Top 10 College Majors

Princeton Review




College offers you many academic freedoms. You can cultivate existing passions and explore new interests–all the while figuring out which major will eventually help you earn a living.

Whatever major you choose, don’t pick what’s easiest–or what your best friend is studying–because you’ll only be cheating yourself out of some great opportunities!

And college is, after all, about opportunities.

The Princeton Review’s Top 10 Majors follow. Be warned, however, that these are not necessarily the degrees that garner the most demand in the job market. More importantly, they don’t lock you into a set career path. Each major offers unique intellectual challenges and develops skill sets that will be applicable to various careers.


1. Business Administration and Management/Commerce

Think you’re a born leader? You’ll need stellar people skills–no room for wallflowers here–and talents in problem solving, number crunching, and decision making. And don’t forget great communication skills! While studying business, you’ll get a thorough grounding in the theories and principles of accounting, finance, marketing, economics, statistics, and human resources functions. You will be a whiz on how to budget, organize, plan, hire, direct, control, and manage various kinds of organizations–from entrepreneurial-type start-ups to multi-million-dollar corporations. This major will also get you thinking about issues such as diversity, ethics, politics, and other dynamics that play a role in every work environment. Make sure those competitive juices are flowing; the business world is all, well, business.

2. Psychology

If you find yourself delving into why certain people react to certain aspects of their environments in a certain way, then studying psychology will help you learn about the biology of our brains. Psychology majors focus on such features of the human mind as learning, cognition, intelligence, motivation, emotion, perception, personality, mental disorders, and the ways in which our individual preferences are inherited from our parents or shaped by our environment. Within the field, psychologists seek to educate, communicate, and resolve many of the problems surrounding human behavior.

3. Nursing

Compassionate individuals with a great mind for the intricate–and sometimes heartbreaking–world of medicine will be well-suited for a nursing career. In the course of evaluating, diagnosing, and treating health problems there is also the chance to work with ever-evolving and ultra-sophisticated technology. Nursing majors take the traditional science and liberal arts courses as first-year students and begin clinical rotations at hospitals and other health care facilities during the second semester of their sophomore year. Certification exams are required after graduation from an accredited nursing program before you can be officially registered. And the job prospects for nurses are not only plentiful but also varied, available in fields such as geriatrics, neurology, oncology, obstetrics, and pediatrics.

4. Biology/Biological Sciences

From microscopic organisms to cloning procedures, biology encompasses pretty much the whole world. Biology majors can study humans, plants, animals, and the environments in which they live, and studies are conducted at the cellular level, the ecosystem level, or anywhere in between. You might find yourself looking to uncover secrets and ways to solve problems, such as finding a cure for a disease. Biology majors might find themselves in med school or in one of many growing fields such as genetics and biotechnology, or working as a veterinarian, optometrist, ecologist, or environmentalist.

5. Education

Patience. Creativity. Dedication. Enthusiasm. Compassion. Education majors tend to have an abundance of all of these traits. In this major, you’ll learn the skills necessary to become an effective and inspirational teacher with the ability to influence young children and teenagers when they are most impressionable. Although much of the coursework will be general education material, most states require you to choose a specific grade level you’d like to teach. When you are done with coursework, you’ll find yourself in the classroom as a student teacher. This practicum lasts from one semester to a full academic year.

6. English Language and Literature

If you find yourself generally immersed in some book–anything from Shakespeare to Hemingway to Jack Kerouac–you will likely find others just like you in the English department studying the trochaic octameter of Edgar Allan Poe’s “The Raven,” the stunning word choices of narrative nonfiction author Annie Dillard, or the experimental elements of the writings of Walter Abish. English programs focus on literature, language, and writing, and an English major will encounter a wide array of absorbing works of fiction, poetry, and nonfiction from around the world and throughout history. Analyzing the works of the greatest minds and imaginations that human civilization has produced will surely sharpen your critical, emotional, creative, and moral faculties. The study of literature also helps to shed some light on the answers to the enduring questions of the human condition. This degree is tremendous preparation for a future in law, journalism, publishing, graduate studies, and just about anything else.

7. Economics

Economics is the study of choices–those of individuals, businesses, governments, and societies and how they choose to spend their time and money and otherwise allocate their resources. And you guessed it: Economics involves heavy doses of critical thinking and math. This study of the production, distribution, and consumption of goods and services is an indispensable tool for making sense of the intricacies of the modern world. It is also an excellent preparation for a future in business, as well as for graduate studies in law, public policy, and international studies.

8. Communications Studies/Speech Communication and Rhetoric

Communications majors tend to be great storytellers with quick wits and fiery personalities. You’ll spend a significant amount of time scrutinizing different kinds of presentations–such as speeches and scripts–and the strategies behind the messages that speakers and writers use to make their points. You’ll learn about verbal and nonverbal messages, audience reaction, and the varied effects of different communication environments. It will prepare you for a wealth of careers in business, advertising, human resources, public relations, government, education, media, and social services.

9. Political Science and Government

Because it often deals with current events and sophisticated statistical analysis, political science is timely, fascinating, and perpetually changing. In a nutshell, it’s the study of the politics of government, and some of the common concentrations are American government, public policy, foreign affairs, political philosophy, and comparative government. Political science majors develop excellent critical thinking and communication skills, and more broadly, an understanding of history and culture. There will be lots of reading, writing, and math. Possible career paths are diverse–from lawyer to politician to journalist.

10. Computer and Information Sciences

Not only will you learn more about computers–hardware and software–but you’ll also learn about the applications of such knowledge, such as how technology fits into a business scenario. You’ll be exposed to areas such as robotics, natural language recognition programs, artificial intelligence, programming languages, numerical analysis, and gaming technology. Problem solving is a major component of CIS, no matter which segment of the industry you want to pursue.



10 Best College Majors for a Lucrative Career

By Caitlin Dewey


Many Millennials grew up hearing that they should study what they love. While that’s a nice sentiment, it’s also landed countless recent grads in quagmires of student debt and unemployment. In today’s tough economic climate, some college majors simply offer better prospects than others—and savvy students should want to know the difference.

That’s why we came up with our list of the ten best college majors for your career. We analyzed the unemployment rates and salaries for graduates of the 100 most popular college majors, using data from Georgetown University’s Center on Education and the Workforce, among other sources.

What did we look for? Fields of study with grads—both recent grads within the past five years and those well into their careers—who enjoy an attractive combination of big paychecks and abundant employment opportunities. The undergraduate programs that we ranked can take from two to five years to complete.

1. Pharmacy and Pharmacology

Unemployment rate: 3.2%
Unemployment rate for recent grads: 5.4%
Median salary: $105,000
Median salary for recent grads: $51,200
Projected job growth for this field, 2010–2020: 25%
A pharmacy major can be a bit of an investment, since most states require students to earn a post-grad degree to work as a pharmacist. However, pharmacologists, who don’t necessarily require graduate training, can land jobs right out of college. Public and private labs hire recent grads to research drugs and drug interactions. But whether pharmacy undergrads go on to med school, research or some related field, they can expect to earn big salaries fast. Pharmacists stand to make six figures working in hospitals and stores. Both programs involve pharmacology, toxicology and ethics classes—and promise plenty of jobs when class lets out. Even among recent grads, unemployment is a low 5.4%.

2. Nursing

Unemployment rate: 2.2%
Unemployment rate for recent grads: 4.0%
Median salary: $60,000
Median salary for recent grads: $48,000
Projected job growth for this field, 2010–2020: 26%
Nursing has always made for a steady, well-paid career, but it’s looking especially good as baby-boomers age. Demand for nurses is way up, so unemployment, even among new grads, is down. Nursing majors study a predictable list of health care subjects including anatomy, physiology and biology. While you can score an R.N. certification after two years, nurses with bachelor’s degrees generally enjoy better salaries, earning potential and advancement opportunities than their less-educated peers.

3. Transportation Sciences and Technology

Unemployment rate: 4.4%
Unemployment rate for recent grads: N/A
Median salary: $68,000
Median salary for recent grads: $53,100
Projected job growth for this field, 2010–2020: 5%
Head in the clouds? Hardly a bad thing. Aeronautics and aviation technology majors can expect to earn more than $50,000 right out of school—and as much as $90,000 midway through their careers. Most study engineering, mechanics and aerodynamics and work for airplane manufacturers. Depending on the program, transportation sciences can also include fields such as materials engineering and automotive-technology management.


4. Treatment Therapy Professions

Unemployment rate: 2.6%
Unemployment rate for recent grads: 5.4%
Median salary: $62,000
Median salary for recent grads: $60,400
Projected job growth for this field, 2010–2020: 33%
While physical therapists typically need a doctoral degree, respiratory, radiation and recreational therapists make the big bucks on a B.A. alone. Recent grads start off at $60,400, one of the highest salaries on our list. Radiation therapists, for example, stand to make $75,000, the national median for that particular therapy field, after a few years. Treatment therapy programs generally include anatomy and physiology courses, as well as chemistry, physics and pharmacology. Regardless of specialty, most therapists work in hospitals or nursing homes.

5. Chemical Engineering


Unemployment rate: 3.8%
Unemployment rate for recent grads: 7.5%
Median salary: $86,000
Median salary for recent grads: $64,500
Projected job growth for this field, 2010–2020: 6%
Chemical engineering majors make more money out of school than any other major on our list. If that’s not enough to send you running for the chem lab, consider the fact that, a few years in, the average chemical engineer will make over $30,000 more than peers in other fields. The work isn’t easy. Chemical engineers study chemistry, physics and biology in school. But after graduation, these engineers stand to enjoy high five-figure salaries in labs and offices and a very low unemployment rate.

6. Electrical Engineering


Unemployment rate: 5.0%
Unemployment rate for recent grads: 7.3%
Median salary: $86,000
Median salary for recent grads: $57,000
Projected job growth for this field, 2010–2020: 6%
Electrical engineering isn’t for the faint of heart or the mathematically challenged. But if you can survive four years of differential equations and circuit theory, you’re on track to make $57,000 a year at your first job. That’s $20,000 more than the median salary for new grads in the top 100 majors. Long-term job growth is modest, with a 10-year projection below the 14% average for all occupations. Still, consistent demand for qualified electrical engineers keeps unemployment in check. Nearly 300,000 electrical engineers design and test components for manufacturers, engineering firms and power plants across the country.


7. Medical Technologies

Unemployment rate: 1.4%
Unemployment rate for recent grads: 5.4%
Median salary: $58,000
Median salary for recent grads: $45,100
Projected job growth for this field, 2010–2020: 13%
Medical technologists are in serious demand—so serious, in fact, that some hospitals try to tempt recent grads with perks such as sign-on bonuses. In addition to the abundance of employment opportunities, medical-technologies majors can look forward to above-average starting salaries. In school, majors study chemistry, biology and clinical laboratory skills; after graduation, they work in hospitals, doctors’ offices and diagnostic labs analyzing patient samples.


8. Construction Services

Unemployment rate: 5.4%
Unemployment rate for recent grads: N/A
Median salary: $65,000
Median salary for recent grads: $50,200
Projected job growth for this field, 2010–2020: 17%
Construction services may seem an odd choice in a down economy, when building projects can grind to a halt. Still, there’s enough demand for general contractors and construction managers to keep unemployment at a tidy 5.4%. Construction-services majors study project scheduling and construction law and go on to oversee projects ranging from office buildings to power plants. The workplace isn’t as glamorous as a swanky office, but new construction services grads make more money than new grads in finance, general engineering and pre-law.

9. Management Information Systems

Unemployment rate: 4.2%
Unemployment rate for recent grads: 7.4%
Median salary: $71,000
Median salary for recent grads: $51,000
Projected job growth for this field, 2010–2020: 18%
Not all computer majors are created equal, contrary to rumor and admissions-office hype. Computer-networking majors, for instance, see 8.2% unemployment and a $37,300 salary upon graduation. But management information systems majors can expect high starting salaries right out of school, and strong job and salary growth after that. The major prepares students to work in IT for big organizations—helping clueless technophobes fix their e-mail, sure, but also building, securing and maintaining a network for an entire company.


10. Medical Assisting Services

Unemployment rate: 2.9% (Average for all grads with a bachelor’s degree: 4.9%)
Unemployment rate for recent grads: 5.4% (Average for top 100 majors: 7.7%)
Median salary: $51,000 (Median for all grads with bachelor’s: $54,756)
Median salary for recent grads: $43,000 (Median for top 100 majors: $37,000)
Projected job growth for this field, 2010–2020: 31% (Average: 14%)
If you don’t mind following doctors’ orders, medical assisting is a pretty sweet deal. The average medical assistant with a two-year associate’s degree will enjoy far better job prospects than most grads and earn nearly as much money as a young B.A.-holder. Medical-assisting majors study office administration and basic clinical skills, such as transcription, coding and lab procedures. They generally work in doctors’ offices, taking patient history, performing basic tests, and tracking insurance and other paperwork as needed.


Is Big Data turning government into ‘Big Brother?’

San Francisco Chronicle     

By MICHAEL LIEDTKE, AP Technology Writer

Updated 3:46 am, Friday, June 7, 2013

SAN FRANCISCO (AP) — With every phone call they make and every Web excursion they take, people are leaving a digital trail of revealing data that can be tracked by profit-seeking companies and terrorist-hunting government officials.

The revelations that the National Security Agency is perusing millions of U.S. customer phone records at Verizon Communications and snooping on the digital communications stored by nine major Internet services illustrate how aggressively personal data is being collected and analyzed.

Verizon is handing over so-called metadata, excerpts from millions of U.S. customer records, to the NSA under an order issued by the secretive Foreign Intelligence Surveillance Court, according to a report in the British newspaper The Guardian. The report was confirmed Thursday by Sen. Dianne Feinstein, D-Calif., who chairs the Senate Intelligence Committee.

Former NSA employee William Binney told the Associated Press that he estimates the agency collects records on 3 billion phone calls each day.

The NSA and FBI appear to be casting an even wider net under a clandestine program code-named “PRISM” that came to light in a story posted late Thursday by The Washington Post. PRISM gives the U.S. government access to email, documents, audio, video, photographs and other data that people entrust to some of the world’s best known companies, according to The Washington Post. The newspaper said it reviewed a confidential roster of companies and services participating in PRISM. The companies included AOL Inc., Apple Inc., Facebook Inc., Google Inc., Microsoft Corp., Yahoo Inc., Skype, YouTube and Paltalk.

In statements, Apple, Facebook, Google, Microsoft and Yahoo said they only provide the government with user data required under the law. (Google runs YouTube and Microsoft owns Skype.) AOL and Paltalk didn’t immediately respond to inquiries from The Associated Press.

The NSA isn’t getting customer names or the content of phone conversations under the Verizon court order, but that doesn’t mean the information can’t be tied to other data coming in through the PRISM program to look into people’s lives, according to experts.

Like pieces of a puzzle, the bits and bytes left behind from citizens’ electronic interactions can be cobbled together to draw conclusions about their habits, friendships and preferences using data-mining formulas and increasingly powerful computers.

It’s all part of a phenomenon known as “Big Data,” a catchphrase increasingly used to describe the science of analyzing the vast amount of information collected through mobile devices, Web browsers and check-out stands. Analysts use powerful computers to detect trends and create digital dossiers about people.

The Obama administration and lawmakers privy to the NSA’s surveillance aren’t saying anything about the collection of the Verizon customers’ records beyond that it’s in the interest of national security. The sweeping court order covers the Verizon records of every mobile and landline phone call from April 25 through July 19, according to The Guardian.

It’s likely the Verizon phone records are being matched with an even broader set of data, said Forrester Research analyst Fatemeh Khatibloo.

“My sense is they are looking for network patterns,” she said. “They are looking for who is connected to whom and whether they can put any timelines together. They are also probably trying to identify locations where people are calling from.”

Under the court order, the Verizon records include the duration of every call and the locations of mobile calls, according to The Guardian.
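The “network pattern” analysis the analyst describes, linking who calls whom and for how long, can be illustrated with a toy sketch. The record format and numbers below are hypothetical, invented purely for illustration; they do not reflect any actual NSA system:

```python
from collections import defaultdict

# Hypothetical call-detail records: (caller, callee, duration in seconds).
# Metadata only: no subscriber names and no call content, mirroring what
# the article says the court order covers.
records = [
    ("555-0101", "555-0202", 120),
    ("555-0101", "555-0303", 30),
    ("555-0202", "555-0303", 300),
    ("555-0101", "555-0202", 60),
]

# Aggregate into an undirected contact graph: which numbers are connected,
# and how much total talk time links each pair.
contact_graph = defaultdict(int)
for caller, callee, duration in records:
    pair = tuple(sorted((caller, callee)))
    contact_graph[pair] += duration

# Rank pairs by total contact time, strongest links first.
for pair, seconds in sorted(contact_graph.items(), key=lambda kv: -kv[1]):
    print(pair, seconds)
```

Even this trivial aggregation shows how bare metadata reveals relationships: the strongest link is simply the pair with the most accumulated talk time, and no call content is ever examined.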


DoD close to approving cyber attack rules

May 28, 2013 – 06:00AM


WASHINGTON — After three years of grueling internal debate, the chairman of the Joint Chiefs is poised to approve new rules empowering commanders to counter direct cyberattacks with offensive efforts of their own — without White House approval.

Once signed, the new cyber rules contained in the US military’s new standing rules of engagement (SROE) — the classified legal document that outlines when, how and with what tools America will respond to an attack — will mark a far more aggressive tack than envisioned when the process started in 2010, or even much more recently. To date, any cyber action requires the approval of the National Security Council (NSC).

A defense spokesman said that much of the focus on cyber has revolved around defensive action, and that pre-emptive offensive action would still require presidential approval.

Sources said the new rules are vital to address a rapidly developing domain that should be integrated into normal military rules, but still remains largely closed to outside observers by heavy layers of classification. Because the SROE is classified, conversations about its composition and details of deliberations are all considered very sensitive, and sources who participated declined to be named.

The new rules were supposed to have been implemented in late 2010, but were delayed as top government lawyers debated how aggressively the US should respond to cyberattacks, and what tools commanders could use, according to current and former White House, defense and intelligence officials.

Now complete, the rules are undergoing a final “internal bureaucratic process,” a defense official said.

Lawyers from the Joint Staff and US Cyber Command (CYBERCOM) gathered in Washington to try to update the Defense Department’s standing rules of engagement in late 2010, with two major policy areas remaining as subjects of debate: rules regarding deployed ships and rules about cyberwarfare.

The cyber discussion resulted in a draft policy that was gerrymandered, larded with legalese, and almost unintelligible because of the many hands from multiple agencies involved in its writing. An interagency process had been started because cyber concerns confront a variety of agencies–the intelligence community and DoD as well as State, Homeland Security and other departments–with each expressing views on how the domain would be treated.

That effort aimed to update rules crafted in 2005 that did not address broader questions regarding cyber, but were in need of updates as cyber threats escalated. Recent reports from the security company Mandiant and from DoD indicate the Chinese cyberattacks began to increase in 2006.

With the SROE process having stalled, three lawyers attending the conference decided to start over, redrafting the language on cyber over a lunch break. Huddled around a table, they created what they thought was a simple, clean approach that could gain broad support. They presented it to the other attendees, and the new version was passed up the chain of command for review by senior officers.

Not long afterward, that draft was rejected by a deputy of Gen. Keith Alexander, head of CYBERCOM and director of the National Security Agency, because it fell short of where “the SecDef wanted it to go,” said a former defense official.

The problem was that the document didn’t allow for a sufficiently assertive response, the official added. In its efforts to achieve balance, the draft didn’t accommodate the strong stance the administration, and specifically CYBERCOM, wanted to take.

So the rules were drafted again, designed to be “forward leaning,” permitting a stronger response. Once again they were rejected.

Nearly three years later the rules still haven’t been signed. Defense officials said they expect the newest version to be formalized shortly, but there is always the possibility that further policy concerns will stall the process.

While several sources pointed to the desire by some, especially Alexander, to take a more assertive stance, not everyone agrees that the delay was caused by internal dissent. A senior defense official said the process was slowed by the administration’s need to develop larger cyber policies to make sure the military rules fit the larger whole.

“As we were developing our standing rules of engagement and going through that interagency process we were recognizing that there’s a natural progression, a natural sequencing of making sure that the presidential policy was finalized and signed out, then making sure that the doctrine and other procedures are in place, and finally the next logical step is the standing rules of engagement,” the senior defense official said.

According to the former defense official with knowledge of earlier drafts, the version on the verge of completion is “way far” from previous versions, authorizing far more assertive action than had been previously considered.

Use of cyber weapons will still be the domain of US Cyber Command, with geographic combatant commanders requesting action through locally stationed cyber support elements. But the debate about the rules of engagement–what authorities they should permit and who should have them–stems from a larger issue about normalizing cyberwarfare, one complicated by the concentration of cyber authority within the NSC, itself the byproduct of an inter-agency dispute dating to the Iraq war.

What the US does as it begins to normalize cyber will have a big effect on how cyber is treated globally, said Jason Healey, director of the Cyber Statecraft Initiative of the Atlantic Council.

“Without a doubt what we do gets copied,” he said. “The fact that we’re including this in rules of engagement and pushing this down to lower levels, [means that] then the military of another country will try to convince its leaders to do the same thing.”


Concentration of Power

In 2003, with the launch of the war in Iraq, cyber capabilities weren’t very advanced compared to some of the elegant tools at the military’s disposal today. But that doesn’t mean that various intelligence and defense agencies weren’t interested in using them.

When the squabbling over who would be in charge of cyber began, President George W. Bush signed a classified presidential directive in 2004 requiring that all cyber decisions be funneled through the NSC.

That prevented any single agency from laying claim. But it didn’t end the disagreements.

“It became an issue with cabinet and deputy cabinet level officials in there hacking it out,” said a former senior intelligence official, describing debates in the White House Situation Room.

In every instance where cyber was involved, the NSC had to be involved. That helped settle some of the disputes between agencies by limiting any independent application of cyber capabilities, but was useful neither for expediting any cyber action nor for integrating cyber into larger military capabilities. Several sources said that this has slowed the integration of cyber into broader military tactics, possibly giving rivals without the same hesitation, like China, a chance to become more adept at military cyber.

Some decisions by the NSC on the use of cyber were easier than others. In an individual theater of combat, such as Afghanistan, their use was more easily authorized if the effects were limited to the region. If anything resembling a cyberattack or intrusion came from the area, a response was also likely authorized.

But when it came to more complicated issues, like international intrusions, the standards got hazy.

Because every decision had to be run through the West Wing, potential political blowback limited the use of cyber tools, the former senior intelligence official said. “If they can’t be used without a discussion in the West Wing, the president’s got no place to run if something goes wrong when he uses them,” he said. Those decisions included what to do if the US confronted a cyberattack.

The rules of engagement review proceeded in 2005 with limited cyber concerns integrated into the final version. Not until 2010 did the larger debate pick up steam.

The rejection of the drafts developed at the end of 2010 by CYBERCOM officials was part of a larger push to increase the authority vested in Alexander, the former senior intelligence official said. “When we had these dialogues with the Fort Meade population, it was often the rest of the intelligence community cautioning the Fort Meade guys not to be so aggressive,” he said. NSA and CYBERCOM are at Fort Meade in Maryland.

Several sources cited these interests as slowing the process and causing a number of compromises to be rejected.

Not everyone agrees that the process has been slowed by dissent or efforts to increase authority by any one group. The senior defense official who described the delays as being the result of larger policy development pointed to the difficulty in crafting a new policy in a new area of warfare.

“It was much less about a turf war than it was about us wanting to make sure that the department’s role was right in defending it, and that the level to which the authority was delegated was appropriate and something with which the secretary and the chairman and the White House was comfortable,” he said. “If this is the first time ever that we’re talking about SROEs that are outside of DoD networks, it should be expected that it’s a very complicated thing. There’s no precedent, there’s no clear understanding on some of the issues.”

A defense spokesman who was asked about Alexander’s role in eliminating earlier versions of the cyber language noted that there were multiple officials involved in the development process.

“The standing rules of engagement are a product of many minds, of which Gen. Alexander is one,” a statement from the spokesman read. “He has worked tirelessly with senior department leadership to develop appropriate SROEs that for the first time will define the legal framework for how the United States would respond if attacked by, through or with the cyber domain.”

To be sure, even when an SROE document is signed, it will not grant low-level military personnel the authority to wage cyberwar. Even the cyber capabilities that might be employed to respond to an attack will require orders from senior officials.

But the document is a move that begins to standardize cyber, folding some areas into more typical military rules and hashing out concerns about how cyber should be treated.

The use of cyber is more a question of political influence in the West Wing, a process that favors those like Alexander who have access to decision-makers. If cyber capabilities become more readily accepted, their implementation could become more democratic, based more on need than on politics.

More importantly, by authorizing immediate action against cyberattacks, the SROE will greatly cut reaction time. Because it eliminates the often laborious process of NSC deliberations, an attack will likely be countered sooner and may result in less damage.

“If you have time to run it through the NSC you don’t really need a standing requirement,” a former defense official said.

With troops and techies, U.S. prepares for cyber warfare

By Warren Strobel and Deborah Charles

WASHINGTON | Fri Jun 7, 2013 3:11am EDT

(Reuters) – On the site of a former military golf course where President Dwight Eisenhower once played, the future of U.S. warfare is rising in the shape of the new $358 million headquarters for the military’s Cyber Command.

The command, based at Fort Meade, Maryland, about 25 miles north of Washington, is rushing to bring between 3,000 and 4,000 new cyber warriors under its wing by late 2015, more than quadrupling its size.

Most of Cyber Command’s new troops will focus on defense, detecting and stopping computer penetrations of military and other critical networks by America’s adversaries like China, Iran or North Korea.

But there is an increasing focus on offense as military commanders beef up plans to execute cyber strikes or switch to attack mode if the nation comes under electronic assault.

“We’re going to train them to the highest standard we can,” Army General Keith Alexander, head of Cyber Command, told the Reuters Cybersecurity Summit last month. “And not just on defense, but on both sides. You’ve got to have that.”

Officials and experts have warned for years that U.S. computer networks are falling prey to espionage, intellectual property theft and disruption from nations such as China and Russia, as well as hackers and criminal groups. President Barack Obama will bring up allegations of Chinese hacking when he meets President Xi Jinping at a summit in California beginning on Friday – charges that Beijing has denied.

The Pentagon has accused China of using cyber espionage to modernize its military and a recent report said Chinese hackers had gained access to the designs of more than two dozen major U.S. weapons systems in recent years. Earlier this year, U.S. computer security company Mandiant said a secretive Chinese military unit was probably behind a series of hacking attacks that had stolen data from 100 U.S. companies.

There is a growing fear that cyber threats will escalate from mainly espionage and disruptive activities to far more catastrophic attacks that destroy or severely degrade military systems, power grids, financial networks and air travel.

Now, the United States is redoubling its preparations to strike back if attacked, and is making cyber warfare an integral part of future military campaigns.

Experts and former officials say the United States is among the best – if not the best – in the world at penetrating adversaries’ computer networks and, if necessary, inserting viruses or other digital weapons.

Washington might say it will only strike back if attacked, but other countries disagree, pointing to the “Stuxnet” virus. Developed jointly by the U.S. government and Israel, current and former U.S. officials told Reuters last year, Stuxnet was highly sophisticated and damaged nuclear enrichment centrifuges at Iran’s Natanz facility.



U.S. government officials frequently discuss America’s cyber vulnerabilities in public. By contrast, details about U.S. offensive cyberwarfare capabilities and operations are almost all classified.

Possible U.S. offensive cyber attacks could range from invading other nations’ command and control networks to disrupting military communications or air defenses – or even putting up decoy radar screens on an enemy’s computers to prevent U.S. aircraft from being detected in its airspace.

The shift toward a greater reliance on offense is an important one for a nation which has mostly been cautious about wading into the uncertain arena of cyberwar – in part because gaps in U.S. cybersecurity make it vulnerable to retaliation.

But former Homeland Security Secretary Michael Chertoff said the United States must be ready and should articulate – soon – what level of cyber aggression would be seen as an act of war, bringing a U.S. response.

“One of the things the military learned, going back to 9/11, is whether you have a doctrine or not, if something really bad happens you’re going to be ordered to do something,” he told the Reuters summit. “So you better have the capability and the plan to execute.”

Reuters has learned that new Pentagon rules of engagement, detailing what actions military commanders can take to defend against cyber attacks, have been finalized after a year of “hard core” debate. The classified rules await Defense Secretary Chuck Hagel’s signature, a senior defense official said.

The official would not give details of the rules but said, “they will cover who has the authority to do specific actions if the nation is attacked.”



At Cyber Command, military officers in crisp uniforms mix with technical experts in T-shirts as the armed forces take up the challenge of fending off cyber penetrations by individuals or rival countries.

In President Barack Obama’s proposed 2014 budget, cyber spending would grow by $800 million, to $4.7 billion, even as overall Pentagon spending is cut by $3.9 billion.

Until its new headquarters is ready, Cyber Command shares a home with the U.S. National Security Agency (NSA), which for 60 years has used technological wizardry to crack foreign codes and eavesdrop on adversaries while blocking others from doing the same to the United States. Alexander heads both agencies.

“The greatest concentration of cyber power on this planet is at the intersection of the Baltimore-Washington Parkway and Maryland Route 32,” said retired General Michael Hayden, a former CIA and NSA director, referring to NSA’s Fort Meade location.

But NSA’s role in helping protect civilian, government and private networks has been controversial – and is likely to come under greater scrutiny with this week’s revelation that it has been collecting telephone records of millions of Verizon Communications customers under a secret court order.

A January report by the Pentagon’s Defense Science Board gave a general picture of how the United States might exploit and then attack an adversary’s computer systems.

In some cases, U.S. intelligence might already have gained access for spying, the report said. From there, Cyber Command “may desire to develop an order of battle plan against that target” and would require deeper access, “down to the terminal or device level in order to support attack plans,” it said.

Because gaining access to an enemy’s computers for sustained periods without detection is not easy, “offensive cyber will always be a fragile capability,” it said.

In cyberspace, reconnaissance of foreign networks is “almost always harder than the attack” itself because the challenging part is finding a way into a network and staying undetected, said Hayden, now with the Chertoff Group consulting firm.



Cyber Command’s new Joint Operations Center, due to be completed in 2018, will pull disparate units together and house 650 personnel, officials said. Air Force, Army, Navy and Marine Corps components will be nearby and, a former U.S. intelligence official said, the complex will have power and cooling to handle its massive computing needs.

Those who have worked at Cyber Command say the atmosphere is a mixture of intensity and geek-style creativity. Military precision is present, but it is not unusual to see young civilian computer whiz kids with purple hair, tie-dyed shirts and blue jeans.

“It’s made to be a fun environment for them. These are people who are invested and want to serve their nation. But there is some military rigor and structure around all that – like a wrapper,” said Doug Steelman, who was director of Network Defense at Cyber Command until 2011 and is now Chief Information Security Officer at Dell SecureWorks.

Cyber Command’s growth and expanding mission come with serious challenges and questions.

For example, how to prevent U.S. military action in cyberspace from also damaging civilian facilities in the target country, such as a hospital that shares an electric grid or computer network with a military base?

And some doubt that the military can train many cyber warriors quickly enough. Alexander has identified that as his biggest challenge.

The former intelligence official said Cyber Command’s new teams won’t be fully ready until at least 2016 due to military bureaucracy and because it takes time to pull together people with the special skills needed.

“To be a good cyber warrior, you have to be thinking, ‘How is the attacker discovering what I’m doing? How are they working around it?’ … Cyber security really is a cat and mouse game,” said Raphael Mudge, a private cybersecurity expert and Air Force reservist. “That kind of thinking can’t be taught. It has to be nurtured. There are too few who can do that.”

Would-be cyber warriors go through extensive training, which can take years. A recruit with proven aptitude will be sent to courses such as the Navy-led Joint Cyber Analysis Course in Pensacola, Florida, a 6-month intensive training program.

The top 10 percent of JCAC’s students will be selected for advanced cyber operations training, said Greg Dixon, a vice president at private KEYW Corp, which conducts intensive training classes.

The company can train a JCAC graduate to become an analyst in five weeks, but it takes 20 weeks to become a cyber operator. Dixon would not divulge what an operator would be capable of doing after graduation, but said it would be “a lot.”

“They’re going to pick the cream of the crop for the ‘full spectrum cyber missions’,” the former U.S. intelligence official said, using a euphemism for cyber offense.

Before a future cyber warrior can begin advanced training, he or she has to pass through the arduous security clearance process, which can take six to nine months for personnel who are not already cleared.

Troops earmarked for cyber warfare have found themselves washing floors, mowing lawns and painting at military installations as they bide time waiting for a clearance.

There is also concern about retaliation for a U.S. cyber attack. Some analysts say Iran increased its cyber capabilities after being infected with Stuxnet, which was revealed in 2010.

“The old saying, he who lives in a glass house should be careful of throwing stones … but if the stone that you threw at someone, when you live in a glass house, is a stone that in some way they could pick back up and throw back at you, that’s an even dumber idea,” the defense official said. “We definitely think about that as one aspect of considering action.”

(Reporting by Warren Strobel and Deborah Charles; Additional reporting by Andrea Shalal-Esa and Phil Stewart; Editing by Alistair Bell and Tim Dobbyn)


Expectations Set Low on Obama-Xi Summit

Presidents of U.S., China Set to Discuss Cybersecurity

By Eric Chabrow, June 6, 2013. Follow Eric @GovInfoSecurity


Only a cockeyed optimist would expect the outcome of this weekend’s summit between President Obama and Chinese President Xi Jinping to be a halt to Chinese cyber-attacks on U.S. computers.

“Expectations on outcome should be low, but even a joint announcement decrying state-sponsored hacking would be a symbolic step in the right direction,” says Michael DuBose, a managing director at risk-mitigation adviser Kroll Advisory Solutions.

Cybersecurity is at the top of the agenda for the June 7-8 summit. What will emerge from two days of meetings isn’t likely to be an acknowledgment from Xi that, indeed, hackers from China have attacked American computer systems to pilfer military and corporate trade secrets. Instead, Obama and Xi will seek to draft an agenda for a newly formed U.S.-Chinese cybersecurity working group that will convene next month.

In addition to cybersecurity, Obama and Xi will address North Korea’s nuclear threat, territorial and maritime disputes in the western Pacific, human rights and bilateral and global economic challenges, all in formal and informal sessions at a 200-acre retreat called Sunnylands in Rancho Mirage, Calif.

A desired outcome of the summit would be the development of an amiable rapport between Obama and Xi, who assumed the presidency just months ago. [The two leaders met in the Oval Office last year when Xi was Chinese vice president.]


Less Scripted, Less Formal, Less Rigid

“Getting to a venue like Sunnylands allows for a more informal set of discussions than we’ve had with China to date in the sense that it’s a less scripted, less formal, less rigid agenda, but rather there is some space for the two leaders to interact and have more open-ended discussions about the issues that underlie the U.S.-China relationship,” a senior administration official said at a briefing on the summit earlier this week.

Some of the administration’s top officials – National Security Adviser Tom Donilon, who announced his resignation this week, Defense Secretary Chuck Hagel and Secretary of State John Kerry – have raised concerns about cyber-attacks originating in China directly with senior Chinese leaders, who have denied such attacks have occurred. In fact, published reports this past week quoted the head of the Chinese equivalent of a computer emergency response team as saying China can document attacks originating from the United States against its own computers.

Still, the evidence is strong that the Chinese government is behind the attacks that have stolen military secrets and intellectual property from American computers [see 6 Types of Data Chinese Hackers Pilfer, DoD Outlines China's Spying on U.S. IT and Chinese Hack Targets Weapons' Designs].


Constructive, Not Confrontational Tone

Because of those digital invasions, Obama will not hesitate to express his displeasure about these cyber-attacks to Xi, but the tone will likely be constructive, not confrontational.

“We will make clear that, frankly, it’s not in anybody’s interest for there to be a situation in which businesses don’t have the confidence that they have certain protections in place such that their intellectual property can’t be compromised, and sensitive data can’t be compromised,” said one of the administration officials, who briefed reporters on condition of anonymity.

Administration officials declined to say whether Obama would threaten Xi with some type of punishment if the Chinese hacking continues. But one of the officials at the briefing pointed out that the United States has successfully won economic cases against China before the World Trade Organization. “We have demonstrated that when we believe that we’re not making progress simply through dialogue, we’re willing to use the measures available to us within the international system to elevate those concerns,” the official said.


Strong Incentive for Dialogue

With the Internet being a crucial component of the global economy, much is at stake if the U.S. and China can’t reach agreement on how to secure it. “The two largest economies in the world have a lot to lose from state-sponsored and profit-motivated hacking, whether it involves trade secrets, key market intelligence or victimization of their citizenry,” Kroll’s DuBose says. “With so much at risk, both sides have strong incentive to come to the table, but trust and accountability will continue to be huge challenges. This summit is a step in the right direction, but expectations should be measured.”

Yet, even if the cybersecurity talks between Obama and Xi are deemed a success, challenges to secure the Internet will remain. “Cyber-conflicts are a global governance issue and not an issue that can be resolved with bilateral talks between two countries,” says Ashar Aziz, founder of FireEye, a provider of IT security wares. “The number of countries and non-state groups with sophisticated offensive capabilities in cyber-space is growing at an alarming rate. So even if one or two countries decide to show restraint, it is difficult to see how that will result in fewer attacks on the U.S. and other countries given the global and highly distributed nature of the problem.”



Bits – Business, Innovation, Technology, Society

Robbing a Gas Station: The Hacker Way



June 6, 2013, 3:07 pm

Thieves of the future will look back on today’s stick-up artists and have a good old belly laugh. Why would anyone ever rob a cashier with a gun, when all that is needed is a smartphone?

Matt Bergin, a security consultant at Core Security, discovered he could hack a cash register remotely, popping it open, by sending two digits from his smartphone to the service running on the cash register’s point-of-sale system. No gun or holdup note was required. He was able to do so through a vulnerability in Xpient, which makes point-of-sale software that runs on cash drawers.

“It was extremely trivial,” Mr. Bergin said in an interview Wednesday. He reverse-engineered Xpient’s point-of-sale system, expecting that to interact with it he would have to crack a password or break through a layer of encryption. To his surprise, he encountered neither. By simply sending a two-digit code from his phone to the point-of-sale system, he discovered he could pop open the cash register remotely.
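The flaw Mr. Bergin described, a network service that acts on a bare command with no password and no encryption, can be illustrated with a toy client/server sketch in Python. Everything here is hypothetical: the two-digit command, the reply strings and the listener itself stand in for the unpublished Xpient protocol and are for illustration only.

```python
import socket
import threading

def handle_one(srv, log):
    """Toy stand-in for an unauthenticated point-of-sale service.
    Accepts one connection and acts on any bare two-digit command,
    with no password check and no encryption (the vulnerability)."""
    conn, _ = srv.accept()
    cmd = conn.recv(16).decode()
    log.append(cmd)
    # No authentication of any kind before acting on the command.
    conn.sendall(b"DRAWER_OPEN" if len(cmd) == 2 and cmd.isdigit() else b"IGNORED")
    conn.close()

# Bind the toy service to an ephemeral localhost port.
srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.bind(("127.0.0.1", 0))
srv.listen(1)
port = srv.getsockname()[1]

log = []
t = threading.Thread(target=handle_one, args=(srv, log))
t.start()

# The "attack": send two digits, as a phone app might.
with socket.create_connection(("127.0.0.1", port)) as s:
    s.sendall(b"42")
    reply = s.recv(16).decode()
t.join()
srv.close()
print(reply)  # DRAWER_OPEN
```

The fixes Xpient shipped map directly onto this sketch: a patch that validates the caller, and a host firewall that keeps remote clients from reaching the listening port at all.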

Christopher Sebes, the chief executive of Xpient, said in an interview Thursday that the company had issued a patch for the vulnerability, which Xpient customers can download to their systems. Mr. Sebes noted that customers who had a Windows firewall switched on would be protected from the hack, regardless of whether they had downloaded the patch. He also noted that someone could just as easily pop open a cash register by physically hitting the “No Sale” button on the register itself.

Increasingly, criminals are finding ways to use digital tactics for physical theft. In February, thieves stole $45 million from thousands of New York City A.T.M.’s in a few hours using a few keystrokes. It was one of the largest heists in New York City history, the authorities said, on par with the 1978 Lufthansa robbery at Kennedy Airport that inspired a scene in the 1990 film “Goodfellas.”


Iowa City could be among first in nation to ban drones

Jun. 6, 2013

Written by Adam B Sullivan

Iowa City Press-Citizen

Iowa City will be among the first municipalities in the country to adopt a ban on drones and other surveillance devices, but city leaders are making clear that they hope the ordinance isn’t permanent.

The Iowa City Council on Tuesday unanimously approved the first reading of an ordinance to ban drones, automated traffic cameras and license-plate readers. That comes after thousands of local residents signed a petition calling for such a ban.

All seven council members supported the ordinance Tuesday night and indicated they’ll do the same when the measure comes up for final approval on June 18. However, council members said they still want to install some of those devices, red-light cameras in particular. They said they’ll wait until state policymakers finish reviewing possible restrictions on the cameras.

Because the ordinance comes by way of public petition, city rules require the ordinance to be kept in place for two years, city clerk Marian Karr said. After that, the council is free to repeal it and move forward with surveillance technology.

“I’m going to have to support it — ‘have to’ is the key word there — because our state has chosen not to move forward with the technology. … I’ll be the first one to bring back red-light cameras as soon as we can,” council member Terry Dickens said.

A growing number of municipalities in Iowa have begun using cameras to bust speeders and other traffic violators. Supporters say the cameras boost traffic safety and provide extra revenue for municipalities. Opponents, however, say they constitute an invasion of privacy and a slippery slope toward more surveillance.

The Iowa Department of Transportation is delaying new cameras on state roadways while officials study possible restrictions. Some state lawmakers have unsuccessfully pushed to ban the devices outright.

One constituent spoke against the camera ban at Tuesday night’s meeting.

“You could get millions of dollars just by enforcing the traffic laws on our streets — it costs you nothing. There are companies that will come in and do it for you,” said Jim Walters, who yelled during his time at the podium and interrupted the meeting after he took his seat.

Drones, meanwhile, have earned growing notoriety in light of the federal government’s drone strikes overseas, but military-style unmanned aircraft aren’t known to be in use in Iowa. That could change eventually as state and local governments elsewhere have started to use unmanned aircraft for domestic operations such as monitoring weather or, in a few cases, helping to locate suspects.

A small group of Iowa Republican legislators this year pushed for a statewide ban on weaponized drones and restrictions on peaceful drones, but the bill failed to gain wide support and Democratic leaders opted not to move it forward.

Ben Stone, executive director of the American Civil Liberties Union of Iowa, which has worked with Iowa City organizers on the ban, said he doesn’t know of any other municipalities with drone bans. Iowa City’s drone ban will be among the first in the nation, he said.

“There are a lot of laws and ordinances against traffic cameras, but in terms of drones, it’s just so early in the proliferation of that technology,” Stone said.

Aleksey Gurtovoy and Martha Hampel — the two organizers who led the petition drive for the ban — were at Tuesday night’s City Council meeting. They said afterward that they weren’t surprised by the council’s plans to move forward with cameras at a later time.

They said the council’s decision to adopt the proposed ban was a political one.

“We had no illusions as far as their intentions,” Gurtovoy said. “It’s definitely in the city’s interest to do this, especially with three of them up for re-election this year.”



As Wars End, a Rush to Grab Dollars Spent on the Border



June 6, 2013


TUCSON — The nation’s largest military contractors, facing federal budget cuts and the withdrawals from two wars, are turning their sights to the Mexican border in the hopes of collecting some of the billions of dollars expected to be spent on tighter security if immigration legislation becomes law.

Half a dozen major military contractors, including Raytheon, Lockheed Martin and General Dynamics, are preparing for an unusual desert showdown here this summer, demonstrating their military-grade radar and long-range camera systems in an effort to secure a Homeland Security Department contract worth as much as $1 billion.

Northrop Grumman, meanwhile, is pitching to Homeland Security officials an automated tracking device — first built for the Pentagon to find roadside bombs in Afghanistan — that could be mounted on aerial drones to find illegal border crossers. And General Atomics, which manufactures the reconnaissance drones, wants to double the size of the fleet under a recently awarded contract worth up to $443 million.

The military-style buildup at the border zone, which started in the Tucson area late in the Bush administration, would become all but mandatory under the bill pending before the Senate. It requires that within six months of enactment, Homeland Security submit a plan to achieve “effective control” and “persistent surveillance” of the entire 1,969-mile land border with Mexico, something never before accomplished.

For military contractors, that could be a real boon. “There are only so many missile systems and Apache attack helicopters you can sell,” said Dennis L. Hoffman, an Arizona State University economics professor who has studied future potential markets for the defense industry. “This push toward border security fits very well with the need to create an ongoing stream of revenue.”

Since 2005, the number of Border Patrol agents has doubled to 21,000, and the stretches protected by pedestrian or vehicle fencing have grown to 651 miles as of last year from 135. But there are still large swaths where people trying to enter the United States illegally have good odds of success, particularly in rural Texas. And with budget cutting in the past two years, money for surveillance equipment along the border has been pared back.

“The main gap in our ability to provide a more secure border at this point is technology,” Mark S. Borkowski, the head of acquisitions for Homeland Security’s Customs and Border Protection, told participants at a border security industry conference in March.

Military contractors have not played a significant role in lobbying for the passage of the immigration legislation, which includes $4.5 billion to bolster border security over the next five years.

But teams of lobbyists, including former Senator Alfonse M. D’Amato, a New York Republican, and Benjamin Abrams, a former top aide to Representative Steny H. Hoyer, a Maryland Democrat and House minority whip, have already been pressing Homeland Security officials and lawmakers on behalf of their clients, efforts that have been backed up with millions of dollars of industry campaign contributions.

Homeland Security would have to decide, in consultation with Congress, how to divide the money — on long-range cameras, radar systems, mobile surveillance equipment, aircraft or lower-tech solutions like more border agents or physical fences — decisions that would determine how various contractors might fare.

“It has been a tough time for the industry: people have been laid off or furloughed,” said James P. Creaghan, a lobbyist who represents a small Texas company, Personal Defense, which is trying to sell more night-vision goggles to Homeland Security. “This could help out.”

Northrop has won some important allies on Capitol Hill, including Senator Thomas R. Carper, Democrat of Delaware, the chairman of the Homeland Security and Governmental Affairs Committee, who is urging the department to invest more in Northrop’s drone-mounted surveillance system, called Vader. General Atomics, which Mr. D’Amato represents, has so much support in Congress that it has pressed Homeland Security in recent years to buy more Predator drones than the department has the personnel to operate, so they often sit unused, according to an agency audit.

The specific requirement in the legislation now before the Senate is that Homeland Security must install surveillance equipment or other measures that would allow it to apprehend or turn back 9 out of 10 people trying to illegally enter across all sectors of the southern land border. The department would be prohibited from moving ahead with the “pathway to citizenship” for immigrants already in the United States until this new security strategy is “substantially operational.”

The bill is scheduled to be taken up for debate on the Senate floor next week, and certain Republicans have already drafted amendments that would make the requirement even more demanding, explicitly mandating that the 90 percent standard be achieved before the pathway to citizenship can proceed.

The Tucson area, for years the busiest crossing point for illegal immigrants, has served as the testing ground for the federal government’s high-technology border effort, although even senior Homeland Security officials acknowledge it got off to a poor start.

Boeing was selected back in 2006, when the last major push by Congress to rewrite the nation’s immigration laws was under way, to create a “virtual fence” that would use radar and video systems to identify and track incursions, information that would then be beamed to regional command centers and border agents in the field.

But the ground radar system at first kept shutting down because of faulty circuit breakers, audits found, while the towers installed for the mounting of radar and advanced long-range cameras swayed too much in the desert winds. Even rainstorms snarled things, creating countless false alerts.

“It should have been pretty simple,” Mr. Borkowski said in a recent speech of the troubled $850 million project. “We weren’t frankly smart enough.”

Critics say the government often is too fixated on high-technology solutions. C. Stewart Verdery Jr., a former Homeland Security official who now runs a lobbying firm, said federal officials should instead focus their limited resources on making it harder for illegal immigrants to work in the United States, an approach that would serve as an effective deterrent.

“Where are you going to get the biggest bang for the buck?” Mr. Verdery said. “Enforcement of the workplace is probably the best area to invest more dollars.”

But the technological solutions still have many advocates in Arizona, where Border Patrol officials contend that the equipment Boeing installed, despite its flaws, has fundamentally changed the cat-and-mouse game that plays out every day.

One recent afternoon, as the temperature in the Arizona desert hovered near 100 degrees, Border Patrol agents stationed inside a command center in Tucson were notified that a ground sensor had gone off. The command center, built under the Boeing contract, resembles the set from the Hollywood movie “Minority Report,” with Border Patrol agents sitting in front of banks of computer terminals and oversize screens that allow them to virtually fly over huge expanses of open desert 70 miles away.

Using his computer, one agent pointed the long-range, heat-seeking camera at the location where the sensor had gone off. Within seconds, black-and-white images of a group of men and women walking rapidly through the desert heat appeared on his screen. “One, two, three, four, five,” the agent called out, counting until he reached 15 people in the group. He also carefully scanned the images to see if any of the people were carrying large sacks, a sign of a possible drug delivery, or had any rifles or other weapons.

The Border Patrol radios lit up as he directed nearby agents on the ground to respond and called for backup from one of Customs and Border Protection’s helicopters based in Tucson.

“What you see today is like night and day compared to what we had,” said Cmdr. Jeffrey Self of the Border Patrol, who oversees the Tucson region. The Boeing system, along with the surge in Border Patrol agents, has resulted in a major drop in attempted illegal crossings, he said, with apprehensions dropping 80 percent since their peak in 2000, considered a sign of a drop in overall traffic.

But the system’s weaknesses are still apparent. The computer terminal crashed while the search was under way, cutting off one agent’s video feed. And on that recent afternoon, no air support was immediately available. The one helicopter nearby that was on duty was running low on fuel, so it did not arrive on the scene until 90 minutes later. Meanwhile, the Border Patrol agents at the Tucson command center lost the border crossers as they dropped into a ditch, taking them out of the line of sight of the camera and radar.

Apparently seeing Border Patrol trucks and the helicopter, the group realized it had been spotted and retreated back south, an agency spokesman said. The 15 were marked down as “turn backs.”

Homeland Security has been preparing for more than a year to expand this system, under a new contract that would rely on proven surveillance technology. That is why the military contractors vying for the job will be asked in coming weeks to demonstrate their gear. The department also wants to identify a mix of equipment — some on fixed towers, others on trucks for mobility — so that officials can tailor uses to the different needs along the border.

Department officials said their choices would be driven by a determination of what the best available tools were for securing the border, not what the defense contractors or their lobbyists were pitching. Customs and Border Protection officials, department spokesman Michael J. Friel said in a statement, are “dedicated to continuing this progress towards a safer, stronger and more secure border.”




Face of Defense: Woman Becomes Air Force’s Chief Scientist

By Senior Airman Carlin Leslie

Secretary of the Air Force Public Affairs

WASHINGTON, June 6, 2013 – The Air Force has appointed the service’s first female chief scientist to lead the way in the technology and science fields.


Dr. Mica Endsley assumed her new duties and responsibilities as the Air Force’s 34th chief scientist June 3 in support of Air Force senior leaders and airmen across the service.

“Having served on the Air Force Scientific Advisory Board for many years, I’ve had the pleasure of working closely with the current and several former Air Force chief scientists,” Endsley said. “I know this is a tremendous opportunity to help the Air Force excel in its goal of maintaining the critical technological edge that gives our airmen a strategic advantage.”

Air Force Chief of Staff Gen. Mark A. Welsh III emphasized the important role Endsley will play in continuing the Air Force’s legacy of innovation.

“I’m pleased to have Dr. Endsley as a part of the Air Force team,” Welsh said. “She follows in the footsteps of many superb minds that have advanced our technological edge and provided much-needed capabilities to our airmen. Although she arrives at a very challenging time, I’m confident she’ll continue a proud legacy of chief scientists who use innovation and strong leadership to keep our Air Force the world’s finest.”

Successfully maintaining the technological edge Welsh mentioned is a key job, Endsley said, and she plans to use every available resource to get the job done effectively and cost-efficiently in support of airmen.

“This involves working with the top scientists and engineers within the Air Force as well as in academia, industry and the other armed services,” she said, noting this will “ensure that the Air Force’s research and development efforts are being directed at the right problems.”

Endsley said she plans to ensure the Air Force continues to develop technologies and systems that will truly support airmen and their missions.

“I know that in many cases, we can dramatically improve our mission effectiveness by using the science of human performance to design technology,” she said.

This, she added, will “better support the way people work.”

As Endsley takes the helm of an office that has made large strides over recent years, she’s motivated to push the envelope even further.

“My goal will be to continue with these efforts, making sure that we are implementing their recommendations and achieving the needed milestones in our science and technology portfolio,” she said. “To stay competitive in the future, we need to make sure that Air Force systems keep up with this rapid pace of change, particularly in computers, cyber and all across the information spectrum.”

Endsley feels that along with the growth of the organization, she has a duty as the first female chief scientist to reach out to the younger generation, speaking on the advantages of a career in science, technology, engineering, and mathematics fields.

“I want to share with the young women I speak to the many advantages of a science, technology, engineering and math career,” she said, adding that many more would be interested “if they knew how very creative and team-oriented engineering work is and how satisfying it is to be able to solve real problems that affect people’s lives.”

Endsley said she is excited to begin looking across the Air Force, ensuring the needs of airmen are understood and met. At that point she can help bring technology to bear in the right ways to solve the problems they face.

“I deeply respect the challenges and sacrifices that all of our airmen, at every level, make daily in service to our nation,” she said. “To be asked to join them and do what I can to support them was simply an opportunity I could not pass up.”



Shutdown Talk Underlines Spending Bill Split

By Niels Lesniewski    

Roll Call Staff

June 6, 2013, 6:21 p.m.


The White House and Speaker John A. Boehner exchanged barbs Thursday over the potential for a shutdown showdown this fall, underscoring the yawning budget gap between the parties that threatens to torpedo this year’s appropriations bills.


The House passed the first fiscal 2014 spending bills this week despite two veto threats, and the Senate is set to mark up funding measures in the coming weeks. But the two chambers are operating off vastly different numbers — given that the House and Senate haven’t come close to reaching a budget deal — setting the stage for another stopgap spending bill this fall and, theoretically, a shutdown fight if the two sides can’t agree.

The House is following a $967 billion spending level that assumes the budget sequester remains in effect. Senate Appropriations Chairwoman Barbara A. Mikulski, D-Md., has said she is moving forward at a $1.058 trillion level that operates on the idea Congress will find a fix for the sequester.

Her GOP counterpart, Sen. Richard C. Shelby of Alabama, said Appropriations Republicans in the Senate want to work off the House number.

“We’ve got a few problems,” an understated Shelby told CQ Roll Call in a brief interview, noting that the committee’s Republicans hadn’t yet decided on a strategy for the markups. “The bottom line is we’re going to stay with the figure, which is the lower figure, the House has agreed with.”

It is just that sort of dispute that seems to have prompted the administration’s unusually broad threat to veto Republican spending bills until there’s a budget agreement.

“In veto threats of two House spending bills — both of which passed with overwhelming support — the White House said the president would not sign any — any — spending bills unless we agree to his demands on a broader budget deal. In short, the president said give him higher taxes and higher spending or we’ll shut down the government,” Boehner said Thursday. “That’s reckless.”

Republican appropriators dismissed the veto threat earlier in the week, but Boehner said it violated his March 1 deal with the president to keep the appropriations bills separate from deficit talks. White House spokeswoman Amy Brundage responded to the Ohio Republican with some snark of her own.

“We were pleased to see Speaker Boehner hold a press conference today to announce the end of the Republican strategy of governing by crisis,” she said. “We look forward to seeing Republicans in Congress act responsibly to pay the bills they have already racked up, along with funding the government to avoid a government shutdown.” But she reiterated that the White House isn’t going to just go along with the Republican budget.

Senate Budget Chairwoman Patty Murray, D-Wash., also responded pointedly to Boehner, renewing her call for a House-Senate conference committee.

“Republicans are refusing to allow us to go to conference for one reason, and that’s because they want to force a manufactured crisis over the debt limit this fall, because they think it will give them leverage,” Murray said. “So let’s be clear about which side is reckless today. Democrats want to get to work right now.”

A number of Senate conservatives have called on Murray to disavow any attempt to raise the debt limit through reconciliation, which has contributed to a regular series of standoffs on the floor.


“That’s always a concern. You’ve got to stay within the numbers, and that’s our concern. We have a difference of about $90 billion between what we think and what the Democrats [think],” Shelby said.

There is a related concern among some lawmakers that the House is moving first on easier-to-pass bills (funding veterans, homeland security and the military), potentially increasing pressure in the months ahead to blow through the caps to pass more contentious domestic spending bills.

“My concern is that we don’t stick to … the budget control agreement that we agreed to. That’s the concern of the House, and there’s some concern that the sequence of bills will ensure — that time-honored practice in the House — the way you sequence them means you bust the budget in the end,” Sen. Jeff Flake said on Wednesday. The Arizona Republican is a frequent critic of the appropriations process.

“That’s always been a concern in marking up appropriation bills. I mean, whichever ones go first tend to get into the money early and then … the later bills get harder and harder to do,” said Sen. John Thune of South Dakota, a member of the Republican leadership.

Still, some appropriators maintained their optimism.

“I am hopeful that we’ll get a significant number of bills passed in the right way. I’m eager to see us get to the point where we’re passing all the bills in the right way,” Sen. Roy Blunt, R-Mo., said. “I hope we make real progress toward that goal this year, and I’m sure that Sen. Mikulski and Sen. Shelby share that.”

Iowa Democratic Sen. Tom Harkin, the longtime chairman of the Senate Labor, Health and Human Services and Education Appropriations Subcommittee who is responsible for what’s among the most difficult bills to pass, said he does not want to get stuck near the end of the line, which has happened in the past. Those departments have frequently operated on stopgap funding in lieu of a full appropriations measure.

“I hope we’re going to do Labor-H sooner rather than later,” Harkin said. “We’re not going to do it last this time.”


New ‘time cloak’ conceals data so well, even its recipients can’t read it

Scientists have discovered how to cloak information so that it appears to have never been sent at all, offering hopes of eventual ultra-secret communications schemes.

The Christian Science Monitor

By Elizabeth Barber, Contributor / June 7, 2013

It sounds like a spectacular failure: this new technology works less than half the time and only for half the parties involved.

But a paper published in Nature today announces that scientists have found a way to unreliably send out communications hidden so well that even their intended recipients can’t detect that they’ve been sent. It is a stunning development in a previously mostly theoretical effort to develop Star Trek-like ultra-secure communication systems.

“The once fanciful invisibility cloak has now assumed a prominent place in scientific research,” according to the paper, authored by researchers at Purdue University.

The possibility that communications could be folded into a ‘time cloak’ was first proposed in a 2010 paper. In that paper, lead author Martin W. McCall proposed a theoretical “space-time cloak” that “conceals events rather than objects,” borrowing light-manipulating technology from prototype invisibility cloaks to hide the event of sending the message.

That differs from current encryption technology, which conceals the information from unwanted readers but not the actual event of sending the information. Ill-intentioned spies, though unable to read the message, can then tell that a deliberately encrypted message was sent out, a telltale sign that the parties have something to hide.

Or as Joseph Lukens, lead author of the latest research paper, told Nature: “It doesn’t just prevent eavesdroppers from reading your data — they wouldn’t even know there was any data there to hack.”

Previous efforts to put McCall’s theory into practice have made some headway: In 2011, a team headed by Alexander Gaeta, an optical physicist at Cornell University in Ithaca, New York, constructed a working time cloak that manipulated laser pulses to hide slow-moving data for extremely brief periods of time.

Now, Purdue scientists have built on that research, discovering a way to cloak data 46 percent of the time at the full speed of common fiber optic networks.

The new technology works by manipulating light behavior so that the waves of light cancel each other out, creating a temporal gap in a light beam that is subsequently closed up. Anything that occurs during that hole in time cannot be detected – unfortunately, not even by the intended recipients.
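The cancellation at the heart of the scheme is ordinary destructive interference. The toy calculation below is only an illustration of that principle, not the Purdue group’s actual optical setup (which opens and closes the gap in time within a single beam rather than summing two beams): superposing a wave with a copy shifted by half a cycle leaves essentially no field, so anything that happens during the cancelled window leaves no trace in the beam.

```python
import math

def field(t, phase=0.0):
    """A unit-amplitude light wave sampled at time t (arbitrary units)."""
    return math.sin(2 * math.pi * t + phase)

# Superpose a wave with a copy shifted by pi (half a cycle):
# sin(x) + sin(x + pi) = 0, i.e. destructive interference.
samples = [field(t / 100) + field(t / 100, phase=math.pi) for t in range(100)]
peak = max(abs(s) for s in samples)
print(f"peak residual field: {peak:.2e}")  # effectively zero (float rounding only)
```

A detector watching the summed field sees nothing during the gap, which is why, in the real device, even the intended recipient cannot tell a message was sent.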

For now, this may be a reasonable method of communication for those of us prone to sending messages we later regret. With the ‘time cloak,’ non-receipt is currently guaranteed.


Rasmussen Reports

What They Told Us: Reviewing Last Week’s Key Polls

June 08, 2013

President Obama may have more than his share of problems these days, but things aren’t getting any better for Congress either.

Only six percent (6%) of Likely U.S. Voters give Congress good or excellent marks for the job it is doing. Just 24% believe their own representative in Congress is the best possible person for the job.

Thirty percent (30%) believe their own representative cares what they think. But only 16% think most members of Congress in general care what their constituents think.

Scott Rasmussen says in his latest weekly newspaper column that Congress’ listening problems are making the odds grow longer for immigration reform. “The so-called Gang of Eight proposal in the Senate legalizes the status of immigrants first and promises to secure the border later,” he explains. “By a 4-1 margin, voters want that order reversed.”

Voters haven’t changed their opinions about the motives behind the Justice Department’s actions. Forty-three percent (43%) still believe it was primarily an attempt to intimidate the media. Thirty-four percent (34%) think the department’s actions were primarily out of concern for national security.

The United States was founded on a belief that governments are created to protect certain unalienable rights. Today, however, more voters than ever (56%) view the federal government as a threat to those rights. That’s up 10 points from 46% in December.

Despite the controversies surrounding the White House, the president’s job approval ratings continue to hold fairly steady. Forty-nine percent (49%) of Likely Voters still consider the president a good or excellent leader. That shows little change from the past couple months but is down from a recent high of 55% measured in late December.

For the month of May, the president’s Total Job Approval Rating fell another point to 49% from 50% in April. That’s a continuing decline from a post-election high of 56% in December and puts his job approval rating back to where it was last September. However, it still remains a bit higher than it was for most of his first term in office.


The president put immigration reform and gun control at the top of his agenda this year, but fewer voters than ever (37%) now give him positive marks for his handling of both issues.

Voters continue to trust Republicans more than Democrats when it comes to handling the economy, the issue they consider most important to their vote. But for the first time since Election Day, the GOP has regained the trust advantage on the majority of major issues regularly tracked by Rasmussen Reports, including immigration and gun control.

When it comes to voter trust, Republicans are far and away the leaders on handling government spending, while Democrats lead by a similar margin on environmental issues.

Republicans and Democrats are now tied on the Generic Congressional Ballot. For the last seven weeks, the gap between the two parties has been two points or less.

As projected by the Rasmussen Employment Index, the latest report on unemployment and job creation represented a modest improvement over earlier months. Twenty-three percent (23%) of workers report their firms are hiring, while 19% report their employer is laying people off. That’s little changed from the previous month, but marks the sixth straight month that reported hirings have outnumbered reported layoffs. Overall, worker confidence in the labor market inched up in May to the highest level measured since October 2007. However, it is still well below the confidence measured during periods of solid economic growth.

But 80% think it will be at least somewhat difficult for young people to find summer jobs in the current economy. That includes 41% who say it will be Very Difficult.

The Rasmussen Consumer and Investor Indexes, which measure daily confidence among those groups, continue to run at or near their highest levels since before the Wall Street collapse in 2008.

However, just 45% of voters now view the U.S. economy as even somewhat fair. Fifty-one percent (51%) consider the economy to be unfair.

Similarly, 45% think the U.S. economy is at least somewhat fair to people who are willing to work hard. Forty-nine percent (49%) think it’s unfair to these people.

Sixty-eight percent (68%) of voters think an economic system that rewards hard work even if it leads to big differences between rich and poor is fairer than one that guarantees everyone equal income and equal wealth. Just 19% believe it’s fairer to ensure equality of income and wealth.

After all, 90% believe it is important that people who are physically able to work are required to support themselves. Ninety percent (90%) also think it’s important to ensure that everyone who is willing to work hard has a chance to earn a middle class lifestyle, while 89% believe it is important that people who work hard earn more than those who don’t.

Most working Americans (66%) still consider themselves part of the middle class. Those who feel they are among the working poor remain at an all-time low of eight percent (8%).

A plurality (37%) of workers plan on staying with their current company for at least five years, but when they do leave, 79% say it will be their decision.

Most voters believe the WikiLeaks release of classified documents is likely to have hurt U.S. national security, and 52% view Bradley Manning, the U.S. soldier accused of stealing the classified information for posting on the Internet, as a traitor who endangered lives and national security. Only 17% view Manning, whose court-martial began this week, as a heroic whistle-blower and political prisoner.

In other surveys last week:

– Most voters still view the president’s national health care law unfavorably and believe that free-market competition will do more than government regulation to bring health care prices down.

– Thirty-two percent (32%) of voters now say the country is heading in the right direction. That’s nearly identical to a year ago.

– Sixty percent (60%) think it’s at least somewhat likely that the United States could end its dependence on foreign oil by developing shale oil reserves. Fifty-one percent (51%) believe the United States has enough shale oil to become the world’s largest producer of oil and natural gas.

– Facebook has outlined new steps to identify and remove hate speech. But while 22% of Americans think hate speech is a Very Serious problem in the United States today, just 12% of Facebook users regard it as a Very Serious problem on the social networking site.

– Thirty-one percent (31%) favor a ban on hate speech. But 83% believe giving people the right to free speech is more important than making sure no one is offended by what others say.

– Forty-one percent (41%) of Americans are planning a summer vacation this year. Forty percent (40%) say economic conditions have caused them to cut back on the amount they will spend on this year’s vacation, but that’s down from 53% last year and the lowest finding in five years of surveying.

June 1 2013





Iranian Hackers Launching Cyber-Attacks on U.S. Energy Firms: Report

By Brian Prince | Posted 2013-05-27


Iranian hackers have amped up a campaign of cyber-attacks against America’s energy industry, according to a report from The Wall Street Journal.

Citing current and former U.S. officials speaking under the blanket of anonymity, the Journal reported that Iranian hackers accessed control system software that could have allowed them to manipulate oil or gas pipelines. The attacks raise the stakes in cyber-space between the U.S. and Iran, which U.S. officials have accused of being behind a spate of distributed denial-of-service (DDoS) attacks against U.S. banks stretching back to 2012.

“This is representative of stepped-up cyber activity by the Iranian regime. The more they do this, the more our concerns grow,” a source told the Journal. “What they have done so far has certainly been noticed, and they should be cautious.”

Alireza Miryousefi, Iran’s spokesperson at the United Nations, denied any connection between hackers and the regime in an interview with the Journal.

The officials who spoke to The Wall Street Journal did not name any of the energy companies targeted in the attacks. But two former officials said oil and gas companies located along the Canadian border were among those hit.

Word of the attacks comes a week after Charles Edwards, deputy inspector general at the U.S. Department of Homeland Security, told members of a Senate subcommittee that industrial control systems were increasingly coming under attack in cyber-space in ways that could potentially cause “large-scale power outages or man-made environmental disasters.”

Securing these systems is complicated, as many are more interconnected with the Internet than people realize, explained Tom Cross, director of security research at network security vendor Lancope.

“It is also difficult to fix security flaws with these systems because they aren’t designed to be patched and restarted frequently,” he said.

“It is extremely important,” he continued, “that operators of industrial control networks monitor those networks with systems that can identify anomalous activity that might be associated with an attack. Because of the relatively homogenous nature of network activity on many control systems networks, anomaly detection can be a powerful tool in an environment where other kinds of security approaches fall flat.”
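Cross’s point is that because traffic on a control network is so uniform, even a crude statistical baseline catches intruders. A minimal sketch of that idea follows; it is a generic threshold check, not Lancope’s product logic, and the sample traffic numbers are invented for illustration:

```python
import statistics

def build_baseline(samples):
    """Mean and standard deviation of observed normal traffic (e.g. bytes/min)."""
    return statistics.mean(samples), statistics.stdev(samples)

def is_anomalous(value, mean, stdev, k=3.0):
    """Flag readings more than k standard deviations from the baseline.
    On a homogeneous control network, normal traffic varies little,
    so even modest deviations stand out clearly."""
    return abs(value - mean) > k * stdev

# Hypothetical steady SCADA polling traffic, bytes per minute
normal = [1000, 1010, 990, 1005, 995, 1002, 998]
mean, stdev = build_baseline(normal)
print(is_anomalous(1003, mean, stdev))  # typical reading -> False
print(is_anomalous(5000, mean, stdev))  # sudden burst -> True
```

The same check would be far noisier on a general-purpose office network, where legitimate traffic swings widely; the homogeneity of control-system traffic is what makes the approach work there.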

Much of the talk about improving the security of critical infrastructure companies has focused on information sharing between the government and private sector. Improving communication between government and business figured prominently in the executive order on cyber-security that President Barack Obama issued in February. However, many officials and security experts have said that the order does not undo the need for legislation.

“The increases in cyber-assaults on our energy systems from Iranian-backed hackers are another signal to the government and the industry that measures must be taken to fortify the security of our critical infrastructure,” said Lila Kee, chief product and marketing officer at GlobalSign and a North American Energy Standards Board (NAESB) board member.

“However, there is a fine line between cyber-security regulation and voluntary standards,” she said. “Regulations cannot be so rigid so as to prevent protection from today’s evolving advanced persistent threats, and voluntary standards cannot be so loose so as to provide no purpose. In today’s modern world of malware, solutions must be fluid and scalable to battle aggressive cyber-attacks.”


Report: Chinese Hackers Accessed U.S. Weapon Designs

By Chloe Albanesius

May 28, 2013 12:40pm EST

Chinese hackers have accessed designs for U.S. advanced weapons systems, according to a new report from the Washington Post.

That includes everything from an Army system that can shoot down ballistic missiles to the $1.4 trillion F-35 Joint Strike Fighter, which was hacked in the past, the Post said.

The revelation was included in a report from the Defense Science Board, a committee of experts who advise the Defense Department on scientific and technical matters related to defense systems. The board released a public version of its report in January, with sensitive information removed. The Washington Post recently obtained the full version of that report, which included a list of compromised weapons designs.

The board did not come out and say that China stole U.S. designs, but officials told the Post that it speaks to China’s increasing push to access defense-related data from the U.S.

The report is the latest in a string of cyber attacks reportedly perpetrated by Chinese hackers. In a report released earlier this month, the DOD said the People’s Liberation Army was collecting information from U.S. diplomatic, economic, and defense sectors in order to gain insight into how the U.S. views China, among other things. “In 2012, numerous computer systems around the world, including those owned by the U.S. government, continued to be targeted for intrusions, some of which appear to be attributable directly to the Chinese government and military,” the DOD said.

In April, meanwhile, Verizon said in its annual Data Breach Investigations Report that approximately 96 percent of the 2012 cyber-espionage cases traced to China.

A month earlier, President Obama’s national security advisor urged the Chinese to stop hacking U.S. targets and “establish acceptable norms of behavior” for cyberspace. In a speech to The Asia Society, Tom Donilon said cyber-security issues had “become a key point of concern and discussion with China at all levels of our governments.”

Thus far, China has denied any wrongdoing and said accusations of hacking U.S. targets are unprofessional. Officials also accused the U.S. government of doing the same thing to Chinese targets. In conjunction with Israel, the U.S. was said to be behind the spread of the Stuxnet virus in Iran.

According to the Post, President Obama will meet with Chinese President Xi Jinping next month in California, where cyber attacks will be on the agenda.

In addition to defense targets, Chinese hackers have been accused of attacking U.S. media outlets like the New York Times, as well as private companies like Google.

The Defense Science Board report, meanwhile, comes amidst a report from the Australian Broadcasting Corp., which claims that hackers targeted government agencies and major corporations in the country. The “digital trail leads to China,” ABC said, though it’s unclear if the hackers are working for the Chinese government. As noted by USA Today, Australian officials have declined to comment on whether the hacks are linked to China.


Preventing a U.S.-China Cyberwar

NY Times


Published: May 25, 2013


When President Obama and President Xi Jinping of China have their first meeting next month in California, addressing the issue of China’s cyberattacks on American institutions will be an important priority. Both nations need to take steps to avoid drifting into an all-out cyberwar.

Despite Beijing’s denials, there is little doubt that Chinese hackers have taken aim at a range of government and private systems in the United States, including the power grid and telecommunications networks. In February, a report by the computer security firm Mandiant detailed how hackers working for the People’s Liberation Army of China had gained access to data from American companies and government agencies. Earlier this month, a Pentagon report explicitly accused the Chinese military of the attacks.

With the evidence of their activities mounting, Chinese hackers went silent for three months, but they now seem to have resumed their attacks. A report last week by the Commission on the Theft of American Intellectual Property, a private group led by two former Obama administration officials — Dennis Blair, who was the director of national intelligence, and Jon Huntsman Jr., a former ambassador to China — said that hacking costs the American economy more than $300 billion a year and that China was responsible for 70 percent of the theft of corporate intellectual property and trade secrets.

While there are concerns about military-related incursions, most public discussion has focused on hacking into business and industry. The commission’s report spoke of the risk of “stifling innovation” in America and elsewhere if hackers in China are able to steal blueprints and negotiation strategies. The Chinese complain that they, too, have suffered cyberattacks. That could offer some basis for cooperating with Washington on norms of behavior. China recently agreed to an Obama administration proposal to create a working group on cyberissues.

The commission said the American response was “utterly inadequate” and proposed stronger ways to deter Chinese hacking, like possibly allowing companies to retaliate against attackers with their own counterstrikes.

But before adopting punitive measures, the two nations need to try working together. For example, the EastWest Institute, an independent research group, is working with representatives of many governments, including China and the United States, to develop ground rules for protecting the digital infrastructure. The group’s detailed proposal on fighting spam — which carries malware used by hackers — is worth considering by President Obama and President Xi.



China Doesn’t Care if Its ‘Digitalized’ Military Cyberwar Drill Scares You


By Alexander Abad-Santos

May 29, 2013


In the face of fears from President Obama to the Pentagon and across the globe about the increasing military might behind Chinese hacking, China’s state news agency announced Wednesday that the nation’s People’s Liberation Army “will conduct an exercise next month to test new types of combat forces including units using digital technology amid efforts to adjust to informationalized war.” You know, right after Obama and Chinese President Xi Jinping are meeting about the state of, among other things, a cyberwar going on underneath their noses.

The new report from Xinhua news agency adds that the drill, taking place on a large military training field and not some underground hack-a-thon bunker, “will be the first time a PLA exercise has focused on combat forces including digitalized units, special operations forces, army aviation and electronic counter forces.” The terms “digitalized unit” and “electronic counter forces” don’t make it at all clear what China’s military has planned, but then again, no country is ever going to reveal its full cyberwarfare capabilities in detail — and it’s not the first time the Chinese have tested the military reaches of their digital warfare capabilities. Indeed, the U.S. was game to play along on more than one occasion. Last year it was reported that the U.S. and China had secretly engaged in at least two cyber war games in 2011, “designed to help prevent a sudden military escalation between the two superpowers if either felt they were being targeted,” as ZDNet’s Emil Protalinski reported. “In the first, both sides had to describe what they would do if they were attacked by a sophisticated computer virus, such as Stuxnet. In the second, they had to describe their reaction if the attack was known to have been launched from the other side.”


China Is Winning the Cyber War Because They Hacked U.S. Plans for Real War

By Alexander Abad-Santos

May 28, 2013

Ballistic-missile defenses, joint-strike fighters, Black Hawks, and more — Chinese hackers have their hands on plans for these and other of the Pentagon’s most sophisticated weapons systems, the latest sign that the culture of hacking in China continues to put America on the defensive ahead of a meeting between President Obama and Xi Jinping, a summit bound to be tense with cyberwarfare diplomacy.

The Washington Post‘s Ellen Nakashima reports in Tuesday’s paper that Chinese cyberthieves have “compromised” mockups that form the “backbone” of some of the U.S. military’s most important and high-tech defense technology, and that the breach could signal a copycat advancement of China’s arms while aiming to “weaken the U.S. military advantage” down the road. The Chinese government, as usual with these attacks — even when they seem connected directly to the People’s Liberation Army — is distancing itself from the pervasive, and this time very internationally unsound, hacking. “The Defense Science Board, a senior advisory group made up of government and civilian experts, did not accuse the Chinese of stealing the designs. But senior military and industry officials with knowledge of the breaches said the vast majority were part of a widening Chinese campaign of espionage against U.S. defense contractors and government agencies,” the Post reports.

The new breach comes as a newly disclosed part of a classified Defense Science Board report. Back in January, the board released a public version of the report, warning of possible attacks on U.S. defense systems as well as the Defense Department’s lack of preparation and protection. And back in 2005, the same group warned U.S. defense officials against buying microchips from China because of the risk of Trojan horses and spyware — advice the Pentagon eventually took, cutting off its Chinese supply in 2011. But in just the last few months, Chinese hackers have gotten to major U.S. news organizations and government agencies. How have the Pentagon’s own cybersecurity experts been so far ahead of the Pentagon’s actual cybersecurity if China is stealing our war plans — or at least our warplanes? And is there any way to stop it?

Read more on The Atlantic Wire



Government Executive

Nuclear Arsenal Subject to Pentagon Cuts, But New Subs May Escape Ax

By Elaine M. Grossman

May 24, 2013


The U.S. nuclear arsenal might be subject to cutbacks by a major budget review under way at the Defense Department, despite enjoying relative protection this year from largely across-the-board sequester spending reductions, a senior Defense official said on Thursday.

“Every part of the program, including nuclear weapons, is being addressed,” the official said in an interview, referring to the ongoing Strategic Choices and Management Review led by Deputy Defense Secretary Ashton Carter.

The budget scrub is to advise Chuck Hagel, the defense secretary, by late this month on how best to apportion $500 billion in congressionally mandated funding reductions over the next decade. If President Obama can convince lawmakers to repeal the 2011 Budget Control Act, lesser but still-substantial cuts would likely be taken in 2014 and beyond.

The senior official — who requested anonymity in this article to address politically and diplomatically sensitive topics — appeared to suggest, though, that the Pentagon intends to keep ballistic missile-armed submarines relatively safe from the cost-cutting ax.

The big-ticket item coming down the pike for modernizing the Navy’s aging “boomer” submarines and their Trident D-5 ballistic missiles is the estimated $90 billion Ohio-class replacement vessel, also dubbed “SSBN(X).”

“For SSBN(X), I don’t see viable alternatives to going forward with the program,” said the Defense leader, noting the Pentagon had already “made some significant adjustments” to program costs by delaying fielding of the first vessel by two years to 2031. “It’s the most important element — it’s the central element — of our triad.”

That could leave the other two legs of the nuclear delivery arsenal — Air Force bomber aircraft and ICBMs — on the hot seat for reductions.

The service intends to field 80 to 100 new, conventionally armed Long-Range Strike bombers after 2020 that would