
Nine Companies Tied to PRISM, Obama Will Be Smacked With Class-Action Lawsuit Wednesday

AOL, Apple, Facebook, Google, Microsoft, PalTalk, Skype, Yahoo! and YouTube will be named in the suit, attorney says

US News

By Steven Nelson

June 11, 2013


Former Justice Department prosecutor Larry Klayman amended an existing lawsuit against Verizon and a slew of Obama administration officials Monday to make it the first class-action lawsuit in response to the publication of a secret court order instructing Verizon to hand over the phone records of millions of American customers on an “ongoing, daily basis.”

Klayman told U.S. News he will file a second class-action lawsuit Wednesday in the U.S. District Court for the District of Columbia targeting government officials and each of the nine companies listed in a leaked National Security Agency slideshow as participants in the government’s PRISM program.

According to the slideshow, the PRISM program allows government agents direct, real-time access to the servers of nine major tech companies, including AOL, Apple, Facebook, Google, Microsoft, PalTalk, Skype, Yahoo! and YouTube.

U.S. News did not seek comment from the companies, all of which have denied any knowledge of or participation in the PRISM program.

Klayman said he hopes the two lawsuits will be considered jointly as companion cases.

The class-action lawsuit against Verizon says the defendants violated customers’ “reasonable expectation of privacy, free speech and association, right to be free of unreasonable searches and seizures and due process rights.”

“This case challenges the legality of Defendants’ participation and conduct in a secret and illegal government scheme to intercept and analyze vast quantities of domestic telephone communications,” says the lawsuit against Verizon, which also names as defendants President Barack Obama, Attorney General Eric Holder, NSA director Keith Alexander and federal judge Roger Vinson, the FISA court judge who approved the leaked April order.

Klayman told U.S. News the Verizon lawsuit will be served to the named defendants. “Either they will file an answer or move to dismiss” the complaint, Klayman said, “[but] we’re confident the case will proceed.”

Within a few months, Klayman said, the court will likely define the “class” the suit seeks to represent by ruling that “everyone’s in” or by allowing Verizon customers to either opt in or opt out of the class. Currently the suit represents only a Pennsylvania couple.

Sen. Rand Paul, R-Ky., said during a Sunday appearance on Fox News that he intends to file a class-action lawsuit against Verizon and the Internet companies over the surveillance programs, but it’s unclear how exactly he will proceed with the plans.

“I’m going to be asking all the Internet providers and all of the phone companies: Ask your customers to join me in a class action lawsuit,” Paul said Sunday. “If we get 10 million Americans saying we don’t want our phone records looked at, then maybe someone will wake up and something will change in Washington.”

Paul “effectively endorsed our case,” Klayman said, speculating that the senator “must have known about it.” Although a copy of the Verizon lawsuit was sent to Paul’s office, the attorney said he received no response. Klayman founded the pro-transparency legal group Judicial Watch in 1994 and currently leads an organization called Freedom Watch.

Documents exposing the Verizon phone-record collection order – which Sen. Dianne Feinstein, D-Calif., said was a routine reauthorization of a seven-year-old practice – and the NSA’s PRISM program were handed over to the Guardian and Washington Post by former Booz Allen Hamilton employee Edward Snowden. He traveled to Hong Kong before leaking the documents and voluntarily revealed his identity Sunday.



Defending NSA Prism’s Big Data Tools

The more you know about NSA’s Accumulo system and graph analysis, the less likely you are to suspect Prism is a privacy-invading fishing expedition.

By Doug Henschen

Executive Editor, InformationWeek

June 11, 2013 02:26 PM


It’s understandable that democracy-loving citizens everywhere are outraged by the idea that the U.S. Government has back-door access to digital details surrounding email messages, phone conversations, video chats, social networks and more on the servers of mainstream service providers including Microsoft, Google, Yahoo, Facebook, YouTube, Skype and Apple.

But the more you know about the technologies being used by the National Security Agency (NSA), the agency behind the controversial Prism program revealed last week by whistleblower Edward Snowden, the less likely you are to view the project as a ham-fisted effort that’s “trading a cherished American value for an unproven theory,” as one opinion piece contrasted personal privacy with big data analysis.

The centerpiece of the NSA’s data-processing capability is Accumulo, a highly distributed, massively parallel processing key/value store capable of analyzing structured and unstructured data. Accumulo is based on Google’s BigTable data model, but NSA came up with a cell-level security feature that makes it possible to set access controls on individual bits of data. Without that capability, valuable information might remain out of reach to intelligence analysts who would otherwise have to wait for sanitized data sets scrubbed of personally identifiable information.
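The idea behind cell-level security can be illustrated with a minimal Python sketch. This is not Accumulo code; it is a simplified toy model of the visibility-label concept, with all function names, labels and data invented for illustration. In Accumulo proper, each key/value cell carries a visibility expression, and a scan returns only the cells whose expression is satisfied by the reader’s authorizations.

```python
# Toy model of Accumulo-style cell-level security (hypothetical, simplified).
# Each cell carries a visibility expression; a scan returns only cells the
# reader's authorizations satisfy. Real Accumulo expressions support nested
# & (AND) and | (OR); this sketch handles only flat expressions.

def visible(expression: str, authorizations: set) -> bool:
    """Evaluate a flat visibility expression like 'pii&analyst' or 'fouo|public'."""
    if "&" in expression:
        return all(tok in authorizations for tok in expression.split("&"))
    if "|" in expression:
        return any(tok in authorizations for tok in expression.split("|"))
    return expression in authorizations

def scan(cells, authorizations):
    """Return the (key, value) pairs the reader is cleared to see."""
    auths = set(authorizations)
    return [(key, value) for key, value, vis in cells if visible(vis, auths)]

cells = [
    ("record1:phone",    "555-0100",       "pii&analyst"),  # PII: needs both labels
    ("record1:metadata", "3 calls, 2 min", "analyst"),
    ("record2:metadata", "1 call, 40 sec", "analyst"),
]

# An analyst without the 'pii' authorization sees only scrubbed metadata.
print(scan(cells, ["analyst"]))
```

A reader holding both the `analyst` and `pii` authorizations would see all three cells; without `pii`, the phone-number cell is filtered out at scan time rather than requiring a separately sanitized copy of the data.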

As InformationWeek reported last September, the NSA has shared Accumulo with the Apache Software Foundation, and the technology has since been commercialized by Sqrrl, a startup launched by six former NSA employees together with former White House cybersecurity strategy director (and now Sqrrl CEO) Ely Kahn.


“The reason NSA built Accumulo and didn’t go with another open source project, like HBase or Cassandra, is that they needed a platform where they could tag every single piece of data with a security label that dictates how people can access that data and who can access that data,” said Kahn in an interview with InformationWeek.

Having left government employment in 2010, Kahn says he has no knowledge of the Prism program and what information the NSA might be collecting, but he notes that Accumulo makes it possible to interrogate certain details while blocking access to personally identifiable information. This capability is likely among the things James R. Clapper, the U.S. director of National Intelligence, was referring to in a statement on the Prism disclosure that mentioned “numerous safeguards that protect privacy and civil liberties.”


Are They Catching Bad Guys?

So the NSA can investigate data with limits, but what good is partial information? One of Accumulo’s strengths is finding connections among seemingly unrelated information. “By bringing data sets together, [Accumulo] allowed us to see things in the data that we didn’t necessarily see from looking at the data from one point or another,” Dave Hurry, head of NSA’s computer science research section, told InformationWeek last fall. Accumulo gives NSA the ability “to take data and to stretch it in new ways so that you can find out how to associate it with another piece of data and find those threats.”

The power of this capability lies in finding patterns in seemingly innocuous public network data — which is how one might describe the data accessed through the Prism program — patterns that might correlate with, say, a database of known terrorists or data on known cyber warfare initiatives.

Sqrrl has supplemented the Accumulo technology with analytical tools including SQL interfaces, statistical analytics interfaces, text search and graph search engines, and there’s little doubt the NSA has done the same, according to Kahn. Graph search, in particular, is a powerful tool for investigation, as the NSA itself revealed last month when it shared at a Carnegie Mellon technical conference an in-depth presentation on the 4.4-trillion-node graph database it’s running on top of Accumulo.

Nodes are essentially bits of information — phone numbers, numbers called, locations — and the relationships between those nodes are edges. NSA’s graph uncovered 70.4 trillion edges among those 4.4 trillion nodes. That’s truly an ocean of information, but just as Facebook’s graph database can help you track down a long-lost high school classmate within seconds, security-oriented graph databases can quickly spot threats.
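The node-and-edge idea can be sketched in a few lines of Python. This is a toy illustration, not anything from the NSA’s system: the phone numbers, the watchlist identifier and the helper names are all invented. It shows how a breadth-first search over an edge list surfaces an indirect connection between two nodes that share no direct edge.

```python
from collections import deque

# Toy graph: nodes are identifiers (phone numbers, a watchlist entry), edges
# are observed relationships (calls). All data here is invented.
edges = [
    ("555-0100", "555-0111"),
    ("555-0111", "555-0122"),
    ("555-0122", "watchlist-42"),
    ("555-0100", "555-0133"),
]

# Build an undirected adjacency map from the edge list.
adjacency = {}
for a, b in edges:
    adjacency.setdefault(a, set()).add(b)
    adjacency.setdefault(b, set()).add(a)

def hops(start, goal):
    """Breadth-first search: fewest edges between two nodes, or None."""
    seen, queue = {start}, deque([(start, 0)])
    while queue:
        node, dist = queue.popleft()
        if node == goal:
            return dist
        for neighbor in adjacency.get(node, ()):
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append((neighbor, dist + 1))
    return None

# 555-0100 never called the watchlisted number directly, but the graph
# reveals a three-hop chain through two intermediaries.
print(hops("555-0100", "watchlist-42"))
```

The same traversal that lets a social network find a long-lost classmate is what lets a security-oriented graph connect an innocuous-looking node to a known threat several hops away; the engineering challenge at NSA scale is doing this over trillions of nodes and edges rather than a four-edge toy.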

Kahn says a Sqrrl partner company that does graph analysis of internal network activity for security purposes recently identified suspicious activity using a graph algorithm. “Five days later, they got a knock on the door from the FBI letting them know that data was being exfiltrated from their network, likely by a foreign entity,” Kahn reports.

As we’ve reported, graph database technology dates back to the 1950s, but only recently has it advanced to truly big data scale, with Facebook exposing its Graph Search capabilities in January and NSA sharing details of its graph search capabilities last month.

Where prior intelligence techniques have largely been based on knowing patterns and then alerting authorities when those patterns are detected, security and intelligence analysts now rely on big data to provide more powerful capabilities than analytics alone.

“Graph analysis is just one really good technique for finding unknown patterns in data,” Kahn explains.


Do You Trust The Government?

In the end, assurances from Clapper, a former White House employee like Kahn, or even President Barack Obama may do little to assuage the concerns of privacy hawks, critics inside government or large swaths of American citizens. But those who know the technology used by the NSA know that Prism is not a simplistic, “collect first, ask questions later” expedition, and it’s not based on an “unproven theory.”

It’s likely no coincidence that suppliers of data to Prism such as Google have also been able to uncover espionage directed by foreign governments. In fact, a bipartisan Commission on the Theft of American Intellectual Property last month recommended increasing the budget of the FBI and Department of Justice to investigate trade theft, amending U.S. counter-espionage laws, and encouraging U.S. businesses to take the information security threat more seriously.

One government insider informs InformationWeek that he knows with certainty that “semantic and visual analytics tools have prevented multiple acts of terrorism.” That insight predates recent advances in graph analysis that are undoubtedly giving the U.S. Government even more powerful tools. Privacy concerns and the desire for checks on government access to private information must be considered, but we can’t naively turn a blind eye to very real threats by not making the most of advanced big data intelligence tools now at our disposal.


Snowden’s Security Breach: Is He Really the Only One?

By DAVID FRANCIS, The Fiscal Times

June 12, 2013


Three days after National Security Agency leaker Edward Snowden went public, the mystery around him and the information he illegally gave to reporters shows no signs of abating.

Snowden has simply vanished. He checked out of his Hong Kong hotel Monday and no one has seen him since. Whether he’s been nabbed by authorities or is on the run is anyone’s guess.

But the bigger mystery centers on what else Snowden knows. It’s not clear if he has more top-secret documents in his possession on top of the slideshow that revealed the PRISM program and a copy of an order from the Foreign Intelligence Surveillance Court.

Are there other documents that he’s in possession of as well? And if he does have other documents, what do they contain? The information in the slide show he leaked may be so sensitive that The Washington Post decided to run only four of 41 slides. What was on the other 37?

According to reports, despite top-level security clearance, Snowden has documents that he should not have had access to. For instance, administration officials questioned how he could have accessed the FISA court order, a document that should have been off-limits to someone in his position as an NSA contractor. Former NSA inspector general Joel Brenner suggested that Snowden might have used his IT position to give himself administrator access across multiple databases of classified materials.

“So many things are classified and so many things are top secret,” William Hartung, director of the Arms and Security Project at the Center for International Policy, told The Fiscal Times. “It can be some amazingly compromising information.”

This information includes an 18-page order from President Obama ordering the NSA and CIA to draw up a list of cyber warfare targets. Snowden said he leaked the cyber document as well.

Cyber directives and an NSA surveillance PowerPoint are likely the tip of the classified iceberg. Snowden might have had access to nuclear weapons plans or infrastructure vulnerability reports. He might have documents revealing the location of missile sites, or lists of CIA officers. He’s claimed as much, telling the Guardian’s Glenn Greenwald he had “full access to the rosters of everyone working at the NSA, the entire intelligence community, and undercover assets all around the world, the locations of every station we have, what their missions are and so forth.”



Administration and intelligence officials are downplaying these claims, telling the Washington Post that they doubt his claims of widespread access. But if Snowden does have more sensitive information he has yet to leak, he could join other defense contractors who stole documents and changed history.

While working as a contractor in the Netherlands for the British nuclear power company the URENCO Group in the 1970s, Pakistani nuclear scientist A.Q. Khan began stealing designs for nuclear centrifuges. He subsequently sent them to Pakistan, giving the country the ability to build nuclear weapons.

When Dutch intelligence began to investigate, Khan fled to Pakistan and oversaw the construction of Pakistan’s military arsenal. Pakistan subsequently sold nuclear weapons plans to North Korea, Libya and Iran, drastically shifting the balance of world power (the Libyan nuclear weapons program was dismantled in 2003).



Even if Snowden did not have access to nuclear plans, it’s likely that someone like him – a civilian with security clearance – does. There are 500,000 private contractors with top-secret access. When government employees are included, that number swells to 1.4 million with security clearance, according to The Washington Post.

Steven Aftergood, director of the Federation of American Scientists’ Project on Government Secrecy and an expert on security clearance, says that gaining the necessary clearance is not as difficult as one might think.

“It’s a process of vetting either government employees or contractors to ensure that they are trustworthy and loyal to the United States and therefore eligible to have access to classified info,” he told The Fiscal Times. “There are disqualifying factors: heavily in debt, an alcohol or drug problem, or a history of sexual misconduct or criminal behavior. Otherwise, chances are you can get it.”

But as Snowden illustrates, this does not always work. According to reports, the rush to fill security clearance jobs has diluted the hiring and vetting process, making it possible for people to slip through the cracks.



Right now, Snowden has few good options. It’s possible he’s already been captured. It’s possible that he’s on the run. Without a fake passport, he can’t leave China: The electronic dragnet he felt obligated to expose would identify him immediately.

This backs him into a corner. He could ask for asylum at an embassy of a country that does not have an extradition treaty with the United States. He could take refuge at the embassy of Russia, a country that has already offered him asylum and is very interested in the documents he has. He could also approach the Chinese government to try to trade the secrets he has for a safe home there.


Colluding with the Russians or the Chinese would end the debate about his status as a whistleblower. It would make him a spy. But maybe Snowden has a fake identity and is already off the grid.

At this point in the mystery, it wouldn’t be a surprise if the high school dropout were smarter than us all.




Opening Summary — SAC-D (Budget Request)

06/11/2013 02:29 PM CDT

As Delivered by Secretary of Defense Chuck Hagel, Washington D.C., Tuesday, June 11, 2013


Chairman Durbin, Ranking Member Cochran, distinguished members of the subcommittee, thank you for the opportunity to discuss the President’s F.Y. 2014 budget request for the Department of Defense.

I also appreciate this subcommittee’s continued support of our men and women in uniform and our civilian work force and their families. As we discuss numbers, budgets, and strategic priorities, we will not lose sight of these men and women serving across the globe. As you all know, their well-being depends on the decisions we make here in Washington.

The President has requested $526.6 billion for the Department of Defense’s F.Y. 2014 base budget and $79.4 billion for overseas contingency operations (OCO).

My written statement, Mr. Chairman, as you know, contains details on both budget requests. This morning, allow me to, very briefly, focus on three areas before I take your questions:

• first, the continued budget challenges facing the Department in F.Y. 2013 as a result of sequestration, as you have noted as well as Senator Cochran;

• second, the Department’s F.Y. 2014 budget request;

• and third, how the Department is preparing for the future budget uncertainty and the prospect of further reduced resources as a result of sequestration.

As you all know, the Department has been forced to implement deep and abrupt cuts in the current fiscal year because of sequestration. According to the latest guidance from the Office of Management and Budget, the Department must cut $37 billion in spending through the remainder of this fiscal year.

With our internal decision to shift the impact of sequestration away from those serving in harm’s way and force readiness, the cuts fall heavily on DOD’s accounts that train and equip those who will deploy in the future. The Department is also experiencing higher wartime costs than expected. As a result of these factors, the Department is facing a shortfall of more than $30 billion in our operation and maintenance (O&M) account for F.Y. 2013.

To deal with the shortfall, the Department has cut back sharply on facilities maintenance, instituted hiring freezes, cut overhead spending, reduced important, but lower priority programs, directed furloughs of nearly 700,000 civilian employees, and submitted, as you have noted, a $9.6 billion reprogramming request to Congress. And we ask this subcommittee for your assistance in providing rapid review and approval of this critical reprogramming request.

Given the scale of this shortfall, the reprogramming and other steps we’ve taken to cut nonessential spending are not enough. While we have protected spending to sustain the war effort and defend America’s vital strategic interests, the Department’s day-to-day activities will be significantly disrupted for the remainder of this fiscal year.

Each of the military services has begun to significantly reduce training and maintenance of non-deployable operating forces. As you have both noted, you have had the Chiefs before this committee, and they have made some very significant detailed presentations and accounted for these cuts. For example, the Army has stopped rotations at its key combat training centers for all but deploying units. More than a dozen combat coded Air Force squadrons either, already have, or will soon stop flying, and the Navy has curtailed many deployments.

To avoid even more significant reductions to military readiness, I directed furloughs of up to 11 days for most of the Department’s 800,000 civilian personnel. Mr. Chairman, I made this decision very reluctantly, because I recognize the significant hardship this places on civilian personnel across our country and especially on their families. But the current budget environment is requiring difficult decisions and options.

Now, let me turn to F.Y. 2014. The President’s F.Y. 2014 budget continues to implement the $487 billion in spending reductions over the next 10 years agreed to in the Budget Control Act of 2011. If the sequester-related provisions of the Budget Control Act are not changed, F.Y. 2014 funding for national defense programs will be subject to an additional $52 billion reduction in DOD funding. And if there are no changes, continued sequestrations will result in roughly $500 billion in additional reductions to defense spending over the next 10 years.

The President’s F.Y. 2014 budget replaces sequestration and gives the Department the time and flexibility to plan and implement spending reductions wisely and responsibly.

In particular, this budget enables the Department to support troops still at war in Afghanistan, protect readiness, modernize the military’s aging weapons inventory in keeping with the President’s strategic guidance and sustain the high quality of the all-volunteer force that you noted was in General Dempsey’s speech at the National Press Club.

This budget also continues the Department’s approach of the last couple of years of targeting growing costs in support areas like overhead, acquisition and pay and benefits.

Over the next five years, DOD has identified $34 billion in new savings across these categories. This includes weapons program restructuring and terminations that will achieve $8.2 billion in savings, slowdowns in military construction and reductions in other lower priority programs. Our military compensation package preserves DOD’s world-class pay and benefits while putting our military on a more sustainable path to the future. It includes changes to the TRICARE program to bring the beneficiaries’ cost share closer to the levels envisioned when the program was first implemented.

The Department of Defense also must be able to eliminate excess infrastructure. The President’s F.Y. 2014 budget requests authorizations for one round of Base Realignment and Closure (BRAC) in 2015. BRAC, as we all recognize, is an imperfect process. And there are upfront costs. But in the long term, there are significant savings. The previous rounds of BRAC are saving $12 billion annually. We cannot justify continuing to fund unnecessary infrastructure when we are reducing our force structure. Since 2003, DOD has divested more than 100 foreign bases and operations, and we are on schedule to close or consolidate over 20 more overseas operations.


Although there are clearly opportunities to achieve significant savings by improving efficiency, consolidations and reducing overhead, the scale of the current spending reductions will also require cuts and changes to military operations.

The fiscal 2014 budget request seeks to further align budget programs with the priorities of the President’s defense strategic guidance. While continuing to reduce the size of the ground forces and retire aging aircraft and ships, this budget invests in key elements of our defense strategy, including implementing our rebalance to the Asia-Pacific region, maintaining a safe, secure and effective nuclear stockpile, increasing investment in cyber-capabilities and sustaining the growth of special operations forces.

Finally, this budget seeks to preserve a combat-ready force and sustain the high-quality, all-volunteer force.

Last point, the F.Y. 2014 budget reflects DOD’s best efforts to match ends, ways and means during a period of very intense fiscal uncertainty. It is obvious that significant changes to the Department’s top line spending would require changes to this budget plan. Consequently, I directed, as you have noted, a Strategic Choices and Management Review in order to assess the potential impact of further reductions and plan for those continued reductions. I have received the initial internal results of this review and I am currently reviewing those options and those choices.

The Defense Department will continue to find new ways to operate more affordably, efficiently and effectively. However, as I’ve stated, continued cuts on the scale and the timeline of sequestration will require significant reductions in military capabilities and the scope of our activities around the world.

The President’s F.Y. 2014 budget sustains our military strength in an environment of constrained resources, giving DOD the time and the flexibility to make the necessary reductions and adjustments over a ten year period.

Hard choices, Mr. Chairman, will have to be made over these next few years. In the past, many modest reforms to personnel and benefits, along with efforts to reduce infrastructure and restructure acquisition programs were met with fierce political resistance, and they were never implemented. We are now in a different fiscal environment. New realities are forcing us to more fully confront these tough and painful choices. And to make the reforms necessary to put this Department on a path to sustain our military strength for the 21st century and meet these new and complicated threats we will have to do things differently. This will require the continued partnership of Congress.

Thank you.




Springfield to Get Drone Headquarters

Andrew McGinn

Source: Dayton Daily News, Ohio

Created: June 12, 2013


June 12–SPRINGFIELD — An office on Springfield’s eastern edge will serve as the base of operations for Ohio and Indiana’s joint effort to become a test site for unmanned aerial systems, an industry that’s predicted to create thousands of new jobs.

The Ohio/Indiana Unmanned Aerial Systems Center and Test Complex will be housed in leased office space at the Nextedge Applied Research and Technology Park along U.S. 40 in a building owned by Advanced Virtual Engine Test Cell, better known as Avetec.

The Ohio Department of Transportation, which will manage the complex, picked the space from among 15 possible sites, 12 of which were located in Montgomery and Greene counties.

The state controlling board Monday approved ODOT’s two-year lease at a total cost of $70,000 that will run from July 1 until June 30, 2015.

It’s hoped the creation of the test complex will bolster the two-state effort to win one of six UAS test sites being created nationally later this year by the Federal Aviation Administration. The FAA will use the sites to gather safety and privacy data needed before drones are allowed into manned airspace in 2015.

An industry report in March predicted that unmanned aircraft will create more than 2,700 new Ohio jobs by 2025 and generate $2.1 billion in development. That same report by the Association for Unmanned Vehicle Systems International stated the selection of the test sites will help determine where jobs flow.

Two dozen applicants representing 23 states are in the running for the six test sites, said Joe Zeis, executive vice president and chief strategic officer for the Dayton Development Coalition.

Other states have upped their efforts.

North Dakota recently created the Northern Plains UAS Test Site, which will be led by a former Air National Guard wing commander, and put $1 million behind its pursuit of an FAA test-site designation. The governor there last month set aside $4 million more to be spent if the FAA picks his state.


“The stakes are pretty high,” Rob Nichols, press secretary for Ohio Gov. John Kasich, said Tuesday, preferring not to say what, exactly, the test complex here will do for the Ohio-Indiana application.

The 2,060 square feet of leased space would serve not only as a base of operations for the flight testing of unmanned aircraft in Ohio and Indiana but also as a site for data analysis, modeling, simulations and mission planning, according to a news release from state Sen. Chris Widener, R-Springfield.

The not-for-profit Avetec already has modeling and simulation capabilities. The organization strives to reduce the cost and time it might take the military to develop and test jet engines by doing it virtually.

“It’s a coup on one hand, but it more reflects the positive things Springfield is bringing to the table for UAVs in this region,” Tom Franzen, assistant city manager and director of economic development, said.

Even if the FAA doesn’t pick Ohio and Indiana, drone testing will be done in the region, Nichols said, because of a NASA challenge grant for sense-and-avoid technology.

With the presence of Wright-Patterson Air Force Base, the Dayton-Springfield region has long been viewed as the state’s bastion of UAS research and development. The availability of restricted airspace in neighboring Indiana makes for a “wonderfully complementary partnership,” Zeis said.

Aside from a smattering of businesses locally already involved in UAS, Springfield’s other chief contribution to that partnership is the Springfield-Beckley Municipal Airport, where drones will be launched and recovered on their way to and from restricted airspace. Unmanned aircraft also will be flown from the Wilmington Air Park.

Avetec — and Nextedge in general — has been connected via a fiber-optic ring since 2010 to the Springfield airport and to Wright-Patterson, a feature that factored into the site selection.

“That site adds a tremendous amount of value to the state’s efforts, not just ours,” Franzen said.

The selection also renews hope for the 205-acre local technology park that had stagnated so much during the recession that the bank eventually auctioned it last year for a fraction of the nearly $15 million invested in it. That investment included $6.3 million in public funding for the main infrastructure.

Developed in 2004, when “nobody even knew what a UAV was,” Franzen said, Nextedge might finally get to live up to its name by attracting drone developers.

“This will be a nice reminder to folks that we have this amenity in the region,” Franzen said.




IEEE Spectrum

Farming by the Numbers

Precision agriculture brings computing’s accuracy to the ancient art of cultivation

By Ariel Bleicher

Posted 30 May 2013 | 20:44 GMT


One bone-chilling September morning, Clay Mitchell drives his combine harvester through a field of amber Iowa corn, elsewhere known as maize. Beneath him, the monstrous machine reaps and threshes the plants, winnows the grain from the chaff, and then blows the yellow kernels into a holding tank, where they pile up in mounds as lustrous as a leprechaun’s gold.

Thanks to this mechanization as well as to advances in seed genetics and chemical fertilizers, Mitchell and his neighbors can now wrest higher yields than any generation before them. They are some of the most successful corn producers in the United States, which grows about 40 percent of the world’s corn supply. But the big machines that extract this bounty have taken a toll on the land. Decades of tillage and compaction have weakened the soil, making it prone to heavy erosion from wind and rain.

Agronomists estimate that since large-scale farming began here about 150 years ago, Iowa has lost roughly half its topsoil. In the mid-19th century, the average farmer could dig a hole in his field and stick his arm in as deep as the elbow before he felt something other than black, nutrient-rich earth. Today, he’s lucky if he gets as far as his wrist. Recent measurements show that in many parts of the state, topsoil is washing away 10 to 50 times as fast as it can re-form. There is evidence that such rapid degradation is already depressing yields.

What’s happening in Iowa isn’t unique. Studies suggest that every year, one hectare of farmland anywhere on Earth loses on average 30 metric tons of soil—enough to fill a large shipping container. In some places, erosion has degraded farmland so severely that it has been abandoned. As the world’s population rises, farmers are faced with having to grow more crops on less soil. “Globally, we are on the fringe of not being able to meet the food demand that exists today,” says Richard Cruse, a professor of agronomy at Iowa State University.

Yet there may be reason for hope. A new tech-driven trend—some even say revolution—in agriculture is beginning to take hold in Iowa and in many parts of the world. Known as precision agriculture, the practice promises to reconcile mass food production with responsible land stewardship. It is perhaps best summed up by the oft-cited mantra “Doing the right thing, at the right place, at the right time, in the right way.”

The concept harks back to agriculture’s early days, when a farmer, working by hand, could give each plant the exact amount of nutrients, water, weeding, and pest killing it needed without disturbing the soil very much. That era is long gone. But now information and communications technologies are bringing it back. By adopting smart sensors and sophisticated mapping tools, farmers like Mitchell are aiming for the same level of meticulousness as their pick-and-hoe-wielding predecessors—on scales larger than was ever possible before.

Mitchell was one of the first commercial farmers to seriously pursue modern precision farming. Now 40 years old, he has an engineer’s compulsion to continually improve the less-than-perfect. “When it comes to striving for absolute accuracy, Clay’s as dead-on as they get,” says Paul Schrimpf, a group editor at the trade publisher PrecisionAg.

Mitchell grew up on the same land he still farms with his father and uncle in eastern Iowa, where five generations of Mitchells have made their livelihoods since the 1870s. One of his earliest memories is of helping his father build a system of sensors and hardwired circuits for automating the flow of grain through a series of bins that dry and store the harvest. “As soon as I could turn a screwdriver, my dad would give me 10 cents for every relay I would wire up,” Clay recalls. Those old switches have since been replaced with programmable logic. Today, father and son control the bins through their smartphones.

As a young man, Mitchell took a break from farming to study biomedical engineering at Harvard University. But his heart was in Iowa. When he returned in 2000, he immediately began looking for ways to streamline his family’s operation.

The first challenge he took on was fine-tuning the process of planting and fertilizing seeds. Most of his neighbors would simply broadcast fertilizer over an entire field, till it into the ground, and then drop seeds in neat rows. But the Mitchells wanted to avoid both broadcasting, which wastes fertilizer on unseeded rows, and tilling, which intensifies erosion. So they bought a tractor attachment that cut grooves in the soil and injected fertilizer into the grooves. They figured they could then set seeds directly on top of each nutrient band.

They were wrong. They found it impossible to guide the planter along the same path as the fertilizer applicator. “You just couldn’t visually see where you drove before,” Mitchell says. He pondered solutions involving lasers and cameras. Then one day, he read that civil engineers were starting to use specialized navigation equipment to control earthwork machinery. “And I thought, ‘That’s perfect!’ ”

The product he discovered relies on signals from GPS satellites. But it is orders of magnitude more accurate than an off-the-shelf GPS unit. Using a signal-processing scheme known as real-time kinematics, or RTK, it can boost the accuracy of a GPS reading from about 3 meters to within just a few centimeters.

Unlike traditional GPS navigation, RTK requires the installation of a local base station, which Mitchell affixed to his parents’ old ham radio tower. He mounted a second receiver on his tractor’s roof. When he’s using the system, the fixed receiver measures the distances to several satellites using GPS signals, compares those to the real distances calculated from its known position, and then reports the errors to the tractor over a radio link. (It sends an update about every 50 milliseconds.) These correction codes enable the tractor to compute its location with greater certainty.

And RTK can refine this calculation even further. A typical GPS receiver determines distance by internally generating the same pseudorandom code that a satellite transmits. Then, by comparing the time shift between the two identical sequences, it calculates the satellite’s range. But because tens of microseconds pass between each bit, the receiver can resolve timing errors with only limited fidelity. RTK solves this problem by also tracking the waveform that carries the code, which cycles at a much higher frequency. This enables the receiver to estimate the start of each cycle with finer precision. Correction signals from the base station then help the receiver determine the exact number of wavelengths between it and the satellite.
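The differential-correction step can be sketched in a few lines of Python. This is a deliberately simplified, code-phase-only toy with hypothetical pseudoranges and error values, not receiver firmware; real RTK also resolves the carrier-phase cycles described above:

```python
# Toy differential GPS correction: the base station knows its surveyed
# position, so any difference between measured and true satellite ranges
# is error (clocks, atmosphere) shared with a nearby rover.

def base_station_corrections(measured, true_ranges):
    """Per-satellite pseudorange error, in meters, seen at the base."""
    return {sat: measured[sat] - true_ranges[sat] for sat in measured}

def apply_corrections(rover_measured, corrections):
    """Rover subtracts the broadcast errors; shared errors cancel."""
    return {sat: rover_measured[sat] - corrections[sat]
            for sat in rover_measured}

# Hypothetical ranges to three satellites (meters)
true_ranges  = {"G01": 20_300_000.0, "G07": 21_150_000.0, "G12": 22_480_000.0}
shared_error = {"G01": 4.2, "G07": -2.7, "G12": 3.1}  # meters

base_measured  = {s: true_ranges[s] + shared_error[s] for s in true_ranges}
# The rover sees the same shared errors plus its own 2 cm local error
rover_measured = {s: true_ranges[s] + shared_error[s] + 0.02
                  for s in true_ranges}

corrected = apply_corrections(
    rover_measured, base_station_corrections(base_measured, true_ranges))
residuals = {s: corrected[s] - true_ranges[s] for s in corrected}
# Meter-scale shared errors cancel, leaving only the centimeter-scale rest
```

In a real system the corrections stream over the radio link every 50 milliseconds or so, and the rover solves for its position from several such corrected ranges at once.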

Today, RTK and similar systems are most commonly used in agriculture for autosteering, which keeps a vehicle moving autonomously along a straight course. (Drivers must still make turns manually.) Autosteering has become so popular among farmers that tractor manufacturers now build it into all new models. But when Mitchell first adopted RTK, few growers had heard of it. “People out here thought it was absurd,” Mitchell says. “Nobody saw an application for it.”

To Mitchell, RTK was the gateway to ever more sophisticated precision tools. He installed autosteering in his tractor and, with his father’s help, rigged up hydraulic steering and autocontrol in the planter and fertilizer applicator so that the attachments wouldn’t sway off course. He wired up a laptop computer inside the tractor’s cab, and he loaded it with software that maps his fields and keeps track of where he has been and what he has done. For instance, he can use the program to record where he has injected fertilizer and direct the planter to drop seeds in the same spots. The setup is so precise that it allows him to space each seed exactly between last season’s stalks, so that the new shoots won’t struggle to push through the debris.

Of all the Mitchells’ various cost-cutting, yield-boosting inventions, the capstone is their sprayer. Most farmers want to spritz their plants with precisely the amount of pesticides and herbicides that manufacturers recommend. They know that overtreating costs money and can pollute watersheds and damage crops. Undertreating, on the other hand, can breed chemical-resistant superweeds and pests by killing off all but the hardiest invaders. In practice, though, it’s easy to pass over parts of a field twice or to speed through turns, thereby double-dosing some plants while lightly dosing others.

Autosteering helps, but only so much. Farmers still must manually steer their sprayers’ giant winglike booms—the Mitchells’ is 18 meters long—around field boundaries and waterways. These are places where spray rates vary wildly, simply because the boom moves faster on the outside of a turn than on the inside. “The errors are so great because the machinery doesn’t compensate,” Mitchell says.

The Mitchells’ sprayer is a John Deere model from 1998. Over the years, they have beefed it up to do things that even today’s commercial sprayers can’t do. “You can’t just go out and buy this,” Mitchell says. One of the earliest modifications they made was to connect a programmable logic controller to valves that turn the spray nozzles on and off. They linked the controller to a display inside the cab and loaded the computer with digital maps of their fields. Now when Mitchell sprays a field, the RTK-equipped machine automatically clicks off any nozzles that pass over areas he has already sprayed or doesn’t want to hit.

The next challenge was making dose rates consistent along the radius of a turn. First the Mitchells replaced each nozzle with a set of three spouts of different sizes. This way, the machine could toggle among the three rates without having to increase pressure, which could cause droplets to drift. Then they stuck a potentiometer on the machine’s front axle, enabling the control system to calculate the relative speed of each section of the boom and vary its spray rates accordingly.
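The geometry behind that turn compensation is straightforward: a boom section’s ground speed scales with its distance from the center of the turn, so its flow rate must scale the same way to hold the dose constant. A rough sketch with hypothetical boom dimensions and dose targets, not the Mitchells’ actual controller logic:

```python
def section_speeds(axle_speed, turn_radius, offsets):
    """Ground speed (m/s) of each boom section in a turn: speed is
    proportional to the section's radius about the turn center."""
    return [axle_speed * abs(turn_radius + d) / abs(turn_radius)
            for d in offsets]

def section_flows(speeds, dose_l_per_ha, section_width_m):
    """Flow (L/min) each section needs so every square meter gets the
    target dose: flow scales linearly with ground speed."""
    return [dose_l_per_ha * v * section_width_m * 60 / 10_000
            for v in speeds]

# Hypothetical 18 m boom with sections every 3 m, axle moving 2 m/s
# through a turn of 15 m radius
offsets = [-9, -6, -3, 0, 3, 6, 9]     # m from the axle centerline
speeds  = section_speeds(2.0, 15.0, offsets)
flows   = section_flows(speeds, dose_l_per_ha=100.0, section_width_m=3.0)
# The outer tip moves at 1.6x the axle speed, the inner tip at only 0.4x,
# so uncompensated nozzles would under- and over-dose by those same factors.
```

This is why fixed-rate nozzles mis-dose so badly in small fields with many turns: a 4x spread in ground speed across the boom means a 4x spread in dose unless each section’s rate is adjusted individually.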

Yet Mitchell and his father felt the system could be even more exact. Because it relied on RTK readings to determine acceleration, it would switch the spray rate about one second after the sprayer’s speed had actually changed. “There was a lag,” Mitchell says. “So when you’d be speeding up or slowing down rapidly, like you do in small fields, you’d always be putting on the wrong rate.”

The Mitchells solved this problem by feeding data from the driver’s joystick accelerator to the logic controller. Then they programmed the system to anticipate speed changes based on the driver’s commands. So by the time the signals reached the drivetrain, the nozzles had already adjusted. “We’re now getting to the point where every square meter of the field gets the target rate,” Mitchell says.
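The gain from feeding the joystick command forward can be illustrated with a toy simulation. This deliberately simple model assumes the machine reaches each commanded speed immediately — an assumption made for illustration only, not a description of the Mitchells’ controller:

```python
def dose_errors(commands, lag_steps, feedforward):
    """Per-step mismatch between the speed the rate controller assumes
    and the true ground speed (here taken equal to the command)."""
    errors = []
    for t, true_speed in enumerate(commands):
        if feedforward:
            assumed = commands[t]                      # act on the command now
        else:
            assumed = commands[max(0, t - lag_steps)]  # act on a stale estimate
        errors.append(abs(assumed - true_speed))
    return errors

# Ramping from 1 to 3 m/s; the RTK-derived speed trails by 2 steps (~1 s)
cmds = [1.0, 1.5, 2.0, 2.5, 3.0, 3.0]
lagged  = dose_errors(cmds, lag_steps=2, feedforward=False)
leading = dose_errors(cmds, lag_steps=2, feedforward=True)
# The lagged controller mis-doses throughout the ramp;
# the feed-forward one tracks it exactly
```

The point of the sketch is the one Mitchell makes: during rapid speed changes, a controller that reacts to a delayed measurement is always applying yesterday’s rate, while one that reads the command acts before the drivetrain does.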

Many of the precision innovations that the Mitchells have adopted or built themselves are no longer unique. “There was a time it seemed we were dragging everyone along,” Mitchell’s father, Wade, says. “Now we can’t keep up.”

Manufacturers of automated agricultural equipment now offer all sorts of bells and whistles, including various rate controllers and yield monitors. More and more they are taking advantage of wireless technology. John Deere, for example, is developing a suite of products under a strategy it calls FarmSight that let large-scale operators remotely monitor fleets of machines over a cellular network. “The human-technology interface hasn’t yet been mastered in agriculture,” says Tony Thelen, operations director of the company’s Intelligent Solutions Group. “Our efforts are to make data collection and decision support ‘walk-up’ easy.”

Yet worldwide, farmers like the Mitchells are still a minority. And persuading others to embrace precision management, particularly in the developing world, will take more than smarter tools, says Jerry Hatfield, laboratory director of the National Laboratory for Agriculture and the Environment. “I don’t think we’re technologically limited,” he says. “I think we’re human limited—not from an intellectual incapacity but from the will to adopt these things.”

“He is saying precision agriculture is just a theory,” Ezrin Mohd Husin translates for a Malaysian rice farmer who goes by the name Jailani. They stand on the bank of an irrigation canal overlooking Jailani’s paddy fields, where workers are busy “transplanting” little carpets of nursery-grown seedlings into the soaked earth. It is a steamy day in February, and Jailani wears knee-high rubber boots and a brimmed cotton hat. Ezrin, whose collared shirt is embroidered with the words Technology for Precision Agriculture, adds, “He is saying he is a farmer and he knows better than theory.”

Ezrin works for the Smart Farming Technology Research Centre at Universiti Putra Malaysia, a short drive south of the country’s capital of Kuala Lumpur. Its modest staff includes some of the few engineers in the world who have had success bringing precision practices to community farmers. In Malaysia, as in many developing nations, small family farms produce the bulk of the country’s staple crop. But for decades, Malaysian paddy farmers have grown only 70 percent of the rice Malaysians eat. “If grain prices rise globally, or if for any reason Thailand refused to export their rice, we are in trouble,” Ezrin says.

His fears are not merely hypothetical. In January 2008, a sudden spike in food and fuel prices triggered a worldwide cascade of rice hoarding. While exporters such as Egypt, India, and Thailand stockpiled their surpluses, Malaysia and other importers found themselves lacking. Not long after the “great rice crisis” ended that May, the Malaysian government set an ambitious goal to become wholly self-sufficient in rice by 2015.

In five years, though, the country has made virtually no progress toward reducing its dependency on foreign grain. Ezrin and his colleagues believe the reason is simply that the government hasn’t applied the right solution. “If you implement precision farming, it is possible to have full food security,” says Mohd Amin Mohd Soom, the center’s director.

Amin’s approach to precision agriculture is what’s often called variable rate. It differs from the kind of tactics the Mitchells use in that it does not assume every part of a field should be treated the same. Instead, farmers who practice this method take into account variations in soil composition and plant growth to determine exactly how much of something—water or fertilizer, say—each tiny piece needs.

Amin has so far focused on fertility because he believes it can make the biggest impact. Malaysian farmers typically apply chemical nutrients uniformly based on rates the government recommends, he says. His team has been trying to convince them they can get higher yields and save costs if they can figure out which areas should get more fertilizer and which ones should get less.

This isn’t an easy calculation. Farmers first need to be able to measure soil fertility quickly and cheaply, which means they can’t rely on laboratory tests. A simpler solution is to deploy electrical-conductivity sensors. For more than a century, geologists have used these sensors to chart Earth’s interior and to sniff out oil and gold. Around the mid-1990s, growers in the United States and Europe began borrowing the technique to survey their soils.

The most popular mapping system came from a Kansas company called Veris Technologies. It consists of a wheeled frame, which supports an array of platter-size steel disks. As a tractor pulls the cart through a field, the disks penetrate the soil. One pair of disks creates a voltage while a second pair measures how much the current drops as it flows through the ground. By using pairs of disks spaced at different intervals, farmers can gauge conductivity at various depths. And by equipping the system with satellite navigation, they can construct a detailed picture of soil variability—including differences in texture, water-holding capacity, and, to some extent, nutrient contents.
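The underlying measurement is the classic four-electrode (Wenner) sounding from geophysics. The Veris machine’s exact geometry and calibration aren’t described here, so this sketch uses the textbook formula with made-up readings:

```python
import math

def wenner_apparent_conductivity(volts, amps, spacing_m):
    """Apparent soil conductivity (S/m) from a Wenner four-electrode
    array: resistivity rho = 2*pi*a*V/I, and conductivity = 1/rho."""
    resistivity = 2 * math.pi * spacing_m * volts / amps  # ohm-meters
    return 1.0 / resistivity

# Hypothetical reading: 50 mV drop at 10 mA with disks spaced 0.3 m apart
sigma = wenner_apparent_conductivity(0.05, 0.010, 0.3)  # ~0.106 S/m
# Wider disk spacing drives current deeper, which is how the cart
# profiles conductivity at more than one depth from a single pass.
```

Conductivity readings like this don’t measure nutrients directly; they track texture and moisture, which is why the Malaysian team still had to model the relationship between conductivity and fertility each season.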

In 2004, Amin bought Malaysia’s first Veris machine with money from a government grant. The pilot study included only eight farmers, whose paddies surrounded the coastal community known as Tanjung Karang. Amin’s team mapped the electrical conductivity of the farmers’ fields and took several hundred soil samples over many seasons. Each season, the researchers used the results to model the relationship between conductivity and the amount of essential nutrients in the soil. Then they used the model to create color-coded maps showing farmers how best to distribute fertilizer. In some seasons, the model represented a dependence that was statistically significant. In other seasons it didn’t, but they used it anyway. “It’s not 100 percent accurate,” admits Aimrun Wayayok, a soil expert at the research center.

This unreliability is why some farmers, including the Mitchells, are skeptical of variable-rate methods. “Sprinkling a little bit less phosphorus on one part of the field and a little more on another doesn’t have a predictable benefit,” Mitchell argues. “It’s a very small Band-Aid.”


Aimrun disagrees. Electrical conductivity may not be the perfect predictor of soil fertility, he concedes, but it’s a start. And the maps have encouraged farmers to be more conscientious of how much fertilizer they use. In some cases, Aimrun says, they have cut their use by as much as 70 percent. “We are showing them that reducing fertilizer doesn’t necessarily reduce yield and sometimes increases it,” he says.

“Agriculture systems are very complex,” says John Schueller, an expert in precision farming at the University of Florida. “It is impossible to optimize completely. Even in Iowa, where farmers have access to the best technologies, they’re really not hitting absolute perfect conditions. So the question is: How close can you get? Usually doing something is better than doing nothing.”

Today, about 30 farmers in Tanjung Karang are enrolled in the center’s precision program. And in the nine years since the project began, the area has upped production from 4 metric tons per hectare (the national average) to more than 6 metric tons per hectare.

Arguably, a big driver of this success has been broadband access. In 2008, the center oversaw the construction of an Internet café in Tanjung Karang. Here, Amin and his staff teach farmers in the community how to use computers and browse the Web. They’ve also created a custom website where the farmers can access fertility maps for their fields and track their activities throughout the growing season.

On this February day, the café is quiet and airy. Its picture windows frame undulating landscapes of tall green rice grass. A few teenagers sit at terminals Facebooking and playing video games. On a wall, someone has tacked a poster-size copy of a check for 50 000 Malaysian ringgits (about US $16 000). Printed on the bottom in Malay is the declaration “Best Paddy Field Management, Selangor AgroFest 2011.”

Eventually, a farmer named Aziz wanders into the café. He points to the check. “He says he is proud,” Ezrin translates. “He says previously, he protested precision farming. He could get 7 [metric] tons per hectare, which is very good. But then he tried precision farming and now he gets 9 tons—2 tons extra! And with half the fertilizer bags!”



Hands-free tech not safe for driving, study shows

Dayton Daily News

Posted: 12:05 a.m. Thursday, June 13, 2013

By Lou Grieco

Staff Writer


Hands-free technologies for making phone calls or sending texts are not any safer for drivers, whose abilities decrease as distractions increase, according to a study sponsored by the AAA Foundation for Traffic Safety.

“People go into what we like to call ‘tunnel vision’ as they become more distracted,” said Cindy Antrican, public affairs manager for Dayton’s AAA office. “Their brains stopped thinking about what they were doing.”

The study, released Wednesday and done by cognitive distraction experts at the University of Utah, measured brainwaves, eye movement and other metrics to gauge the impact on drivers as they attempted different tasks. The study found that listening to the radio was a minimal risk. Talking on a cellular phone, whether hand-held or hands-free, was rated a moderate risk.


But listening and responding to voice-activated email or texting features was rated an extensive risk. Citing a predicted five-fold increase in infotainment systems in new vehicles by 2018, AAA has called on auto manufacturers to add safety features, such as disabling voice-to-text technologies while the vehicle is in motion.

Already, people are dictating texts, emails or social media posts while driving, using voice-activated programs installed in newer cars. IHS Automotive reported that, in 2012, 80 percent of the new cars sold in North America and Europe included some type of voice-activated controls. But the study found that, the more distracted drivers became, the less they checked mirrors, looked around or processed objects they encountered.

“They couldn’t see things right in front of them,” Antrican said. “Things like pedestrians. Things like stop signs.”

Texting has already been a concern. As of January, 39 states and the District of Columbia have enacted texting bans. Ohio’s ban went into effect in August. But only 10 states, plus Washington, D.C., have banned hand-held cellular phone use. AAA states that those bans should go even further.

“We have long thought, and this is a commonly held perception, that hands-free is safer,” Antrican said. “I think we just proved that that perception is not accurate.”

Ohio State Highway Patrol spokeswoman Lt. Anne Ralston said that operating a vehicle is a complex behavior, requiring cognitive and physical abilities to be in sync with each other. Any distraction, “whether it’s daydreaming, texting, using your hands-free version or talking to someone in the car” can affect those abilities, particularly when there are several of them, Ralston said.

“Limit those distractions, put the phone down, keep your mind focused on operating that vehicle safely,” Ralston said. “Make that phone call or send that text message when you get to your destination safely.”


As Nellis grounds aircraft, training goes virtual

by Staff Sgt. Gregory Brook

99th Air Base Wing Public Affairs


6/10/2013 – NELLIS AIR FORCE BASE, Nev. — The skies over southern Nevada are quieter than they have been in quite some time due to the June 1 Air Combat Command-directed stand-down of flying operations.

Despite the stand-down, the 64th Aggressor Squadron remains committed to accomplishing its mission, said Lt. Col. Michael Shepherd, 64th AGRS academic assistant director of operations.

“Our motto is ‘know, teach and replicate,'” Shepherd said. “As Aggressors, we are subject matter experts in a field of adversary tactics or systems anywhere from airplanes to missiles to actual tactics to electronic attacks.”

The 64th AGRS is assigned to the 57th Adversary Tactics Group at Nellis Air Force Base, Nev. Their primary mission is to provide support to the U.S. Air Force Weapons School, Red Flag exercises, the various test and evaluation squadrons and to provide training to units in the Combat Air Force on adversary tactics.

The 64th AGRS plans to focus on learning as much as possible about adversary tactics and providing training to the rest of the CAF due to the reduction in flight hours.

“We will continue to seek out the most current intelligence and update our briefs so we can disseminate that information to the CAF as much as we can,” Shepherd said. “By no stretch of the imagination are we just shutting down our squadron.”

There are plans in place for the 64th AGRS to work together as a team with other Nellis units to gain valuable insight and experience, Shepherd said. The U.S. Air Force Weapons School’s 16th Weapons Squadron, which teaches the F-16 Fighting Falcon weapons course, will work together with the 64th AGRS.

There is an academic agenda in place for the summer, Shepherd said. The 16th WPS will refresh the 64th AGRS on the tactics currently employed by the CAF and the 64th AGRS will reciprocate by teaching the 16th WPS the Aggressors’ academics. Every week a member of the 64th AGRS will certify as a subject matter expert in one of their assigned adversary equipment and tactics categories.

“We have a robust schedule throughout the summer. We will still be busy; it will just be a different kind of busy than we have become used to,” Shepherd said.


The pilots of the 64th AGRS will also try to stay current in their flight ratings using flight simulators and limited flying in support of the 422nd Operational Test and Evaluation Squadron, according to Shepherd. Many pilots will lose most of their currencies and will have to regain them in October. Simulators will be used to try and mitigate the lack of practice.

The 64th AGRS pilots will concentrate on practicing emergency procedures in the simulators.

“They cannot afford to lose proficiency in carrying out those procedures because they need to be able to fly and land safely,” Shepherd said.

There are few flight currencies that can be updated in simulators.

“While we can’t officially update them, we can still use the simulators to stay proficient,” Shepherd said.

“In order to keep ourselves sharp and keep our mindset, we have adopted a kind of back-to-basics mentality,” said Capt. Paul Anderson, 64th Aggressor Squadron B-flight commander. “We just went over and did the first run in the simulators today to establish how we want to use them. It was really good; we got to see some of the challenges others are facing and learn how to better challenge our customers and improve their learning and training.”

64th AGRS pilots are also flying against recent adversary tactics in simulators, to gain a greater perspective of what CAF units experience in them.

“It validates our credibility as Aggressors,” Shepherd said. “We have to stay as current in the tactics and knowledge of things as we possibly can and of executing our mission in new ways.”

“The ideal way to train is to combine academics with flying, but since we are not flying we are offering as much academics as we can,” Anderson said. “You are able to talk about a threat and then go and see how it is actually employed. It drives home the point.”

In addition to staying professionally proficient in their own fields and teaching others through traditional in-person briefings, the 64th AGRS will use innovation and technology to accomplish their mission.

“The technology gives us a greater ability to get the information out there,” Shepherd said. “We have something called Virtual Flag. It’s like Red Flag, but in simulators. Everyone taps in, and we can be Aggressors in our simulators here. We can fly against the guys in [Royal Air Force] Lakenheath, England. I don’t think simulator training will ever be able to fully encompass what you get in real life, but it is good training.”

The goal is to continue to give the CAF the training needed to go out and fly, fight and win without sacrificing safety or airmanship, according to 64th AGRS leadership.

“One of my best directors of operations, when we were getting ready to deploy to Iraq, had a mantra: embrace the pain,” Shepherd said. “I think that’s very relevant. It’s hard when you deploy, and it’s hard when you suddenly have all of your flying hours taken from you. Embrace it and make a positive out of it whatever way you can.”

“Know, teach and replicate,” Anderson said. “We are still trying to carry out our mission.”





Wave of federal retirees to hit government

By Jennifer Liberto @CNNMoney June 13, 2013: 6:29 AM ET



The U.S. government could soon be facing a shortage of workers.

Some 30% of the federal workforce will be eligible to retire in the next three years, according to a Government Accountability Office report. That means the government could be hit by a wave of retirements at a time when it is already dealing with inadequate staffing, because large numbers of employees are on furlough and hiring freezes are in place to save money and dramatically shrink the federal budget deficit.

It could lead to disruptions in some key areas. Some 46% of air traffic controllers can retire in the next three years, creating a challenge for the government to replace them with similarly experienced workers. It’s a worry, especially at a time when the public has fresh memories of long delays at the nation’s airports when the sequester kept controllers at home on furlough.

Also, more than a third of all federal veterinarians are eligible to retire. An impending national veterinarian shortage is already fueling fears that the U.S. Department of Agriculture will not be able to easily contain the spread of a severe animal disease outbreak, according to several federal reports.

Congressional watchdogs have been warning for years about this “tsunami” of baby boomers hitting retirement age at federal agencies. So far, it hasn’t happened, because workers hung on to their jobs far longer than expected, in part due to the recession.

That may be changing this year. In the first four months of 2013, some 60,000 employees filed for federal retirement benefits, a 43% increase from last year, according to the Office of Personnel Management. A major part of the spike comes from postal workers taking early retirement buyouts, OPM says.

But union groups and employee advocates say the uptick in retirements is being fueled by the $85 billion in sequester spending cuts, along with the accompanying furloughs, pay freezes, slashed overtime and increased workload from the lack of new hires.

“We’re seeing this massive brain drain as thousands of workers leave the federal workforce, and it’s leading to a huge loss of knowledge and expertise,” said William Dougan, president of the National Federation of Federal Employees. “A lot of these jobs are not being filled once they’re vacated.”

It’s already leading to questionable outcomes. In the past two years, the beleaguered Internal Revenue Service has lost 10,000 employees to retirement and attrition. Some have suggested that the lack of senior level employees may have played a role in the current scandal plaguing that agency.

More than a third of IRS employees will soon be eligible to retire, which could also lead to longer waits for things like taxpayer refunds and audits.

Federal employees can retire at the age of 62 if they have worked five years, or even younger with 20 years of service. The average federal worker is 47 years old with about 14 years on the job, compared to the median age of 42 for all American workers, according to federal data.
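The eligibility rule as stated reduces to a simple check; a sketch in Python (simplified: actual federal retirement rules distinguish several service categories not covered here):

```python
def eligible_to_retire(age: int, years_of_service: int) -> bool:
    """Simplified federal retirement eligibility, per the rule above:
    age 62 with at least five years of service, or any age with
    20 years. (Real FERS/CSRS rules have additional categories.)"""
    return (age >= 62 and years_of_service >= 5) or years_of_service >= 20

# The average federal worker cited above: 47 years old, ~14 years of service.
print(eligible_to_retire(47, 14))  # False: not yet eligible
print(eligible_to_retire(62, 5))   # True: eligible at 62 with 5 years
```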

Over at the USDA, the labor crunch poses a risk to the nation’s food supply as early as this year, according to Michael Gilsdorf, executive vice president for the National Association of Federal Veterinarians.

At least three USDA agencies are struggling to fill positions left by outgoing vets since federal budget cuts took hold, Gilsdorf said. That has shrunk the number of “deployable” veterinarians, who can respond to an animal disease outbreak, from 1,000 vets last year to about 250 now.

A federal advisory panel has warned there aren’t enough federal veterinarians to contain a massive multi-state outbreak of foot and mouth disease, a highly contagious animal disease which strikes cows, goats and pigs.

“Our capacity to respond to a disease outbreak has hit a critical stage,” said Gilsdorf, who runs the panel tasked with helping craft the plan to recruit thousands of private and public sector veterinarians to fill in gaps if disease hits the nation’s food supply.

The USDA responded by saying it has recruited 955 veterinarians and 3,100 animal health technicians to join a volunteer group of emergency responders called the “National Animal Health Emergency Response Corps.” The agency has a special program that trains vets, including those at nonprofits and in private practice, on how to respond to a national emergency.

A USDA spokesperson said the agency prioritizes “working with staff and accredited veterinarians to ensure that we can respond quickly and in a manner that is highly protective of our animal agriculture resources.”

For some, however, a shrinking federal workforce is not something to worry about. In fact, it is a good thing — it’s a key part of Republican budget chief Rep. Paul Ryan’s plan to shrink deficits. This year, Ryan proposed cutting the 2.1 million-person workforce by 10% through a two-year hiring freeze.

An OPM official said the agency is prepared for the wave of retirements and is “constantly evaluating workforce preparedness programs.”

While sequester has been an “added challenge,” the official stressed “the decision to retire is a very personal one, as each person looks at their individual situation and decides what is best for them.”


Obamacare: Is a $2,000 deductible ‘affordable’?

By Tami Luhby @Luhby June 13, 2013: 6:23 AM ET


Until now, much of the debate swirling around Obamacare has focused on the cost of premiums in the state-based health insurance exchanges.

But what will enrollees actually get for that monthly charge?





States are starting to roll out details about the exchanges, providing a look at just how affordable coverage under the Affordable Care Act will be. Some potential participants may be surprised at the figures: $2,000 deductibles, $45 primary care visit co-pays, and $250 emergency room tabs.

Those are just some of the charges enrollees will incur in a silver-level plan in California, which recently unveiled an overview of the benefits and charges associated with its exchange. That’s on top of the $321 average monthly premium.

For some, this will be great news since it will allow them to see the doctor without breaking the bank. But others may not want to shell out a few thousand bucks in addition to a monthly premium.

“The hardest question is will it be a good deal and will consumers be able to afford it,” said Marian Mulkey, director of the health reform initiative at the California Healthcare Foundation. “The jury is still out. It depends on their circumstances.”

A quick refresher on Obamacare: People who don’t have affordable health insurance through their employers will be able to sign up for coverage through state-based exchanges. Enrollment is set to begin in October, with coverage taking effect in January. You must have some form of coverage next year, or you will face annual penalties of $95 or 1% of family income (whichever is greater) initially and more in subsequent years.
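The first-year penalty rule above (the greater of $95 or 1% of family income) is a simple maximum; a minimal Python sketch, ignoring the law’s caps and exemptions:

```python
def aca_penalty_first_year(family_income: float) -> float:
    """First-year individual-mandate penalty: the greater of a $95
    flat fee or 1% of family income. (Later years use higher figures;
    caps and exemptions are ignored in this sketch.)"""
    return max(95.0, 0.01 * family_income)

print(aca_penalty_first_year(40_000))  # 400.0: the 1%-of-income share applies
print(aca_penalty_first_year(8_000))   # 95.0: the flat fee applies
```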


Each state will offer four levels of coverage: platinum, gold, silver and bronze. Platinum plans come with the highest premiums, but lowest out-of-pocket expenses, while bronze plans carry lower monthly charges but require more cost-sharing. Gold and silver fall in the middle.


The federal government will offer premium subsidies to those with incomes of up to four times the federal poverty level. This year, that’s $45,960 for an individual or $94,200 for a family of four. There will be additional help to cover out-of-pocket expenses for those earning less than 250% of the poverty line: $28,725 for a single person and $58,875 for a family of four. The subsidies are tied to the cost of the state’s silver level plans.


Related: I’m signing up for Obamacare


California offers insight into how much participants will actually have to pay under Obamacare. The state, unlike most others, is requiring insurers to offer a standard set of benefits and charges in each plan level. The only variables are monthly premiums, doctor networks and carriers in your area.


For those in need of frequent medical care, the platinum or gold plans would reduce out-of-pocket costs for treatment. These plans have no deductible, and doctors’ visits and medication are cheaper. But the trade-off is that they have higher monthly premiums. California has not yet released the premium range for these tiers.


On the flip side, a young man who never visits the doctor and wants to minimize his monthly charge could opt for a bronze plan. A 40-year-old enrolling in this plan could pay as little as $219 a month. But, if he did get sick, he’d get socked with a $5,000 deductible, $60 co-pays for primary care visits and a $300 emergency room charge.



Obamacare provides protection for those who need a lot of care by placing a cap on out-of-pocket expenses. The maximum a person in an individual platinum plan will spend a year is $4,000, while those in the other tiers will shell out no more than $6,400.

“Insurance is expensive. It’s hard for anyone who isn’t well off to afford it,” said Gary Claxton, director of the health care marketplace project at the Kaiser Family Foundation. “But it is good enough that you can afford to get sick without bankrupting yourself.”

Whether potential enrollees find these plans affordable will depend on how healthy they are and whether they are currently insured.

Many individual insurance offerings currently available come with much higher deductibles, cover fewer expenses and limit how much they’ll pay out in a year. Plans on the exchange, on the other hand, are required to cover a variety of “essential benefits,” including maternity care, mental health services and medication.

“In many cases, depending on the plan, the coverage will be more comprehensive than what the enrollee currently has,” said Anne Gonzalez, a spokeswoman with Covered California, which is running the state’s exchange.




How the NSA Could Get So Smart So Fast

Modern Computing Is Helping Companies and Governments Accurately Parse Vast Amounts of Data in a Matter of Minutes.

Updated June 12, 2013, 7:51 p.m. ET




Five years ago it would have been unimaginable for a government agency such as the National Security Agency to efficiently parse millions of phone, text and online conversations for keywords that could have warned of an impending terrorist attack. Today, a set of new technologies makes it relatively affordable and manageable for it to do so.


These technologies can store vastly different types of data in a single database and process it rapidly on inexpensive hardware, without an analyst having to formulate a hypothesis in advance. “They’ve substantially reduced the cost and greatly increased the [government’s] ability to analyze this type of data,” says Tom Davenport, an expert on analytics and a visiting professor at Harvard Business School. The technology needed to outfit data centers to perform these tasks has become “orders of magnitude” less expensive than in the past, he said.

It is unclear exactly what type of computing the NSA is using in its data-center facilities around the U.S., or in a $1.2 billion facility in Utah that will open this fall.

But broadly speaking, the technology can be broken down into three categories:


Database systems

Traditional databases, usually queried with a language known as SQL (pronounced “sequel”), store data in tables, columns and rows but are limited when it comes to storing strings of words such as those found in an email or text message. They also can’t handle pictures or video.

New types of databases that emerged beginning in late 2009, known collectively as NoSQL (for “not only SQL”), such as MongoDB, Cassandra and Simple DB, don’t have these limitations, and allow analysts to create queries against all these types of data.
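The schema flexibility described above can be illustrated with a toy document store in plain Python: differently shaped records sit in one collection, and a single query runs across all of them. This is a sketch of the idea only, not the API of MongoDB or any other NoSQL product:

```python
# Documents of different shapes coexist in one collection,
# unlike rows constrained to a fixed SQL table schema.
collection = [
    {"type": "email", "sender": "alice", "body": "meeting at noon"},
    {"type": "text",  "sender": "bob",   "body": "running late"},
    {"type": "photo", "sender": "alice", "tags": ["vacation", "beach"]},
]

def find(coll, **criteria):
    """Return every document whose fields match all the criteria;
    documents lacking a queried field simply don't match."""
    return [doc for doc in coll
            if all(doc.get(k) == v for k, v in criteria.items())]

print(len(find(collection, sender="alice")))     # 2 documents match
print(find(collection, type="text")[0]["body"])  # running late
```

The point is that the photo document, which has no "body" field at all, can still live alongside the messages and be reached by the same query mechanism.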

NoSQL databases can make a huge difference to companies analyzing very large data sets, even if they’re fairly conventional. For example, analysts at risk consultancy Verisk Analytics Inc. are “constantly running different models and analytics” against billions of customer records in order to help identify fraudulent insurance claims.

Perry Rotella, vice president and chief information officer at Verisk, says using a traditional DB2 database from International Business Machines Corp., “would be a six-hour job” that had to run overnight. Analysts would pore over the results and generate new queries that would again have to run overnight. He said it took weeks every time analysts needed to create a new statistical model. The company recently changed to a NoSQL database that allows analysts to run the same types of queries in 30 seconds.


“So all of a sudden your model-building becomes iterative in real-time instead of over days. [Using NoSQL], you can run analytics on your data multiple times a day, and it compresses your ability to get results from weeks into days. It’s extremely powerful,” he said.

For online businesses like photography marketplace Shutterstock Inc., which store a great variety of file types, it is difficult to imagine life without this technology. Shutterstock has a library of more than 24 million images and adds an additional 10,000 each day, each of which has associated data to help narrow search results. Its databases also record everything that users do on the site—not just decisive actions such as what images they license, but also minute details such as where they place their cursor and how long they hover there.


Machine learning

Traditional analysis requires analysts to have enough understanding of the data to form a hypothesis and then create complex queries to run against the database. Recently developed programs known as machine learning and natural language processing rely on the computer programs themselves to find patterns and even elucidate the meaning of ambiguous words based on context. “You can turn a machine-learning program loose on a lot of data and you can see what they are able to be predictive of,” said Mr. Davenport. With natural language processing, “you could figure out whether a term like ‘bomb’ is being used to describe a Broadway play versus something a terrorist would use,” he said.
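Mr. Davenport’s “bomb” example boils down to context classification: deciding which sense of a word is meant from the words around it. A minimal bag-of-words sketch in Python (the vocabularies here are invented for illustration; real NLP systems learn such associations from large training corpora):

```python
# Hypothetical context vocabularies; a real system would learn
# these associations from labeled training data.
CONTEXTS = {
    "theater": {"broadway", "play", "review", "audience", "critics"},
    "threat":  {"detonate", "attack", "target", "explosives"},
}

def classify_context(sentence: str) -> str:
    """Pick the context whose vocabulary overlaps the sentence most."""
    words = set(sentence.lower().split())
    return max(CONTEXTS, key=lambda c: len(CONTEXTS[c] & words))

print(classify_context("critics said the play was a bomb on broadway"))
# theater: three overlapping context words versus zero
```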

Machine learning, also known as cognitive analytics, allows queries to continually “tune themselves,” Gartner Inc. analyst Douglas Laney explains. For example, retailers use this technology to automatically update pricing algorithms in real time as new information, such as weather, time of day and even information gleaned from video of customers browsing in their stores, becomes available. “It used to take more than a day to update pricing, but these retailers can reprice every hour and use trending information to do real-time product pricing,” says Mr. Laney. “I’m not sure they could do that even a year ago,” he said.



Hadoop

Until recently, complex computer programs needed to run on expensive hardware, such as enormous mainframe computers. Today, an open-source software framework called Hadoop—developed at Yahoo Inc. with contributions from technology developed by Google Inc., and named after a child’s toy elephant—allows a query to be split up, with different analytic tasks distributed among scads of inexpensive servers, each of which solves a part of the puzzle; the results are then reassembled when the work is completed. “It’s really cheap and really fast,” said Mr. Davenport.
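The split-and-reassemble pattern described above can be sketched in miniature in Python: a map step counts keywords in each chunk of data, and a reduce step merges the partial results. The chunks are processed sequentially here; Hadoop’s contribution is distributing the map step across many servers:

```python
from collections import Counter
from functools import reduce

def map_count(chunk):
    """Map step: each 'server' counts words in its slice of the data."""
    return Counter(word for line in chunk for word in line.split())

def reduce_counts(a, b):
    """Reduce step: merge two partial counts into one."""
    return a + b

lines = ["alpha beta", "beta gamma", "alpha alpha"]
chunks = [lines[0:1], lines[1:2], lines[2:3]]  # data split across workers
partials = [map_count(c) for c in chunks]      # Hadoop runs these in parallel
total = reduce(reduce_counts, partials, Counter())
print(total["alpha"])  # 3
```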


The ability to distribute complex queries to a large number of inexpensive computers helps people get very quick responses to complicated questions with a large number of variables. For example, online automotive market Edmunds.com Inc. can help auto dealers predict how long a given car will remain on their lots by comparing car makes, models and trim against the number of days inventory cars at that price point averaged on a lot in a given dealer’s region. The predictions help minimize the number of days a car remains unsold—”one of the most important sales metrics for dealers,” said Philip Potloff, Edmunds.com’s chief information officer.

Video-streaming company Netflix Inc. uses Hadoop to graph traffic for every type of device people are using to access video across multiple markets, allowing the company to improve the reliability of video feeds on mobile devices, laptops and TVs, and plan for future growth of streaming movies and TV shows. It also helps Netflix to better analyze customer preferences so that it can make improved recommendations.


Pentagon Preparing Two Budget Plans for Fiscal 2014

By Kedar Pavgi

June 12, 2013


The Pentagon is planning for two scenarios in fiscal 2014 because the outcome of budget negotiations between the White House and Congress remains uncertain, a senior Defense Department official said on Wednesday.

Deputy Defense Secretary Ashton Carter said the department was “prepared for a wide range of budget contingencies,” adding that the department’s approach next year would depend on how Washington sorted out its fiscal situation.

One plan will be based on the Obama administration’s fiscal 2014 budget request and the other assumes Washington continues to be mired in “fiscal gridlock,” Carter said during an annual conference hosted by the Center for a New American Security.

Carter said the Pentagon’s Strategic Choices and Management Review, the budget evaluation ordered by Defense Secretary Chuck Hagel in March, had been completed and awaited review by Hagel’s office. The SCMR will be the “basis” for the fiscal 2015 budget request, and will also help with fiscal 2014 budget decisions, Carter said. A recent memo from his office directed Defense managers to plan for a possible 10 percent budget cut in fiscal 2014.

Still, Carter lamented the impact that sequestration was having on Defense’s budget and planning efforts.

“Despite our best efforts to minimize this damage, it is, at a minimum, embarrassing to be doing this in the eyes of friends and foes alike,” Carter said.

Defense must complete and submit a plan to make good on a promise to the leadership of the Senate Armed Services Committee. In May, Sens. Carl Levin, D-Mich., and Jim Inhofe, R-Okla., wrote to Hagel demanding a “package of reductions” by early July to help plan for possible sequestration cuts in fiscal 2014.

Carter also said he “aligned” himself with the recent shadow budget reviews conducted by CNAS and a host of other Washington, D.C., think tanks. The organizations said that sharp reductions to the department’s civilian personnel would be necessary to help realign resources toward areas such as the Asia-Pacific and cybersecurity.


USAF awards Logos Technologies a $49.7 million contract to ‘tame Big Data’


Wed, 2013-06-12 08:39 AM





Logos Technologies LLC has been awarded a $49,750,000 Small Business Innovative Research (SBIR) Phase III indefinite delivery/indefinite quantity (IDIQ) contract from the U.S. Air Force Research Laboratory Information Directorate (AFRL/RI) in Rome, NY.


The award has a five-year period of performance, says a Logos news release issued on June 11.


The contract will facilitate multi-sensor tracking and the rapid extraction of intelligence from large data sets, allowing operators to predict target set behavior.


The Processing Exploitation and Analysis of Large Data Sets (PEALDS) program will leverage the work of Logos Technologies in Intelligent Persistent Surveillance (IPS) to better collect, catalogue, process, exploit and disseminate different streams of data from different sensors, such as signals intelligence (SIGINT), full motion video cameras and wide-area surveillance systems.


“Intelligent Persistent Surveillance moves beyond sensors and focuses on getting useful information to the warfighter in real time,” said Greg Poe, CEO of Logos Technologies. “IPS relies on sophisticated tools and methodologies, including Activity-Based Intelligence, which looks at how events interact with each other.”


PEALDS is a Small Business Innovation Research Phase III program. The contract will build on earlier efforts, such as Vast Imagery Streaming Technology (VIST). The VIST software is used in the Logos-developed Kestrel system. Kestrel is the only operational wide-area surveillance system for tethered airships, says the release.


“We are constantly looking at how to improve the intelligence product of sensors,” said Frank Purdy, director of the national security services division at Logos Technologies. “Working with multiple collection systems and data sets provides the knowledge to move us from reactive to predictive analytics.”


In January, Logos Technologies formally entered into a Cooperative Research and Development Agreement (CRADA) with AFRL/RI entitled “Exploitation and Analysis of Various Large Data Sets.” Like the PEALDS program, the CRADA will help advance U.S. Air Force data exploitation and tracking capabilities.



Turkish police shoot down unmanned aircraft above protest

13 June 2013

By Matthew Schroyer


Over the past two weeks, thousands of protesters have rallied at Gezi Park in Istanbul against the government’s plans to raze the public park and turn it into a commercial zone and military barracks. Police have responded with tear gas, water hoses, and batons. The sudden explosion of political dissatisfaction has shocked the usually peaceful country, along with much of the world.

One Turkish citizen happened to be equipped with a small RC helicopter with video-recording capability, which he flew above the heads of protesters and police to gather close-up aerial footage. The man’s video, posted on Vimeo, shows groups of police attempting to disperse crowds with gas and pressurized water.

Protesters can be seen scrambling to find cover at Taksim Square in Istanbul. A fire in the distance sends smoke into the air.

According to the unmanned helicopter’s pilot, police shot the aircraft out of the sky, and it was destroyed completely on impact with the ground. The pilot was unable to recover video of the shoot-down itself, but salvaged the footage recorded up to that point.

Unmanned aircraft systems, or UAS, more commonly called “drones” in the media, are increasingly being deployed by activists. Over the past three years, they have appeared over protests in Russia, Poland, Estonia, and Argentina. In February 2012, a group of animal rights activists was recording a pigeon shooting event in South Carolina with its own unmanned aircraft when it was shot down by hunters.


IEEE Spectrum

The Better Meat Substitute

Can new and improved meat analogues help us control our damaging desire for animal flesh?

by Glenn Zorpette

Posted 3 Jun 2013 | 20:18 GMT

In one typical minute on this earth, people kill at least 109 339 chickens, 2630 pigs, 922 sheep, 781 goats, and 557 cows, plus a very large number of ducks, horses, turkeys, and rabbits, as well as three camels. And one singularly unfortunate mule. Add it all up and every week well over a billion animals are slaughtered for food, according to the Food and Agriculture Organization of the United Nations.

You may or may not find the death of so many animals unsettling, but it’s hard for anyone to ignore the undisputed environmental and health consequences [PDF] of this massive meat consumption. They include higher rates of greenhouse-gas emissions, water pollution, deforestation, and biodiversity loss. And possibly colorectal cancer as well.

No rational person can doubt that the world would be a better place if a significant fraction of carnivores simply gave up meat. But nothing in the history of human forbearance suggests that’s ever going to happen.


So let’s consider a slightly less obvious countermeasure. That would be to replace some proportion of meat with something that looks, tastes, and feels like meat but whose production doesn’t require tens of billions of animals, many of them hormonally augmented, that are 1) occupying more than 26 percent of the planet’s land area; 2) eating valuable crops; 3) drinking extremely large quantities of water [PDF]; and 4) producing dizzying volumes [PDF] of the greenhouse gas methane. Although the retail costs of meat aren’t high, the environmental ones certainly are.

Vegetable substitutes textured to resemble cooked animal flesh have been around for decades. But they’ve never been very enticing—a fact borne out by the limp sales figures. In the Netherlands, sales of such replacements hit a peak of around 1 percent by volume in comparison with meat, according to a 2004 Dutch study. That bump in popularity occurred after one of a series of gruesome meat-safety crises hit Europe. In the United States, sales of meat substitutes in 2011 totaled US $277 million—which was around 0.2 percent of what was spent on meat, according to a report in The Wall Street Journal and data from the American Meat Institute.


Now, though, a new crop of greatly improved meat substitutes is becoming available in Europe and the United States. Some of the new products are from small start-ups that are taking on huge food conglomerates whose subsidiaries have dominated the market (albeit small) for meat substitutes.


The new products include Beyond Meat and Match in the United States and Plenti in the Netherlands. All are produced with technology or advice from university laboratories: Beyond Meat from the University of Missouri; Plenti from Wageningen University & Research Centre in the Netherlands; and Match Meat from the University of Illinois at Urbana-Champaign. That technology is shaking up this long-placid backwater of the food industry is underscored by the fact that another highly touted new company now quietly readying its first faux-meat offering is a Silicon Valley start-up, complete with a Stanford professor, Patrick O. Brown, at the helm.


The new generation of meat substitutes has a texture and mouthfeel that match the real thing remarkably well. The older offerings don’t do that quite as convincingly. All of the new substitutes come with anecdotes about diners—and even professional food critics—believing they were eating real meat.


While researching this article I sampled Plenti, Match, and Beyond Meat. I found the three to be quite different from one another, and yet all were impressively meatlike [see “A Consumer Guide to Fake Meat”]. They’re a big improvement over more traditional meat stand-ins, such as those available in the United States and Canada under the brand names Yves, Tofurky, and Boca. It’s possible I would have been fooled had they been sprung on me unawares (especially if accompanied by a couple of glasses of wine).


Every now and then technology and social trends align to offer something sublime as well as important—like making possible a kinder and more enlightened paradigm for the food-processing industry. The food conglomerates have taken a beating lately as news articles have described their long-standing propensity to churn out nutritionally bankrupt foodstuffs loaded with the unholy trinity of salt, sugar, and fat. And yet, it’s just possible that the technology these companies wield could be put to uses that are not only profitable but also beneficial to people’s health.

Historically, healthy food has been a niche that the big food companies were content to leave to small subsidiaries or scrappy independents. But times are changing. Hundreds of millions of aging baby boomers are reevaluating their diets after being treated for cancer, heart disease, or other illnesses. And meat is a big business—$160 billion in the United States in 2010, according to the American Meat Institute. If emerging techniques allowed a company to make meat substitutes that were good enough to capture even a few percent of that market, it would still mean the kind of business that a conglomerate could love.


“Say you swapped 5 percent of meat for vegetable protein,” says Vaclav Smil, an environmental scientist at the University of Manitoba, in Canada. “Nobody would even notice. But it would make a difference on this planet.”

Eat a hamburger and you’ve just used about 2500 liters [PDF] of water, or enough for 16 baths in an average tub. And for each kilogram of beef consumed, as many as 35 kilograms of carbon-dioxide-equivalent greenhouse gases are released—about the same as what comes from a reasonably efficient car driving almost 275 kilometers.

Think of livestock as a means of converting vegetable protein into animal protein. To make livestock grow fast, producers feed them a diet rich in legumes and grain, usually soy, corn, and wheat—all of which contain protein that people could otherwise be eating. For cows, the conversion process is particularly inefficient. In Europe, where beef cows are fed a diet high in soy, it takes about 8 kilograms of soy protein to produce 1 kilogram of beef protein. So it is eight times as efficient to get your protein from soy as from beef. And that’s before you account for the enormous water and energy use, greenhouse-gas emissions, deforestation, and the rest that come from raising so many animals.
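The 8-to-1 ratio cited above implies a quick calculation of the feed soy freed up when a kilogram of beef protein is replaced by soy protein eaten directly; a sketch resting on that single figure:

```python
SOY_PER_BEEF_PROTEIN = 8.0  # kg of soy protein fed per kg of beef protein (European figure cited above)

def soy_freed_by_substitution(beef_protein_replaced_kg: float) -> float:
    """Soy protein no longer needed as feed when beef protein is
    replaced directly by soy: 8 kg of feed input per kg of beef,
    minus the 1 kg of soy now eaten directly."""
    return beef_protein_replaced_kg * (SOY_PER_BEEF_PROTEIN - 1)

print(soy_freed_by_substitution(1.0))  # 7.0 kg of soy protein freed per kg of beef replaced
```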


Today, meat eating varies among developed nations, ranging from about 45 kg per person per year in Japan (not including seafood, which is particularly popular there) to about 125 kg per person per year in the United States. Much of that meat is produced in “factory farms,” in which tens of thousands of animals are raised. The confined feedlots you find there are sometimes awash in animal waste that often runs off into streams and rivers, although it is not supposed to.


Factory farms have driven meat prices down to historically low levels. In the United States, families spend an average of 9.8 percent of their disposable income on food, according to the U.S. Department of Agriculture. “People spend more on entertainment,” Smil says. “No country in human history has paid so little for food as a percentage of disposable income,” he adds.


Though it has plateaued in developed countries, per capita consumption of meat is on the rise globally and is expected to keep increasing for a couple of decades more. Worldwide, meat production was 110 million metric tons in 1975. By the end of 2012, the Food and Agriculture Organization estimates, it had risen to 302 million. Meat consumption will keep increasing for the foreseeable future, driven by growth in both population and disposable incomes in countries like China. Thus meat production could reach 465 million metric tons in 2050, the FAO estimates, when world population is predicted to exceed 9 billion [PDF] (it is a little over 7 billion today).


Smil marshals a barrage of statistics to support his contention that the world could indeed produce 465 million metric tons of meat per year. But the strain on the world’s freshwater supply, environment, climate, and health-care systems would be noticeable, to put it mildly.


A satisfying meat alternative could do enormous good. “We need sophisticated technology to achieve this,” says M.A.J.S. “Tiny” van Boekel, dean of education at Wageningen UR. “If you really want to achieve sustainability, you have to work on technology. People don’t expect that. To some, sustainability is a cow in a meadow and chickens running around. But that’s not how we are going to feed 9 billion people.”


All-Plant Kebab: State-of-the-art extrusion technology gives Beyond Meat the look and feel of chicken. Photo: Beyond Meat


How do you make vegetable protein taste like animal protein? By using an industrial-age technology called extrusion.


You may have had your first experience with extrusion before you were out of diapers, as a toddler playing with a toy called the Play-Doh Fun Factory. You pushed Play-Doh through a die to make Play-Doh strips that had the cross section of a star or a circle. The basic technique was first used more than 200 years ago to make lead pipe, and then wire, and still later, spaghetti. Extruders now produce breakfast cereal, puffed snacks, and pet foods.


The use of vegetables, nuts, and grains to make meatlike foodstuffs goes back to the 19th century, at least. In Asia, some meat substitutes grew out of Buddhist traditions, which eschew the consumption of meat, particularly in a temple. But the enterprise took a quantum leap in the late 1950s and early 1960s, when the earliest U.S. patents were issued for the use of extrusion to make “meat analogues” from vegetable proteins. Many companies were involved, including Archer Daniels Midland, the extrusion-machine maker Wenger, A.E. Staley, Cargill, Miles Laboratories, and Worthington Foods.


There are dozens of meat replacements on the market in Europe and a similar number in North America. Different products make use of different kinds of proteins or combinations of proteins, including wheat gluten and a protein derived from a fungus. But the majority are based on soy protein, sometimes in combination with other vegetable proteins, which has been extruded to give it a fibrous, meatlike texture.


Jeroen Willemsen, a cofounder of the Dutch company Ojah, in Ochten, which makes the meat alternative Plenti, points to three generations of meat substitutes. The first was a “tofu generation,” based on bean curd. These products didn’t taste much like meat but had reasonably high levels of protein. The second generation features ready-to-use items that resemble sausages, meatballs, hamburger patties, and the like. The key ingredient is texturized vegetable protein. It is produced by extrusion of a low-moisture precursor and then dried into flakes or granules. These are rehydrated and incorporated into the final, ready-to-use product, which is sold frozen or chilled.


The third generation, which includes Plenti and Beyond Meat, is a special category known in the food industry as high-moisture meat analogues, or HMMAs. They are designed to have the taste and mouthfeel of genuine muscle meat: chunks of chicken breast, a piece of smoked eel, or shredded pork. The new analogues are meatlike enough to be sold in much the same way as meat, refrigerated and intended for use in soups, sandwiches, chilies, burritos, and other dishes where sauces and seasonings help enhance the meaty illusion.


Being a relatively recent development, HMMAs occupy just a thin slice of today’s meat-substitute market. Researchers extruded the first HMMAs in the 1980s, following pioneering work at the Protial research and development laboratory in Angers, France. But a flurry of patents in just the past seven years confirms that tinkering over the last decade has greatly improved the appeal of these foods.


Here’s the challenge, according to Ethan Brown, the founder of Beyond Meat: “The difficulty is in matching perfectly the tension between biting cleanly through the product, with just the right amount of push-back, and the fiber structure.”


Fortunately, there is a marvel of modern industry to meet that challenge: the twin-screw thermoplastic food extruder [PDF]. It precisely applies heat, pressure, and mechanical shear forces to a foodstuff, typically a floury mix rich in protein, to transform it in some way. The mixture moves through the machine in a continuous flow, being altered as it goes.


For an HMMA, the processing begins with a mix of protein powders derived from legumes and possibly some grains or seeds, along with starch, binding agents, flavors, and colors. They all go into a single hopper. From there, they move into the preconditioner or feed zone, where the protein powders are mixed with finely sprayed water and oil. The oil brings the mixture up to a fat proportion resembling that of meat, at least 4 or 5 percent. The gooey mixture is moved along by a pair of corotating, intermeshing steel augers. These mix and knead the powder, water, oil, and other ingredients into a dough while pushing it from the preconditioner into the extruder’s main barrel, a tube about 2 or 3 meters long, where the dough begins its transformation into a fibrous, meatlike substance.

As the augers pull and push the dough through the extruder barrel, heat and pressure and shear forces alter the dough’s proteins, says Brian Plattner, a process engineer at Wenger Manufacturing. The heat comes from two sources: Some is generated mechanically by the spinning augers, and some is applied externally, for example by steam circulating through the jackets surrounding the barrel or injected directly into the dough.

The transformation to ersatz meat begins with the breaking of a certain type of chemical bond in the proteins, called a disulfide bond. These bonds link sulfur atoms in the protein molecules. It is largely the breaking and reassembling of those disulfide bonds that texturizes the vegetable protein to resemble meat, according to Fu-hung Hsieh, a professor of biological engineering at the University of Missouri. Hsieh developed the extrusion process—specifying the ingredients, temperatures, and so forth—on which Beyond Meat is based.

After their disulfide bonds are broken, the protein molecules are then realigned and linked into longer chains, based on new disulfide bonds, in a phenomenon similar to polymerization. Chemists refer to this mass linking as “plasticization,” which is why the technical name for the manufacturing process is “thermoplastic extrusion.”

For a high-moisture meat analogue, the final, or die, stage is all-important and requires a special die [PDF]. At 1 to 2 meters, it is longer than a conventional die, explains Mian N. Riaz, director of the Food Protein Research & Development Center at Texas A&M University. Coming from the barrel, the dough moves into the die at 125 to 130 °C, according to Plattner. The part of the dough that’s in direct contact with the die surface moves more slowly than the part in the center, giving rise to shear forces, which encourage the realignment of protein molecules. Meanwhile, coolant circulating through the die rapidly reduces the temperature of the meat-analogue-to-be. This shearing and fast cooling in the die chamber promotes lamination, in which adjacent long protein chains cross-link with each other. That cross-linking forms layers that greatly enhance the food’s fibrous texture. The cooling also prevents it from expanding and becoming puffed and porous, like a big meaty mega-Cheeto.

At the very end, just as it exits the die, the HMMA is shaped and cut into pieces. The relative proportions of moisture, fat, and protein in the final product can be set almost arbitrarily, but manufacturers generally make them close to those of lean meat: 60 to 70 percent moisture, 2 to 5 percent fat, and 10 to 15 percent protein. As with meat, the final product must be refrigerated or frozen to prevent spoilage.
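The target ranges above amount to a simple formulation check. The short sketch below, in Python, tests whether a candidate recipe falls inside the lean-meat-like ranges the article quotes; the function name and the example recipe are invented for illustration, not a real manufacturer's specification.

```python
# Hypothetical sketch: check whether an HMMA formulation falls within the
# lean-meat-like target ranges quoted in the article. The example recipe
# is made up for illustration.

TARGET_RANGES = {          # fraction of total mass, from the article
    "moisture": (0.60, 0.70),
    "fat":      (0.02, 0.05),
    "protein":  (0.10, 0.15),
}

def check_formulation(composition):
    """Map each component to True if its fraction lies in the target range."""
    return {
        component: lo <= composition.get(component, 0.0) <= hi
        for component, (lo, hi) in TARGET_RANGES.items()
    }

recipe = {"moisture": 0.65, "fat": 0.04, "protein": 0.12}  # illustrative mix
print(check_formulation(recipe))
```

Run on the illustrative mix, every component comes back in range; drop the moisture to 50 percent and the check flags it, which is the kind of guardrail a process-control system would apply continuously at the extruder.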


Workers can flavor the HMMA while it is being made in the extruder or after it comes out of the die, or both. The Dutch product, Plenti, is sold unflavored and in bulk to retailers who flavor it and sell it to consumers.


The best modern extruder machines, with electronic sensors and actuators, offer precise control of the temperature, pressure, and shear forces, which all affect the characteristics of the HMMA. But the start-ups involved are, understandably, reluctant to give away their secret recipes. It seems they want to make as much progress as they can before they have to go head-to-head with the bigger players. Patrick Brown, the would-be mock-meat magnate, wrote in an e-mail: “Given the resources and political clout of the meat and dairy and Big Ag industries, I’m sure you can understand that we have more to lose than to gain by drawing further attention to our activities and plans.”


Still, details about some products have inevitably seeped out. Hsieh, whose breakthroughs led to Beyond Meat, was listed as coinventor on a recent U.S. patent that runs to 23 pages, including myriad details about temperature, pressure, pH, and lipid content. You can also find a video on YouTube that shows the “Missouri faux chicken” being made, in all its mock-meaty glory.


“Chicken” From an Extruder: To make a high-moisture meat analogue, start by mixing plant protein powders with water and oil. Then knead the mix in an extruder barrel, and finally shear and cut in a die.


Given fake meat’s long history, why is it only now starting to rival the real thing? According to Gerry Hertzel, research coordinator for food products at Wenger, part of the answer can be found in improvements in the raw ingredients. The suitability, consistency, and variety of the soy and other vegetable protein powders from the Big Ag manufacturers, such as Archer Daniels Midland, Cargill, and CHS, have improved greatly, asserts Hertzel. “If you look at the soy isolates from 10 years ago and the ones today, there’s a night-and-day difference,” he says.


Hertzel’s colleague Plattner also points to recent improvements in the design of extruder dies that let operators very precisely control the rate of coolant flow, which in turn allows for consistent and uniform cooling. He believes that the pace of improvements will hold steady and bring even better products in coming years.


And extrusion isn’t the end of the line, technology-wise. At Wageningen UR, engineering professor Atze Jan van der Groot has invented a different kind of machine to process vegetable protein powders into a fibrous, meatlike product. Van der Groot began by studying the protein structure he wanted in the final product and then set about designing something that could reproduce it. The ingenious contraption that resulted uses a conical steel rotor that spins within a complementary funnel-shaped steel vessel. The rotor crushes, shears, and heats vegetable-protein powders. “We can create a wider range of textures,” van der Groot says, while demonstrating his novel food processor. “I think it will eventually be cheaper than extrusion. It’s quite simple and robust.” He is now working with engineers at Delft University of Technology, in the Netherlands, to scale up the system.

Today a lot of meat-analogue research in Europe is focused on foods other than soy. One of Germany’s renowned Fraunhofer Institutes is concentrating on lupine seeds and has produced sausages and “ice cream” that have received good reviews. In France, the food companies Sotexpro and Roquette have jointly introduced a pea-protein-based substance meant to be added to meat to extend it. Studies have indicated that meat can be supplemented by up to 30 percent with extenders before people begin to notice any difference. Already, in the United States, soy protein makes up about 30 percent by weight of the “meat” served in the National School Lunch Program. Vaclav Smil says that extenders are the surest way to start displacing meat on a large scale.


But the more important challenge now is to make a cheaper meat alternative, not a better one. Plenti and Beyond Meat generally cost about twice as much as real chicken. With greater economies of scale, the costs would undoubtedly go down.


Ethan Brown, the founder of Beyond Meat, wants his company to be huge, and not just for the obvious reasons. “If all we end up doing is serving the wealthy, we’d be upset,” he says. “Here’s how I’ll know when we’ve succeeded: when you can go into a Burger King or a McDonald’s and buy a plant-based nugget.”

This article originally appeared in print as “Muscling Out Meat.”


Adventures in Printing Food

3-D kitchen printers produce hits (a deep-fried scallop space shuttle) and misses (square milk)

IEEE Spectrum

By Jeffrey Lipton & Hod Lipson

Posted 31 May 2013 | 13:11 GMT

You want to bake a special cake for your mom, so you boot up the 3-D printer in your kitchen. Loaded with a dozen cartridges filled with pastes of chocolate, marzipan, and other ingredients, the machine downloads instructions from the Internet. You key in a specific texture, size, and flavor, and then you insert a 3-D message in the center—Happy Birthday, Mom!—to be revealed only after she takes a bite. The machine does the rest, assembling and baking a pastry so scrumptious it rivals a virtuoso chef’s in richness and complexity. Your mother loves it so much that she insists you send a slice of the same cake—in the form of a digital recipe—to your Uncle Norman’s printer as well. Your 3-D cake recipe gets so many raves, in fact, that you decide to sell the recipe online for 99 cents a download.


Science fiction? Hardly. The technology exists, and over the last eight years people have cooked up all sorts of comestibles with it, some a lot stranger than a cake with printing inside.

Let’s start with the printer. Versions of these machines, which follow an electronic blueprint to create 3-D objects out of layers of different material, have been around for nearly three decades. In the late 1980s, they were van-sized behemoths used in industrial settings for prototyping or for producing small batches of aerospace and medical parts. Today’s consumer models, by contrast, are about the size of a microwave oven and may sell for about US $1000. Between then and now, a whole community of do-it-yourselfers has emerged, eager to exploit the amazing capabilities of these versatile gadgets.

A high school student was among the first to use a Fab@Home 3-D printer to print edible objects, like these M-shaped chocolate pieces.

The Fab@Home project began in 2005 to help make the technology accessible to regular folks, and the following year came out with the first open-source DIY 3-D printer. Like most 3-D printers, the Fab@Home system uses a robotic arm with an extrusion head to deposit soft or liquid materials that later harden. While many home 3-D printers use spools of plastic as feedstock, our printer relies on syringelike cartridges that can hold a variety of different pastes and fluids. Researchers have used our machine to print artificial ears from living cells and to build working batteries and actuators from a variety of conductive and nonconductive materials. But what has really fired up the imaginations of DIYers is an application we never foresaw: printing food.

Noy Schaal, a high school student in Louisville, Ky., was one of the first people to use the Fab@Home at home rather than in a research lab. After getting the machine in 2006 she immediately modified it to work with her material of choice: chocolate. Getting the temperature just right took a while, but her printer ended up winning first prize in a local science fair, where Schaal printed chocolate letters, textured bars, and other shapes directly from a computer-aided-design (CAD) model and then handed them to the judges.

Other groups caught on to the printer’s culinary potential, and by the end of that year we at Fab@Home had begun experimenting, too. For starters, we printed hummus and peanut butter in every shape we could render on a CAD system. When the Fab@Home project won the 2007 Popular Mechanics Breakthrough award, the team printed out hors d’oeuvres, made of Brie and apricot confiture, at the award reception. But our prized creation was—and still is—a space shuttle made from Cheez Whiz. For more than two years, it’s been sitting on a shelf in our lab, unrefrigerated. (Frankly, we’re afraid to eat it now.)

Those early attempts were all made from simple pastes that hardened when dried or cooled. But while a paste-based diet may have sufficed for the early astronauts, it’s too limited for most people. For digital cooking to really catch on, we concluded, the printers needed to accommodate a larger range of recipes, ingredients, and cooking temperatures.

Getting the printers to operate at the right temperatures for different types of food is not easy. Food, unlike plastic, can change dramatically over a relatively short period of time: A batch of frosting made in the morning may work fine at one temperature, but the same batch later in the day may not. Now consider the huge array of possible ingredients and the different settings that each would need, and you can see why creating a truly useful home food printer seemed at first impossible. Then Cornell University graduate student Daniel Cohen had an idea.

What was needed, he thought, was the equivalent of an RGB standard for food. RGB stands for red, green, and blue, the basic color elements used in televisions to reproduce a rainbow of colors; a similar set of basic colors—cyan, magenta, and yellow—is used in inkjet printers. Cohen’s idea was to create a similarly standard set of elements for the food printer that would make it simpler to produce a variety of foods—and also allow you to share your designs, so that you could “send” a piece of cake to your uncle’s printer.
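Cohen's idea can be sketched in a few lines of code: if every printer stocks the same small set of base cartridges, then a food is just a vector of proportions over those bases, and that vector is what you share. The base names and the "digital cake" below are invented for illustration; they are not from any actual printer standard.

```python
# Illustrative sketch of the "RGB for food" idea: a recipe is a normalized
# vector of proportions over a small, standard set of printable bases, so it
# can be shared as data and reproduced on any compatible printer. The base
# set and the example recipe are assumptions made up for this sketch.

BASES = ("starch_paste", "protein_paste", "sugar_gel", "fat_emulsion")

def make_recipe(**proportions):
    """Build a normalized recipe vector over the standard bases."""
    unknown = set(proportions) - set(BASES)
    if unknown:
        raise ValueError(f"not a standard base: {unknown}")
    total = sum(proportions.values())
    return {base: proportions.get(base, 0.0) / total for base in BASES}

# A shareable "digital cake": send this dict to your uncle, not the cake.
cake = make_recipe(starch_paste=5, protein_paste=2, sugar_gel=2, fat_emulsion=1)
print(cake)
```

The payoff is the same as with RGB: two printers that agree on the base set can exchange nothing but these numbers and still produce the same food.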

With Cohen and undergraduate students from Cornell’s school of hotel administration, we began to look for these few printable ingredients that could be used to build many different food types. We didn’t have to look far. A huge industry already exists to devise food flavors and colors that can make just about anything look and taste like something else. Supplements like vitamins, minerals, and fibers are also widely available.

The only problem, then, was getting the right texture. For that we turned to hydrocolloids—materials like carrageenan, xanthan gum, and gum arabic—that today appear on many food labels. They’re the thickeners in McDonald’s milkshakes, for instance. We brought in other gelling agents like those used in Jell-O desserts. We were already familiar with some of these substances, having used them to help print living cells. This time, we mixed the gels and gumming agents with other ingredients and then put them through our printer to create edible constructs like cubes of milk, raspberry domes, and mushroom-shaped bananas.

While these recipes demonstrated Cohen’s principle, they were also a little too weird. Offer someone a plate of banana mushrooms and milk cubes and you’ve entered the uncanny valley of food, where nothing feels quite right and everything screams “artificial.” Most home cooks aren’t ready to go there just yet.

Some researchers do see a future, however, in digitally designing food from basic flavors and supplements, in large part because it could be a more efficient way to produce nutritious but otherwise expensive food. Researchers at TNO (the Netherlands Organisation for Applied Scientific Research) are extracting basic carbohydrates, proteins, and nutrients from algae, insects, and the like and then using them to print something resembling steak and chicken. Eventually, this may allow them to print a filet mignon from a protein that requires far less water, energy, and labor than does a cow. TNO isn’t the only place exploring this realm. Susana Soares at London South Bank University has used a flour made from crushed bugs to print edible objects that look like butterfly wings and honeycombs.

While this approach could someday solve the Malthusian concerns of food production, it’s a hard idea to swallow. The trend these days is to back away from highly processed foods. Last year a number of huge meat-packing operations were shuttered after news about “pink slime,” a processed meat paste added to ground beef, hit the media. Who would want to risk their business on a pink-slime machine, especially if that slime comes from bugs?


Instead of designing foods from basic materials—from the bottom up—we’ve recently turned toward a top-down approach. That is, we’re taking existing foods and modifying them to make them printable. The idea came to one of us (Lipton) while flying back to New York after presenting a talk. While flipping through the in-flight magazine, he learned that David Arnold, a world-class chef, wanted to get his hands on a 3-D printer. We quickly arranged to send him a Fab@Home printer. In the first 24 hours of our collaboration with Arnold, we made deep-fried scallops shaped like space shuttles and sculptures made out of turkey with celery centers. Using the printer to creatively customize food shapes, we discovered, is a lot more appealing than crafting milk cubes out of hydrocolloids.

Inspired by Arnold, we then set out to push food printing even further. After all, anyone can use a mold or cookie cutter to shape food, but only a 3-D printer can easily create internal designs and intricate sculptures. We started with a recipe we got from Franz Nigl, a visiting scientist from the Pacific Northwest National Laboratory, whose Austrian grandmother’s Christmas cookies were notable for holding their shape when baked. We made batch after batch of the dough, cramming it into the printer’s cartridges and fine-tuning the recipes and the machine. We then programmed the printer to etch a message onto the top of each cookie, and eventually, we created a cookie that had writing inside it.

But message or no, a cookie is still a cookie. Our next stop was the International Culinary Center in New York City, where in January 2011 we began experimenting with food that could be made only by using a 3-D printer and that would be unlike any food we had ever eaten before, yet similar enough to avoid the “ick” factor. The result? A new form of fried corn dough. Now in a world in which state fairs offer up deep-fried Twinkies, you would think that there would be few frontiers left for fried or corn-based foods. But the 3-D printer has opened up entirely new ways of modifying textures. By printing meandering streams of our corn dough, we created a porous matrix that allowed the frying oil to penetrate much deeper into the food. The result was something delicately crispy and greasy, like a cross between a doughnut, a tortilla chip, and raw ramen noodles.

Our food explorations continue. Digital cooking is still a nascent field, but we’re amazed at how much progress has already been made: From those humble peanut butter, hummus, and chocolate objects, it has already morphed into a movement that could someday transform how we prepare and consume food. While some people believe the future of printed food will begin at the chemical level, others think it will become a common tool to augment the molds, knives, and ovens we already have. Regardless, both camps agree that the information age has begun to work its magic in the kitchen.

And once you get started, it’s hard to stop. To reward ourselves for finishing this article, we went back into the lab and printed ourselves a couple of cookies. To do that we ran software that considered our scheduled activities for the day, our food intake, and our individual heights and weights, and we then programmed our food printer to layer both sugar-free and sugar-rich dough to create a cookie that had just the right number of calories to fill out our calorie deficits for the day.
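The calorie-matched cookie just described boils down to solving a small pair of linear equations: how many grams of sugar-rich versus sugar-free dough add up to a fixed cookie weight while exactly filling the day's calorie deficit. The sketch below shows the arithmetic; the energy densities and the example deficit are assumptions for illustration, not the values our software actually used.

```python
# Hedged sketch of the calorie-matched cookie: choose how much sugar-rich vs.
# sugar-free dough to layer so a fixed-weight cookie exactly fills a calorie
# deficit. The energy densities and numbers below are illustrative guesses.

RICH_KCAL_PER_G = 4.5   # assumed energy density of sugar-rich dough
FREE_KCAL_PER_G = 2.0   # assumed energy density of sugar-free dough

def dough_split(deficit_kcal, cookie_grams):
    """Grams of (rich, free) dough so the cookie supplies deficit_kcal.

    Solves: rich*RICH + free*FREE = deficit,  rich + free = cookie_grams.
    """
    rich = (deficit_kcal - FREE_KCAL_PER_G * cookie_grams) / (
        RICH_KCAL_PER_G - FREE_KCAL_PER_G
    )
    rich = min(max(rich, 0.0), cookie_grams)   # clamp to a feasible recipe
    return rich, cookie_grams - rich

deficit = 2200 - 2050        # kcal burned vs. eaten so far (made-up numbers)
rich_g, free_g = dough_split(deficit, cookie_grams=50)
print(f"print {rich_g:.1f} g sugar-rich and {free_g:.1f} g sugar-free dough")
```

With these made-up numbers, a 150-kilocalorie deficit in a 50-gram cookie works out to 20 grams of sugar-rich dough layered with 30 grams of sugar-free dough.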



About the Author

Jeffrey Lipton was the project lead of Fab@Home, one of the first fully open-source 3-D printers, which in 2006 helped move 3-D printers into the consumer market in general and into food printing in particular. Lipton is now chief technology officer of Seraph Robotics, a company that makes Fab@Home printers and parts; he’s also a Ph.D. student in Cornell University’s Creative Machines Lab. Hod Lipson is a professor at Cornell, where he directs the Creative Machines Lab, and is coauthor of the book Fabricated: The New World of 3D Printing (John Wiley & Sons, 2013). When not making or eating printed food, Lipton and Lipson have a passion for advanced manufacturing and for as-yet-unprintable gastronomic miracles like doughnut burgers.


NSA Won’t Jettison Contractors, Yet

Agency to Scrutinize Its Security Mechanisms

By Eric Chabrow, June 13, 2013


The director of the National Security Agency declined to say that the electronic spy agency would stop using contractors in top secret IT positions to prevent a leak such as the one that exposed NSA programs to collect metadata on U.S. citizens.

Gen. Keith Alexander also announced at a Senate hearing on June 12 that the Defense Department will examine the oversight mechanisms in place that are designed to prevent all individuals with top secret clearances, not just contractors, from accessing information they’re not entitled to see.

Responding to a question on whether the NSA should stop using contractors in sensitive positions, Gen. Keith Alexander said, “I’m not prepared to make that statement, yet.”

The question was raised by Sen. John Boozman, R-Ark., because Edward Snowden, a 29-year-old systems administrator with top secret clearance working for government contractor Booz Allen Hamilton, leaked details about two top secret programs to collect metadata about telephone calls and Internet activities in an attempt to identify terrorists. “There are good contractors out there who are doing a good job,” Alexander said during the Senate Appropriations Committee hearing.

The Role of Contractors

The federal government relies heavily on contractors in many fields because it doesn’t have the expertise on staff to meet its needs; that’s especially true in information technology and information security. Contractors go through the same security clearance process as do federal employees, and they take the same oath not to disclose government and military secrets. “Their paychecks just come from two different sources,” Evan Lesser, managing director of a jobs website for security-cleared professionals, tells Information Security Media Group.

Lesser points out that Army Pfc. Bradley Manning, on trial for disclosing a quarter-million sensitive and secret diplomatic cables to WikiLeaks, was a government employee, not a contractor. In vetting individuals for top secret security clearances, the government looks at their finances; foreign travel; and habits, such as drug and alcohol use and gambling, that could compromise them.

“From a clearance standpoint, there’s not a whole lot in their background that is going to likely trip them up,” Lesser says. “The younger you are, the easier it is to get a clearance because you have less baggage, less history. In the case of these two, it doesn’t sound like there was anything in their past that would raise a red flag during the clearance process.”

At the hearing, Alexander also dismissed Snowden’s contention that the contractor could tap into virtually any American’s phone calls or e-mail. “I know no way to do that,” he said.


Lengthy Investigation Expected

The NSA director also cautioned that it would take considerable time for the Defense Department to examine the security mechanisms in place designed to prevent individuals with top secret clearances from accessing information they’re not entitled to see. He said the investigation will look at procedures for when to encrypt data.

“I don’t want to mislead you; this is a significant effort for the Defense Department,” he said, adding that he has the backing of Defense Secretary Chuck Hagel and Joint Chiefs of Staff Chairman Gen. Martin Dempsey. “We’re pushing this; this is the right way to go. I wish we could go back in time.”

The federal government uses a variety of tools that could identify the activities of employees. Those include keylogging software and computer logs that pinpoint staff members’ whereabouts and actions within federal IT systems and networks, sources familiar with the federal government’s security clearance systems say. But having the tools in place – and not all tools are used by all agencies at all times – doesn’t mean that the proper authorities are alerted in a timely manner to activities that could jeopardize the nation’s security.

Alexander, who also serves as commander of the U.S. Cyber Command, said the leak has caused great harm. “The consequence of this is that our security has been jeopardized,” Alexander said. “There is no doubt in my mind that we will lose capabilities as a result of this, and not only the United States but those allies that we have helped will no longer be as safe as they were two weeks ago.”

‘Americans Will Die’

Alexander said the surveillance programs prevented dozens of possible terrorist events in the U.S. and abroad, but he declined to identify them for security reasons.

Because the programs have been disclosed, however, Alexander said it’s incumbent on political and military leaders to explain the benefits of the programs to the American people. And he said the NSA will be transparent, within limits, disclosing some information about the programs, but not everything. “Some of these are still going to be classified and should be because if we tell the terrorists every way we’re going to track them, they will get through and Americans will die.”

The End of the Military Industrial Complex

By DAVID FRANCIS, The Fiscal Times June 14, 2013


Defense Secretary Chuck Hagel told lawmakers Wednesday that the sequester, along with continued budget uncertainty, was making it difficult for the Pentagon to properly plan and prepare to confront 21st-century threats.

“We are now in a different fiscal environment dealing with new realities that will force us to more fully confront these tough and painful choices, and to make the reforms we need to put this Department on a path to sustain our military strength,” Hagel told the Senate budget committee. “The onset of these resource constraints has already led to significant and ongoing belt-tightening in military modernization, force structure, personnel costs, and overhead expenditures.”

Hagel said the Pentagon would be forced to make cuts to personnel benefits, reduce force size, and delay training of new recruits. But some of the most dramatic cuts are expected to occur in the acquisitions budget, which pays contractors for the machinery they produce and the services they provide.

Earlier in the year, there was concern that the $43.2 billion removed from DOD’s budget this year, combined with an additional $600 billion over the next decade, would hamstring the broader economic recovery. As the chart below shows, DOD spending accounts for nearly 6 percent of gross domestic product.

Hubert van Tuyll, a professor of economics at Georgia Regents University and co-author of Castles, Battles, and Bombs: How Economics Explains Military History, said he was not concerned about DOD dragging down the broader economy.

“When you have a big defense drawdown, it doesn’t really hurt the overall economy that much. But when you have a drawdown, the military establishment has to make a decision to lose people or equipment,” van Tuyll told The Fiscal Times. “It’s the equipment that will go first.”

This is bad news for defense contractors. In the coming years, the massive post-9/11 military industrial complex will be forced to confront a series of challenges, from the drawdown to a poorly defined mission, that will eat away at their bottom lines. This will force defense contractors to make tough decisions regarding their futures as their research and development budgets shrivel.

Large and small defense contractors made money hand-over-fist after the September 2001 terrorist attacks. The chart below shows just how dramatic their revenue growth has been.

This sharp increase has led to massive profits. In 2002, the combined profits of the five largest U.S.-based DOD contractors – Lockheed Martin, Northrop Grumman, Boeing, Raytheon and General Dynamics – totaled an inflation-adjusted $2.4 billion. Ten years later, profits for these five companies had jumped 450 percent, to $13.4 billion.

According to retired Army Colonel Doug Macgregor, these enormous profits are quickly becoming a thing of the past. “The traditional way of doing business … is not going to last. It’s not going to last because we can’t afford it.”

The Pentagon has yet to specifically outline how it will make funding decisions. According to van Tuyll, history illustrates how the military makes tough spending choices. During the Great Depression, the Army was forced to cut the costs that had swelled during the First World War.

“The U.S. army was faced with a similar problem to what they’re facing now,” he said. “Douglas MacArthur and George C. Marshall decided it takes much longer to train new officers than it does to make an airplane, a tank or a gun. So they cut equipment.”

This strategy paid off during World War II. The officers needed to quickly train a fighting force capable of defeating Hitler’s war machine were already in place. The subsequent manufacturing push helped to end the Great Depression.

The contractors are also working against a shift from a large war model to one that uses smaller forces and drones to conduct security operations, Macgregor said. This works against contractors’ preferred way of doing business: big orders paid for by big contracts for big items like planes, tanks, and weapons.

In the coming years, Macgregor said, DOD is likely to give out small initial contracts first. If the Pentagon likes the result, it will order more. If not, it won’t award a second contract.

“Contractors don’t like the idea of creating smaller numbers before big orders because they’re not going to get the kind of profits you did in the past. Everyone got rich off the old strategy,” he said.

Macgregor added that DOD’s failure to articulate how it plans to draw down equipment allows these companies to continue to exploit the industrial age model. As long as the military keeps ordering unnecessary vehicles like the Abrams tank, contractors will keep making money off an outdated war model.

“Industry will begin to do what it has always done,” he said. “Everyone is going to squeeze out profits from whatever they got before they can’t squeeze any more.”

It’s not just the lack of large orders that will take a toll on contractors. Research and development dollars coming out of the Pentagon are likely to dry up as well. These grants allow companies to develop dual-use technology that produces two independent revenue streams. For instance, the Pentagon could pay Boeing to develop an engine for a cargo plane that could also be sold to a commercial airline.

“The American aviation industry got a head start after World War II through what was essentially a military subsidy,” Georgia Regent’s van Tuyll said. “If you learn how to build a good military engine, you can stick it on a commercial airplane.”

For now, the stocks of major defense companies are holding steady. But Macgregor believes all of these factors will eventually lead to job losses once the real bloodletting begins.

Jurgen Brauer, professor of economics at Georgia Regents University and co-author of Castles, Battles, and Bombs, said these companies need to reinvent themselves. A prime example of how a defense contractor can evolve is General Electric. The company made enormous profits and technological advances during World War II. Once the war was won, it began to invest in and acquire other businesses. Now it offers everything from financial management to refrigerators along with military jet engines.

“Last time around, after the end of the Cold War, we saw a huge consolidation (mergers & acquisition) among the major contractors,” Brauer, who is teaching overseas, wrote in an email to The Fiscal Times. “Unlike Europe, there isn’t much more to consolidate among the U.S. ‘majors’ so they may seek even more overseas contracts or, for those with a civilian market portfolio, are likely to shift more deliberately and clearly into nonmilitary work.”



June 14, 2013

Snowden’s Leaks on China Could Affect Its Role in His Fate




HONG KONG — The decision by a former National Security Agency contractor to divulge classified data about the U.S. government’s surveillance of computers in mainland China and Hong Kong has complicated his legal position, but may also make China’s security apparatus more interested in helping him stay here, law and security experts said on Friday.

The South China Morning Post, a local newspaper, reported on Friday that Edward J. Snowden, the contractor, had shared detailed data showing the dates and Internet Protocol addresses of specific computers in mainland China and Hong Kong that the National Security Agency penetrated over the last four years. The data also showed whether the agency was still breaking into these computers, the success rates for hacking and other operational information.

Mr. Snowden told the newspaper that the computers were in the civilian sector. But Western experts have long said that the dividing line between the civilian sector and the government is very blurry in China. State-owned or state-controlled enterprises still control much of the economy, and virtually all are run by Communist Party cadres who tend to rotate back and forth between government and corporate jobs every few years as part of elaborate career development procedures.

Kevin Egan, a former prosecutor here who has represented people fighting extradition to the United States, said that Mr. Snowden’s latest disclosures would make it harder for him to fight an expected request by the United States for him to be turned over to American law enforcement. “He’s digging his own grave with a very large spade,” he said.

But a person with longstanding ties to mainland Chinese military and intelligence agencies said that Mr. Snowden’s latest disclosures showed that he and his accumulated documents could be valuable to China, particularly if Mr. Snowden chooses to cooperate with mainland authorities.

“The idea is very tempting, but how do you do that, unless he defects,” said the person, who insisted on anonymity because of the diplomatic sensitivities in the case. “It all depends on his attitude.”

The person declined to comment on whether Chinese intelligence agencies would obtain copies of all of Mr. Snowden’s computer files anyway if he is arrested by the Hong Kong police pursuant to a warrant from the United States, where the Justice Department has already been reviewing possible charges against him.

A Hong Kong Police Force spokeswoman said earlier this week that any arrest would have to be carried out by the Hong Kong police and not by foreign law enforcement. The Hong Kong police have a responsibility to share with mainland China anything of intelligence value that they find during raids or seizures of evidence, according to law enforcement experts.


Patricia Ho, a lawyer who specializes in political asylum at Daly and Associates, a Hong Kong law firm, said that if Beijing decides that it wants Mr. Snowden to stay in Hong Kong for a long time, the simplest way to do so would be for mainland officials to quietly tell Hong Kong’s government officials not to hurry the legal process.


The United States and China have long accused each other of monitoring each other’s computer networks for national security reasons. The United States has also accused China of hacking to harvest technological secrets and commercial data on a broad scale from American companies and transferring that information to Chinese companies to give them a competitive advantage.


Tom Billington, an independent cybersecurity specialist in Washington, said that mainland China could benefit by obtaining a copy of the data that Mr. Snowden gave to the South China Morning Post. The data, if independently verified, could help Chinese officials figure out which computers have been hacked, patch security holes, itemize compromised data, analyze the quality of computer security defenses and develop techniques for hardening other Chinese computers against future surveillance by the N.S.A.


According to The Guardian newspaper of Britain, Mr. Snowden showed up with four laptop computers for a meeting with its journalists in Hong Kong. But the Los Angeles Times has reported that Mr. Snowden originally smuggled electronic files out of the National Security Agency in Hawaii using a USB thumb drive.


Simon Young, the director of the Centre for Comparative and Public Law at the University of Hong Kong, said in a statement that it would be a violation of Hong Kong law to disclose any information that had been shared confidentially by the Hong Kong or mainland Chinese governments with the United States.


“These recent developments underline the importance of Mr. Snowden obtaining immediate legal advice in Hong Kong, especially before any further disclosures are made,” Mr. Young said.


Mr. Young did not suggest whether any of the data shared by Mr. Snowden would fall into this category. But the Hong Kong government has a history of close law enforcement cooperation with the United States, particularly in the area of counterterrorism. The Hong Kong police have long focused on trying to prevent the territory’s freewheeling financial system from becoming a base for Al Qaeda-related money laundering.


The South China Morning Post said that one target of N.S.A. hacking identified by Mr. Snowden was the Chinese University of Hong Kong, which hosts the city’s main hub for Internet connections to the rest of the world. “The University has not detected any form of hacking to the network, which has been running normally,” the university said in a statement.


The newspaper said that it had not independently verified the accuracy of the data that Mr. Snowden provided. But the United States government has not questioned the authenticity of any of the documents he has released.


The Global Times, a nationalistic mainland Chinese newspaper under the direct control of the Communist Party, published an editorial on Friday calling for China to glean as much information as possible from Mr. Snowden.

“Snowden is a ‘card’ that China never expected,” the commentary said. “But China is neither adept at nor used to playing it.”

The commentary also called for China and Hong Kong to treat Mr. Snowden kindly enough so that others with national security secrets will not be discouraged from fleeing here. “China should make sure that Hong Kong is not the last place where other ‘Snowdens’ want to go,” it said.

The Associated Press reported on Friday that Britain had issued an alert to airlines around the world warning them not to bring Mr. Snowden to its soil, and threatening them with a fine of 2,000 pounds, or $3,125. Geoffrey Robertson, a London lawyer who initially represented Julian Assange during the WikiLeaks dispute, criticized the alert as unusual because it was being applied to someone who has denounced government policies.

“This is a power hitherto used only against those who incite terrorism, race hatred and homophobia — never before against whistle-blowers,” Mr. Robertson wrote in an e-mail. “The British government is simply afraid that its judges, who are fiercely independent, and the European court would embarrass its closest ally by ruling that Snowden could not be extradited because, even if his ‘revelations’ prove to be mistaken, he would be subjected to oppressive treatment akin to that being meted out to Bradley Manning,” the American Army private accused of having leaked secrets in the WikiLeaks case.


Rasmussen Reports

What They Told Us: Reviewing Last Week’s Key Polls


Saturday, June 15, 2013


The news that the National Security Agency is monitoring everybody’s phone records and reading emails dominated the news last week. Most voters (57%) fear other government agencies will use the NSA data to harass political opponents. Just 26% support the government monitoring effort.

President Obama, whose deeply held faith in government is unwavering, dismissed the concerns as “hype.” He added, “If people can’t trust not only the executive branch but also don’t trust Congress and don’t trust federal judges to make sure that we’re abiding by the Constitution with due process and rule of law, then we’re going to have some problems here.”

We have a problem.

Just 30 percent of voters nationwide have that much trust in government officials when it comes to these surveillance efforts.

Only 24 percent now are confident that the federal government does the right thing most of the time.

Scott Rasmussen’s weekly newspaper column suggests that this popular distrust of government is the theme that ties all the recent news stories together. “It’s a driving force in the current policy debates over immigration, gun control, health care and more.” Scott adds, “Many in Washington are frustrated by the public distrust. They dream of public relations programs to overcome it. What is needed, though, is for the government to change its behavior, so that it can earn the trust of the people it serves.”

Right now, though, there are enough controversies to rank them in order. The NSA spying is seen as the top concern by 35%, while 22% say the same about the IRS targeting of conservative political groups. Lower on the list are questions about Benghazi and the Justice Department’s snooping on reporters. Among those following the story most closely, the IRS scandal is seen as the most serious.

The distrust factor is playing a huge role in the debate over immigration reform. While most voters support the concept of immigration reform, they do not trust the government to honor and enforce the border security provisions of any new law. That’s why most demand that border security come first.

On health care, the president’s law is progressing along an uncertain path to implementation. However, just 39% of voters have a favorable opinion of it. The distrust factor is on display here as well. Despite government promises that the plan will reduce the cost of care, just 13% believe it will actually lead to lower costs.  Seventy-five percent (75%) think the health care law is likely to cost more than official estimates and 66% believe it will increase the federal budget deficit.

Only 19% believe the new law will improve the quality of care. Forty-eight percent (48%) believe it will hurt. With these concerns in mind, most Americans want the option of less health insurance coverage and more take home pay.  “In that environment, employers will compete to find the best mix of pay and benefits needed to recruit good employees,” Scott Rasmussen said in a recent newspaper column. “As consumers opt for less coverage and more take-home pay, they will effectively repeal a major portion of the president’s health care law.”

Similarly with gun control, Americans overwhelmingly like the idea of requiring background checks for those who want to purchase a gun, but they are very suspicious of where the president and New York Mayor Michael Bloomberg want to go from there. If voters were convinced the plan was for background checks and nothing more, it would enjoy broad popular support.

Voters still have more faith in state and local government than they do in the federal government. Not surprisingly, they want to see a number of issues currently handled by the feds shifted to government closer to home.

But, while trusted more, there remain concerns about the overreach of local governments, too. In New York, city officials have filed discrimination complaints against a deli owner who requires customers to dress modestly. Sixty-eight percent (68%) of Americans think business owners should be allowed to have such a modesty requirement. Just 22% disagree.

The Obama administration now says the Syrian government has crossed a red line by using chemical weapons on its population. As a result, the administration will provide military assistance to Syrian rebels. Earlier polling showed that, even with proof of a chemical weapons attack, just 31% of voters think the United States should provide military assistance to protect the Syrian rebels.

Deputy National Security Adviser for Strategic Communication Ben Rhodes said “Any future action we take will be consistent with our national interest, and must advance our objectives.” However, only 23% of all voters regard Syria as a vital national security interest of the United States. Rhodes said the U.S. interests “include achieving a negotiated political settlement to establish an authority that can provide basic stability and administer state institutions; protecting the rights of all Syrians; securing unconventional and advanced conventional weapons; and countering terrorist activity.”
