
August 3 2013




White House Employees’ Personal Email Hacked

By Aliya Sternstein

July 29, 2013 7:38 AM ET


Three White House staffers have had their personal Gmail accounts breached in what appears to be a malicious operation directed at the team responsible for the Obama administration’s social media outreach, according to individuals familiar with the incident. 

The penetrated accounts have been sending other White House digital media employees bogus emails containing fraudulent links that can extract their personal email logins and Twitter credentials. More than a dozen current and former staffers were targeted, the individuals said. The scheme was ongoing as of Sunday night. 

The goal of the intruders might be to glean sensitive government information, some cyber researchers said. White House personnel are prohibited by law from using personal Webmail accounts for business communications, but not all employees comply with the rules. The Twitter scam could be aimed at spreading misinformation to citizens through seemingly official channels.

The “phishing” links — labeled to look like legitimate BBC or CNN articles — direct users to an authentic-looking Gmail or Twitter login screen to access the news content. At this point, the users have unwittingly been rerouted to fake login forms that enable hackers to capture their sign-on information. 
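The mechanics described above come down to a mismatch between a link's visible label and its true destination. A minimal, purely illustrative check of that mismatch is sketched below; the allow-list of hosts and the function name are hypothetical, and real mail filters rely on far larger curated lists plus many other signals.

```python
from urllib.parse import urlparse

# Hypothetical allow-list for illustration only.
LEGITIMATE_HOSTS = {"accounts.google.com", "twitter.com", "www.bbc.com", "www.cnn.com"}

def looks_like_phish(displayed_text: str, actual_url: str) -> bool:
    """Flag a link whose visible label suggests legitimate content but
    whose real destination host is not on the known-good list."""
    host = urlparse(actual_url).netloc.lower()
    return host not in LEGITIMATE_HOSTS

# A link labeled as a BBC article that actually posts to an attacker's
# fake login page would be flagged:
looks_like_phish("BBC: Breaking news", "http://gmail-login.attacker.example/auth")  # True
```

The same comparison is why hovering over a link to inspect its real URL remains one of the simplest defenses against this kind of scheme.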

White House social media employees might be relatively easy game within the administration, since their role is to make the executive branch more open to the public. “I imagine that the names and email addresses of people at the White House in digital media or anything related to media are easy to find since their job involves public access. A list of targets would be created from open sources and that’s who the phishing email would be delivered to,” said Jeffrey Carr, a cybersecurity analyst with consultancy Taia Global.

The objective for harvesting Gmail account information might be to capture administration-related email messages and contacts, he speculated.

The Presidential Records Act bars work communication outside of official email accounts. However, a 2012 House committee report showed that former White House Deputy Chief of Staff Jim Messina used his personal email account to conduct official business involving a deal between the pharmaceutical industry and the Senate Finance Committee. And in 2010, the Washington Post reported that administration officials reprimanded then White House Deputy Chief Technology Officer Andrew McLaughlin, a former Google official, after document requests revealed technology policy-related emails from Google employees in his personal Gmail account.

The purpose of assembling Twitter sign-on information might be to disseminate disruptive messages, Carr postulated. This spring, a hacked Associated Press Twitter account informed the public that explosions at the White House had harmed the president. The Dow tumbled in response. 

Sources familiar with the Gmail hack say the ploy is unique in the White House. In the past, one or two staffers who protected their Gmail accounts with two-step authentication have received unsolicited text messages containing login codes, indicating that someone had entered the correct password and triggered the authentication step.



Air Force Asks Students to Solve Real-World Problems

NY Times


Published: July 28, 2013


The Air Force, as part of its recruitment efforts, is approaching young people for help in solving real-world technological problems using a collaborative online platform.

The initiative, which will be introduced on Thursday, will create a digital program called the Air Force Collaboratory, in which young people will be challenged to develop technologies for search-and-rescue operations in collapsed structures; to create software code for a quadrotor, a type of unmanned aerial vehicle; and to determine where to place the newest GPS satellite.

The Air Force hopes the program will attract students in so-called STEM subjects — science, technology, engineering and mathematics — to work with its airmen on developing solutions for the three challenges, and, ideally, consider enlisting.

The initiative — which the Air Force will promote through digital advertising, social media and partnerships with groups like Discovery Education — is the latest recruiting effort created for the Air Force by GSD&M, an agency based in Austin, Tex., that is part of the Omnicom Group.

GSD&M has been the Air Force’s agency since 2001, developing campaigns to help it attract the over 28,000 recruits it needs annually; the agency said its work had helped the Air Force meet its recruiting goals each year.

GSD&M’s recruiting strategy for the Air Force — which has always sought tech-savvy candidates — previously featured an “Airman Challenge” online video game. A separate campaign included television spots whose theme was, “It’s not science fiction.”

Col. Marcus Johnson, chief of the strategic marketing division of the Air Force Recruiting Service, said the Air Force focused on “going after the best and brightest young men and women, with an emphasis on the STEM subjects. Whether they’re in high school or college, those topics translate into what we do in the Air Force.”

He said the collaboratory program was meant to appeal to men and women ages 16 to 24, including high school students still determining their future plans.

Ryan Carroll, a creative director at GSD&M, said the Air Force was “very much like the Apples and Googles of the world in recognizing the huge need for scientists and engineers. They reach out to kids at an early age and show them the amazing things they can do with science and technology.” He pointed to initiatives like the Google Science Fair, an online, annual, global science competition for teenagers, as an example.

Similarly, the collaboratory program aims to “inspire the next generation of scientists, engineers, technologists and mathematicians, and to show them all the amazing, science-related things the Air Force does,” Mr. Carroll said. The program will also allow students to “participate and solve real problems the Air Force solves every day,” he added.

Young people will be able to learn more about the initiative’s challenges at the Web site, which will act as a forum. Challenge participants will be able to use custom-built tools to share ideas and work with airmen and other experts to develop solutions.

Not surprisingly, digital media will primarily be used to promote the program. Custom editorial content is being developed for the STEM hub of, a global community of “pragmatic idealists,” while custom videos are being filmed for DNews, an online video series from Discovery Communications; the videos will feature the DNews hosts Trace Dominguez and Anthony Carboni. The technology network Technorati is asking bloggers to create custom posts on the collaboratory and related subjects, while the Air Force will pay to place videos on Web sites like YouTube, Blip and Machinima. In addition, the Air Force will promote the initiative on Facebook and Twitter.

Digital banner advertising will run on the Web sites of Scientific American, Popular Science and The Verge. One set of ads depicts an Air Force helicopter approaching a scene of destruction after a 7.0-magnitude earthquake that has trapped dozens of survivors. The copy reads, “Your idea could save them. The Air Force Collaboratory. Search and rescue 2.0 is now open. Start collaborating.”

The Air Force also is working with Discovery Education, a division of Discovery Communications, on an outreach program for high school science and math teachers.

Colonel Johnson said that although the collaboratory would run through November, new challenges could be created after that. In addition, he said the Web site would carry no overt recruiting messages, nor would the Air Force actively recruit challenge participants, since the initiative was meant to raise interest in the Air Force and possibly encourage participants to seek out more information about opportunities there.

The budget for the campaign is $3.7 million.

Diane H. Mazur, a former Air Force officer, professor emeritus of law at the University of Florida and author of “A More Perfect Military: How the Constitution Can Make Our Military Stronger,” said that although the collaboratory concept was “good, it’s not sophisticated to the degree it needs to be to attract the people they think they want to get.” She added, “This is a good direction if you do it well.”

David R. Segal, a professor of sociology at the University of Maryland who specializes in military sociology, said that while recruiting high school students to “work in military laboratories on military problems” was not new, “what seems new is having interns work online with Air Force scientists.”

“I think they will certainly recruit a good number of high school students interested in science, engineering, technology and math to work on the problems identified. That part is easy,” he said. “Recruiting the same people then to come into the Air Force as enlisted men and women might be more difficult. They are likely to want to go to college.”

As a result, he said, the collaboratory would probably be more successful recruiting Air Force Reserve Officers Training Corps students than airmen.


Small business contracting numbers inflated by errors and exclusions, data show

Washington Post

By J.D. Harrison, Published: July 28 | Updated: Monday, July 29, 5:00 AM


The federal government is required by law to try to direct nearly a quarter of all contracting dollars to small businesses, and every year since 2005, officials have reported missing the goal by the slimmest of margins.

Then again, it depends on who is counting.

A number of contractors and advocacy groups say the government has repeatedly inflated the share of contracting dollars awarded annually to small firms, masking serious problems in the procurement process that prevent small businesses from securing more government work.

In 1958, when Congress created the Small Business Administration, it tasked the agency with establishing an annual small-business contracting goal of “not less than 23 percent of the total value of all prime contract awards.”

This month, for the seventh year in a row, SBA officials reported that the government narrowly missed the goal: small firms received 22.25 percent (or $89.9 billion) of contracting dollars in fiscal 2012, up from 21.65 percent the year before but down from 22.7 percent in 2010.

In a blog post announcing the report, John Shoraka, the agency’s associate administrator for government contracting, called the achievement “real progress” toward the goal.

However, the SBA’s calculations come with several caveats, in large part because the agency excludes certain contracts and entire agencies from its measure.

Officials do not take into consideration, for instance, any contract work for the Federal Aviation Administration, the Transportation Security Administration or the Central Intelligence Agency, nor do they account for any contracts for goods sold overseas or any work performed outside the United States.

In all, SBA officials have determined that about one-fifth of all federal contract spending is not “small-business eligible,” and so it excludes that portion from its calculations.

The portion includes spending by agencies that are not subject to certain federal acquisition regulations, and those that do not report into the Federal Procurement Data System, from which the SBA pulls its data, Shoraka said. Other exclusions have been made on the basis that those contracts do not lend themselves to competition by small firms.

Shoraka noted that the current list of exclusions was finalized during the second term of former president George W. Bush. The Obama administration elected to leave them in place in order to “compare apples to apples,” he said.

Critics argue that is not what Congress mandated.

“They are simply not following the letter of the law,” said Charles Tiefer, a professor of government contracting at the University of Baltimore Law School. “It states 23 percent of all contracts, and there is no reason to think Congress wanted some of these exclusions.”

Contracts out of reach for small businesses, he said, should be considered as part of the 77 percent of government spending available to large and international companies, rather than removed from the equation altogether. He pointed out that some of the excluded contracts, including intelligence gathering and work overseas, are areas in which government spending has surged in recent years.

The SBA’s Office of Inspector General has also urged the agency to discontinue some of its exclusions, particularly for contracts performed overseas. In an advisory memorandum from December 2011, the office cited a 2008 legal opinion issued by the SBA Office of General Counsel, which states it would be “a reasonable interpretation” of the law to assume the targets include contracts performed outside the country.

Congress has recently taken issue with the exclusions, too. In the 2013 National Defense Authorization Act, lawmakers ordered the SBA administrator to review the goal guidelines to ensure that the process “does not exclude categories of contracts” based on the types of goods or services solicited or, in some cases, whether the agency is subject to federal acquisition regulations.


The nuances of the government’s measurements can sometimes get lost in public discussions about contracting.

In Shoraka’s blog post announcing the government’s performance, he wrote that 22.25 percent represented the small business share of “all” federal contracting dollars last year.

After On Small Business asked about the language, given the exclusions to the calculations, officials updated the blog to read 22.25 percent “of all small business eligible contracts.”

Two studies show small-biz getting 19% of contracts

To get a sense of what effect the exclusions have on the numbers reported, On Small Business asked Fedmine, the data analysis firm that conducts the contracting calculations for the SBA, to crunch the numbers based on total federal contract spending reported into the FPDS, without any exclusions.

The revised calculations show that small businesses received less than 19 percent of all prime contracting dollars in 2012. In contrast to the progress cited by the SBA, that was actually down from the year before (20 percent).

The House Small Business Committee conducted its own analysis of last year’s federal data, eliminating many of the SBA’s exclusions. The committee also found that the small-business share of total federal contracting was around 19 percent.
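The rough arithmetic behind the gap can be checked from the figures in the article: $89.9 billion is 22.25 percent of the SBA's "eligible" base, and the agency deems about one-fifth of total spending ineligible. The "one-fifth" figure is approximate, so this back-of-the-envelope sketch lands a little under the roughly 19 percent the independent analyses found, but it shows how sensitive the reported share is to the exclusions.

```python
# Figures reported in the article.
small_biz_dollars = 89.9e9   # awarded to small firms in FY2012
sba_share = 0.2225           # SBA's reported share of "eligible" spending
excluded_fraction = 0.20     # "about one-fifth" of spending excluded (approximate)

eligible_base = small_biz_dollars / sba_share        # implied eligible base, ~$404B
total_base = eligible_base / (1 - excluded_fraction) # add back the excluded fifth
share_of_total = small_biz_dollars / total_base

print(f"{share_of_total:.1%}")  # prints "17.8%"
```

Note that the result is simply 22.25 percent scaled by the 80 percent of spending that counts, so any error in the "one-fifth" estimate moves the answer proportionally.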

“The administration shouldn’t be allowed to cook the books,” Committee Chairman Sam Graves (R-Mo.) said in a statement earlier this month.

Small-business advocates say the agency’s exclusions are not the only source of padding in the small-business contracting numbers. Of greater concern, they say, is the number of contracts labeled by the government as “small business” awards that actually go to large companies — a problem that has long plagued the federal government.

During his run for the presidency in 2008, then-Sen. Barack Obama emphasized small business, at one point saying that “it is time to end the diversion of federal small-business contracts to corporate giants.”

Nearly five years later, in her most recent management report, SBA Inspector General Peggy Gustafson said the agency’s top challenge is still that “procurement flaws allow large firms to obtain small-business awards and agencies to count contracts performed by large firms towards their small-business goals.”

Inspector general “audits and other governmental studies have shown widespread misreporting by procuring agencies since many contract awards that were reported as having gone to small firms have actually been performed by larger companies,” Gustafson wrote. “Most of the incorrect reporting results from errors made by government contracting personnel, including misapplication of small-business contracting rules.”

Shoraka says the agency has taken steps to stop the errant reporting. A few years ago, for example, officials began running a computerized “anomaly” process to identify red flags in the federal data system.

The program searches for conflicting reports, missing fields of information, and the names of Fortune 100 firms that were awarded small-business contracts. When potential errors are flagged, contracting agencies are asked to take a second look at their reports and fix any mistakes.
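The anomaly pass described above can be pictured as a simple filter over contract records. The sketch below is illustrative only; the record fields, the sample watch list, and the function name are hypothetical, not the SBA's actual schema or process.

```python
# Sample watch-list entries; the real check compares against the full Fortune 100.
LARGE_FIRM_WATCHLIST = {"general electric", "apple", "citigroup"}

def flag_anomalies(records):
    """Return awards recorded as small-business contracts whose vendor
    appears on the large-firm watch list."""
    return [
        r for r in records
        if r["small_business"] and r["vendor"].lower() in LARGE_FIRM_WATCHLIST
    ]

awards = [
    {"vendor": "General Electric", "small_business": True, "amount": 1_200_000},
    {"vendor": "Acme Tooling LLC", "small_business": True, "amount": 85_000},
]
flagged = flag_anomalies(awards)  # flags only the General Electric record
```

In the process the article describes, a flagged record does not prove wrongdoing; it prompts the contracting agency to recheck and correct its report.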

Some say the process does not appear to be working.

The American Small Business League, an advocacy group, combed through the Fedmine data following the SBA’s report earlier this month. In its analysis, the group found that more than half (57) of Fortune 100 companies or their subsidiaries won awards that were labeled in the federal data system as small-business contracts, including industry giants such as General Electric, Apple and Citigroup (the latter two declined to comment).

In one instance, General Dynamics, a defense contractor with roughly 80,000 employees based in Falls Church, received more than $230 million in small-business contracts in 2012 and roughly $2 billion in the five years prior, according to data from Fedmine.

In total, the largest 100 corporations in the country received nearly half a billion dollars in small-business contracts last year, according to ASBL.

General Dynamics spokesman Rob Doolittle directed attention to the SBA OIG report, which suggested most size-classification errors are the result of mistakes by federal contracting officials. He also noted that small firms acquired by a large company during the life of a contract are permitted to keep those contracts. However, the business is not required to ensure that the contract categorization is updated in the federal data system.

Sebastien Duchamp, a spokesman for General Electric, said the federal database sometimes erroneously shows the company as a small business, adding that the firm regularly reviews the data for errors and alerts contracting officers when necessary.

While Inspector General Gustafson suggests most of the problem stems from those types of reporting errors, some of it boils down to fraud.

In March, the chief executive of Arlington-based security contractor PSI pleaded guilty to major government fraud for operating a shell company, SAC, that funneled more than $31 million in small-business set-asides to his much larger company. Keith Hedman, the executive, was sentenced to six years in prison, and last month, the employee he tapped to run the front company was sentenced to four years.

One of the small-business set-aside contracts Hedman’s shell company won was for security services at Walter Reed Medical Center, edging out a competing bid by Davis-Paige Management Systems, a small service-disabled veteran-owned business in Annandale. Micheal Davis, the company’s chief executive, said his company spent around $100,000 chasing the contract and stood to gain around $11 million in revenue by winning.

When his firm lost to SAC, Davis said he was forced to let several employees go and move several others to part time.

“It takes work away from companies like ours that took the time to get the proper certifications to compete for these contracts,” Davis said in an interview, adding that the SAC case makes him wonder how many other “small” contractors are actually small.

An SBA OIG report to Congress last year documented several other examples of large firms that have been prosecuted for masquerading as small businesses to win contracts.

“It isn’t miscoding, it isn’t computer errors, it isn’t anomalies,” American Small Business League President Lloyd Chapman said. “These numbers are being inflated and misrepresented.”


Rules aren’t being followed

The elevated small-business contracting numbers help conceal a number of systemic problems in the federal procurement process, according to a former head of small-business contracting at the Defense Department.

“The real problems meeting these goals are tactical ones, down at the operations level, where contracts are being written and awarded,” said Daniel Gill, who headed the agency’s Office of Small Business Development under President Bill Clinton.


Gill, who now consults with government services firms and recently taught courses at the Defense Department’s acquisition training school for contracting officers, argued that the government does not need new regulations to meet its small-business contracting mandate. Instead, he said the goal would be “a piece of cake” if contracting officials in each agency simply adhered to existing procurement protocols.

“A lot of contracts are going to large business that should be going to small businesses, and it’s not just a matter of large businesses miscategorizing themselves,” Gill said. “It’s often that the proper set-aside determinations are not being made to reserve small contracts for small businesses.”

The most common example, he said, concerns long-standing regulations that require agencies to reserve all contracts worth between $3,000 and $150,000 for certified small businesses, unless the agency cannot identify two small businesses that can provide the product or service at a fair market price.
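The set-aside rule described above can be sketched as a simple decision function. The dollar thresholds come from the article; the function itself is a simplified illustration of the determination contracting officers are supposed to make, not official policy.

```python
def must_set_aside(contract_value: float, capable_small_firms: int) -> bool:
    """A contract worth between $3,000 and $150,000 is reserved for small
    businesses when at least two small firms can provide the product or
    service at a fair market price."""
    in_range = 3_000 <= contract_value <= 150_000
    return in_range and capable_small_firms >= 2

must_set_aside(85_000, capable_small_firms=3)   # True: in range, two-plus firms
must_set_aside(85_000, capable_small_firms=1)   # False: too few small firms
must_set_aside(500_000, capable_small_firms=5)  # False: above the range
```

The failures the GAO documented correspond to the first input: contracts in range with capable small firms available that were nonetheless opened to general solicitation.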

In the past few years, the Government Accountability Office has identified numerous instances in which federal officials either never did the market research to determine if small businesses were available to meet a contract’s requirements, or conducted the research, but failed to set aside those contracts for small businesses.

Phoenix Environmental Design, a small service-disabled veteran-owned firm in Plankinton, S.D., that provides pesticides and herbicides to the federal government, has filed more than 30 protests to the GAO in the past two years against agencies for faulty contracting practices. About half of them concerned contracts intended for small businesses that went to large corporations.

In every instance, the government has pulled back the award and solicited bids from small firms, according to the company’s owner, Chad Gill. What’s more, government documents show that the small firms that won the contracts the second time around routinely did so with a lower bid than the initial award to the large company.

“When we got them to do it right, and there is competition and accountability, it ends up costing the federal agency less money,” Gill said.

The problem, he said, is that many of the contracting officers he works with in various agencies do not understand the small-business set-aside process. He is not the only one who has made that observation.

In one of the GAO’s rulings last fall, General Counsel Lynn H. Gibson noted that the Veterans Administration, for instance, has repeatedly failed to set aside contracts reserved for small firms, later suggesting that contracting officers have demonstrated “a fundamental misunderstanding of the agency’s obligations” under contracting laws.

Charles Baker, who owns an electrical company that services the Defense Department, said his firm has suffered from similar contract classification errors. In many cases, he said, contracts that fall into the $3,000-to-$150,000 range are offered for general solicitation rather than reserved for small firms.

“The system is fundamentally broken, and it can destroy a small company like mine,” Baker, who owns MCB Lighting & Electric in Owings, Md., said. “There is no compliance with the laws, no enforcement.”

Maureen Schumann, a spokeswoman for the Defense Department, said Baker’s comment “clearly illustrates some of the frustrations felt by our industrial base” and that the agency is taking measures to “ensure that the right policies, procedures and programs are in place to increase contracting opportunities for small businesses.”

“We are constantly analyzing data and the [department] is using every available regulation to identify specific contracts that can be set aside for small businesses,” Schumann said.


She noted that the department contracted with small businesses on 68 percent of contracts in the $3,000-to-$150,000 range last year, an increase from 2011.


Funding schemes in Congress could ground drones; FAA pressured over privacy

Washington Times

Ben Wolfgang

July 28, 2013

The lagging federal effort to fully integrate drones into U.S. airspace is in danger of falling even further behind schedule.

A funding bill now before the Senate essentially would stop the process in its tracks by prohibiting the Federal Aviation Administration from moving forward until it completes a detailed report on drones’ potential privacy impact.

The report, called for in the Senate’s fiscal 2014 transportation appropriations measure, would be yet another hurdle in the FAA’s already complex, time-consuming drone integration initiative.

The agency has been charged by Congress to write rules and regulations allowing drones — now used primarily by the military, law enforcement and researchers — to operate commercially in U.S. skies by September 2015, but the industry fears that deadline is likely to be missed.

Requiring the FAA, which traditionally deals only with airspace safety and has little experience in writing Fourth Amendment protections, to craft a comprehensive privacy report would all but guarantee the date will be pushed back.

Leaders in the unmanned aerial systems sector warn that such setbacks will hamper American technological innovation and carry economic consequences.

“Privacy is an important issue, and one that deserves to be considered carefully. But further restrictions on FAA integration will only set back important progress,” said Ben Gielow, government relations manager with the Association for Unmanned Vehicle Systems International (AUVSI), the drone sector’s leading trade group.

“If we are not able to keep the integration on track, the U.S. could lose out on creating tens of thousands of jobs and undermine the growth of a new industry at a time when we need it most,” he said.

The Senate bill doesn’t explicitly call for the FAA to stop drone integration efforts, but it would establish a de facto moratorium by cutting off funding for the process.

A section of the legislation, put forward by Sen. Patty Murray, Washington Democrat, states that “none of the funds in this act may be used to issue regulations on the integration of unmanned aerial systems into the national airspace” until the privacy report is completed and presented to the House and Senate appropriations committees.

The Senate Appropriations Committee directed questions on the bill to Ms. Murray, who is chairwoman of the panel’s subcommittee on transportation. Her office did not return emails or calls seeking comment.

The House’s transportation funding bill does not include such language, and the Senate provision could be changed or dropped entirely in the coming months.

For now, however, the bill underscores the deep fear in Congress and among the American public that widespread drone use will be a serious blow to personal privacy.

Sen. Dianne Feinstein, California Democrat, said last month that she considers drones to be “the greatest threat to the privacy of Americans.”

Coming from Ms. Feinstein, chairwoman of the Senate Select Committee on Intelligence, those words carry extra weight. She is intimately familiar with classified details of the National Security Agency’s data-collection programs and other efforts that, critics say, erode Americans’ Fourth Amendment rights.

Many other members of Congress, civil liberties groups, privacy advocates and others have said drones — increasingly small, undetectable and able to be equipped with state-of-the-art cameras and other monitoring equipment — pose real privacy threats.

AUVSI and other drone industry leaders agree that the issue must be addressed, and it’s already being tackled across the nation.

More than 30 states and a growing number of local governments have drafted regulations to govern what drones can do and what types of data they can collect.

At the federal level, however, many analysts question why Congress is placing the job in the lap of the FAA.

“The FAA should focus on ensuring the safety of our skies. Safety has always been the FAA’s mission, and we believe the agency should stick to what it does best,” Mr. Gielow said.

The FAA’s experience in writing drone privacy regulations has been limited. The agency has drafted privacy guidelines to be used at drone “test sites,” congressionally mandated locations where the craft will be put through a battery of tests in preparation for airspace integration by 2015.

While widespread, private-sector drone use still is years away, the FAA has begun to make exceptions.

On Friday, the agency issued two “restricted category type certificates” to a pair of unmanned aerial systems, the first step toward allowing them to operate in U.S. airspace later this summer.

A “major energy company,” the FAA said, will be allowed to use a drone to survey ocean ice floes and migrating whales in Arctic oil exploration areas off the coast of Alaska. Unmanned systems also will be permitted to support emergency response crews for oil spill monitoring and wildlife surveillance over the Beaufort Sea.



Two Drone Airframes Approved for Commercial Flights

By Bob Brewin

July 29, 2013

The Federal Aviation Administration has certified two small drones for commercial use, heralding the move as “one giant leap for unmanned-kind” that will lead to the first approved commercial flight of an unmanned air system off the Alaska coast later this summer.

The 2012 FAA Modernization and Reform Act defined Arctic operational areas and included mandates for Arctic UAS commercial operations. Until now, obtaining an experimental airworthiness certificate – which specifically excludes commercial operations – was the only way the private sector could operate UAS in the nation’s airspace, the FAA said.

In an announcement Friday, the agency said it type-certified the Scan Eagle X200, manufactured by Boeing subsidiary Insitu, and the AeroVironment PUMA under a restricted category that allows aerial surveillance. The catapult-launched Scan Eagle has a maximum weight of 44 pounds, a wingspan of just more than 10 feet and a length of 4.5 feet; it can stay in the air for 28 hours without refueling. The AeroVironment PUMA, a hand-launched drone that weighs 13 pounds, has a wingspan of just over nine feet and a fuselage of four feet.

The agency said previous military acceptance of the Scan Eagle and PUMA UAS designs allowed it to issue the restricted category type certificates. The Navy operates the Scan Eagle and the Air Force, Army, Marines and the Special Operations Command fly the PUMA.

A major energy company plans to fly the Scan Eagle off the Alaskan coast in international waters starting in August, the FAA said, without identifying the company. Plans for the initial ship-launched flights include surveys of ocean ice floes and migrating whales in Arctic oil exploration areas.

The PUMA is expected to support emergency response crews for oil spill monitoring and wildlife surveillance over the Beaufort Sea, but the agency again did not identify the operator.

The certification of the Scan Eagle and the PUMA could be the start of unbridled use of drones in the United States, according to Rep. Ted Poe, R-Texas, who in February predicted that by 2030, “30,000 drones will be cruising American skies – looking, observing, filming, and hovering over America.”

On Feb. 4, the FAA kicked off a process to set up six test sites to conduct drone research and development toward a goal of widespread use of UAS by law enforcement agencies, universities and other organizations in domestic airspace by 2015.


Government Conferences Pay for Themselves, Industry Says

By Charles S. Clark

July 29, 2013

Agency spending on travel to conferences is “vital to making government more efficient and effective,” says a study released this month by the U.S. Travel Association. It argues that current Obama administration guidelines and legislation to curb conference spending are counterproductive.

Government travel for meetings and events had a total economic impact of $24.4 billion in 2011, supported 343,800 U.S. jobs and $14.5 billion in U.S. wages, and contributed $5.5 billion in tax revenue, according to the data compiled by Rockport Analytics LLC.

The report put a number on the economic effect of canceling the 2013 Military Health System Conference, an annual training event for several thousand military medical personnel. Replacement expenses and lost revenue, it said, cost the government more than $800,000.

As a result of NASA’s decision to pull out of the April 2013 National Space Symposium, a gathering for representatives of 30 nations, “important international partnerships are jeopardized, important international programs are placed at risk, and the U.S. government places serious strain on relationships with countries around the world,” according to Elliot Pulham, CEO of the private National Space Foundation.


The study also found that government meetings are more efficient than private-sector meetings in terms of expenses, and that private-sector conferences are more productive when government employees participate. A survey found that 89 percent of government supervisors believe that government meetings and events benefit citizens, and 85 percent of government respondents agreed that meetings and events added value to employee development and training.

“We hope these new findings will encourage congressional leaders to reevaluate proposals to drastically cut government travel budgets across the country,” said Roger Dow, president and CEO of the travel association. “When conducted responsibly, federal workers who travel for conferences and meetings deliver important services and real value to our nation.”



Drones Approved for Commercial Flights

By Bob Brewin

July 29, 2013

The Federal Aviation Administration has certified two small drones for commercial use, heralding the move as “one giant leap for unmanned-kind” that will lead to the first approved commercial flight of an unmanned aircraft system off the Alaska coast later this summer.

The 2012 FAA Modernization and Reform Act defined Arctic operational areas and included mandates for Arctic UAS commercial operations. Until now obtaining an experimental airworthiness certificate – which specifically excludes commercial operations – was the only way the private sector could operate UAS in the nation’s airspace, the FAA said.

In an announcement Friday, the agency said it type-certified the Scan Eagle X200, manufactured by Boeing subsidiary Insitu, and the AeroVironment PUMA under a restricted category that allows aerial surveillance. The catapult-launched Scan Eagle has a maximum weight of 44 pounds, a wingspan of just more than 10 feet and a length of 4.5 feet; it can stay in the air for 28 hours without refueling. The AeroVironment PUMA, a hand-launched drone that weighs 13 pounds, has a wingspan of just over nine feet and a fuselage of four feet.

The agency said previous military acceptance of the Scan Eagle and PUMA UAS designs allowed it to issue the restricted category type certificates. The Navy operates the Scan Eagle and the Air Force, Army, Marines and the Special Operations Command fly the PUMA.

A major energy company plans to fly the Scan Eagle off the Alaskan coast in international waters starting in August, the FAA said, without identifying the company. Plans for the initial ship-launched flights include surveys of ocean ice floes and migrating whales in Arctic oil exploration areas.

The PUMA is expected to support emergency response crews for oil spill monitoring and wildlife surveillance over the Beaufort Sea, but the agency again did not identify the operator.

The certification of the Scan Eagle and the PUMA could be the start of unbridled use of drones in the United States, according to Rep. Ted Poe, R-Texas, who in February predicted that by 2030, “30,000 drones will be cruising American skies – looking, observing, filming, and hovering over America.”

On Feb. 4 the FAA kicked off a process to set up six test sites to conduct drone research and development toward a goal of widespread use of UAS by law enforcement agencies, universities and other organizations in domestic airspace by 2015.




The Data Economy Is Much, Much Bigger Than You (and the Government) Think

The Atlantic

July 25, 2013

By Michael Mandel


It’s become conventional wisdom among pundits that the tech and data boom is generating lots of wealth, but not much in the way of jobs or economic growth. The skeptics point to lack of job gains in the “information” sector, as defined by the Bureau of Labor Statistics, and to the country’s sub-2 percent GDP growth figures.

But as the U.S. shifts to a data-driven economy, the benefits of fixed and mobile broadband are showing up in ways that are not counted by traditional statistics. For just one example, take the number of jobs generated by the development and deployment of mobile apps. According to a new calculation by the Progressive Policy Institute, employment in the App Economy now comes to 752,000 jobs, up roughly 40% over the past year. This is a conservative estimate, based on tracking online help-wanted ads.

Auto companies are hiring software developers and testers to turn their vehicles into highly connected data platforms. Drugstores are going online to let their customers know when prescriptions are ready. Hospitals are ramping up their employment of clinical data managers to help handle the shift to electronic health records. Bed and breakfasts have shifted their entire booking operations online, driven by digital ads.

More broadly, demand for tech workers in the New York City region outstrips every other metro area, including San Francisco and San Jose, according to figures from The Conference Board. That reflects demand in finance, advertising, and media.

The data-driven economy is built on several pillars: Broadband providers, mobile phone operators, and other communications companies are investing almost $100 billion annually to vastly improve their networks. Makers of smartphones, routers, sensors, wireless medical gear, and the like are upgrading and extending the capabilities of their equipment. Meanwhile new applications and uses are coming out of app developers, online game and entertainment companies, web companies like Facebook and Google, content providers, electronic health record providers, and “Internet of Everything” companies that connect the physical world with the data world. Tableau Software, a Seattle-based data visualization company that just went public, increased its full-time employees from 188 to 749 from the end of 2010 to the end of 2012.

What’s more, data is also the fastest-growing component of trade. Consider the United States and Europe: telecom providers have doubled transatlantic cable capacity over the past five years, according to figures from Telegeography. Meanwhile imports and exports of goods and services between the U.S. and Europe are barely above pre-recession peaks.

These flows of data do not show up in the monthly trade report released by the Census Bureau and the BEA. Indeed, most of the growth of data domestically is not counted in the economic statistics either. For example, fixed broadband traffic in North America rose by 39% in the first half of 2013 over a year earlier, according to Sandvine, a Canada-based network management company. This number does not show up in any official measures.


Will all this growth continue? People still remember the tech bust of the early 2000s, when the unemployment rate in Silicon Valley surged to over 9 percent. This time, though, the surge in data-related jobs is not likely to stop soon. A 2010 policy brief from the Progressive Policy Institute showed that the jobs and industries that grow during a recession are the ones that lead the expansion, and that’s exactly what is happening here.

Before the financial crisis, the housing and debt boom made the U.S. economy look better than it really was, especially because housing construction is very visible and easy to measure. By contrast, we may be in the opposite situation now. Data is intangible and difficult to count, so the benefits of the tech and data boom may be underestimated.



Is Sugar Really Toxic? Sifting through the Evidence

Scientific American

By Ferris Jabr | July 15, 2013


Our very first experience of exceptional sweetness—a dollop of buttercream frosting on a parent’s finger; a spoonful of strawberry ice cream instead of the usual puréed carrots—is a gustatory revelation that generally slips into the lacuna of early childhood. Sometimes, however, the moment of original sweetness is preserved. A YouTube video from February 2011 begins with baby Olivia staring at the camera, her face fixed in rapture and a trickle of vanilla ice cream on her cheek. When her brother Daniel brings the ice cream cone near her once more, she flaps her arms and arches her whole body to reach it.

Considering that our cells depend on sugar for energy, it makes sense that we evolved an innate love for sweetness. How much sugar we consume, however—as well as how it enters the body and where we get it from in the first place—has changed dramatically over time. Before agriculture, our ancestors presumably did not have much control over the sugars in their diet, which must have come from whatever plants and animals were available in a given place and season. Around 6,000 BC, people in New Guinea began to grow sugarcane, chewing and sucking on the stalks to drink the sweet juice within. Sugarcane cultivation spread to India, where by 500 BC people had learned to turn bowls of the tropical grass’s juice into crude crystals. From there sugar traveled with migrants and monks to China, Persia, northern Africa and eventually to Europe in the 11th century.

For more than 400 years, sugar remained a luxury in Europe—an exotic spice—until manufacturing became efficient enough to make “white gold” much more affordable. Christopher Columbus brought sugarcane to the New World in 1493 and in the 16th and 17th centuries European powers established sugarcane plantations in the West Indies and South America. Sugar consumption in England increased by 1,500 percent between the 18th and 19th centuries. By the mid 19th century, Europeans and Americans had come to regard refined sugar as a necessity. Today, we add sugar in one form or another to the majority of processed foods we eat—everything from bread, cereals, crunchy snacks and desserts to soft drinks, juices, salad dressings and sauces—and we are not too stingy about using it to sweeten many raw and whole foods as well.


By consuming so much sugar we are not just demonstrating weak willpower and indulging our sweet tooth—we are in fact poisoning ourselves, according to a group of doctors, nutritionists and biologists, one of the most prominent members of which is Robert Lustig of the University of California, San Francisco, famous for his viral YouTube video “Sugar: The Bitter Truth.” A few journalists, such as Gary Taubes and Mark Bittman, have reached similar conclusions. Sugar, they argue, poses far greater dangers than cavities and love handles; it is a toxin that harms our organs and disrupts the body’s usual hormonal cycles. Excessive consumption of sugar, they say, is one of the primary causes of the obesity epidemic and metabolic disorders like diabetes, as well as a culprit of cardiovascular disease. More than one-third of American adults and approximately 12.5 million children and adolescents in the U.S. are obese. In 1980, 5.6 million Americans were diagnosed with diabetes; in 2011 more than 20 million Americans had the illness.

The argument that sugar is a toxin depends on some technical details about the different ways the human body gets energy from different types of sugar. Today, Americans eat most of their sugar in two main forms: table sugar and high-fructose corn syrup. A molecule of table sugar, or sucrose, is a bond between one glucose molecule and one fructose molecule—two simple sugars with the same chemical formula, but slightly different atomic structures. In the 1960s, new technology allowed the U.S. corn industry to cheaply convert corn-derived glucose into fructose and produce high-fructose corn syrup, which—despite its name—is almost equal parts free-floating fructose and glucose: 55 percent fructose, 42 percent glucose and three percent other sugars. Because fructose is about twice as sweet as glucose, an inexpensive syrup mixing the two was an appealing alternative to sucrose from sugarcane and beets.

Regardless of where the sugar we eat comes from, our cells are interested in dealing with fructose and glucose, not the bulkier sucrose. Enzymes in the intestine split sucrose into fructose and glucose within seconds, so as far as the human body is concerned sucrose and high-fructose corn syrup are equivalent. The same is not true for their constituent molecules. Glucose travels through the bloodstream to all of our tissues, because every cell readily converts glucose into energy. In contrast, liver cells are one of the few types of cells that can convert fructose to energy, which puts the onus of metabolizing fructose almost entirely on one organ. The liver accomplishes this primarily by turning fructose into glucose and lactate. Eating exceptionally large amounts of fructose taxes the liver: it spends so much energy turning fructose into other molecules that it may not have much energy left for all its other functions. A consequence of this energy depletion is production of uric acid, which research has linked to gout, kidney stones and high blood pressure.

The human body strictly regulates the amount of glucose in the blood. Glucose stimulates the pancreas to secrete the hormone insulin, which helps remove excess glucose from blood, and bolsters production of the hormone leptin, which suppresses hunger. Fructose does not trigger insulin production and appears to raise levels of the hormone ghrelin, which keeps us hungry. Some researchers have suggested that large amounts of fructose encourage people to eat more than they need. In studies with animals and people by Kimber Stanhope of the University of California Davis and other researchers, excess fructose consumption has increased fat production, especially in the liver, and raised levels of circulating triglycerides, which are a risk factor for clogged arteries and cardiovascular disease. Some research has linked a fatty liver to insulin resistance—a condition in which cells become far less responsive to insulin than usual, exhausting the pancreas until it loses the ability to properly regulate blood glucose levels. Richard Johnson of the University of Colorado Denver has proposed that uric acid produced by fructose metabolism also promotes insulin resistance. In turn insulin resistance is thought to be a major contributor to obesity and Type 2 diabetes; the three disorders often occur together.

Because fructose metabolism seems to kick off a chain reaction of potentially harmful chemical changes inside the body, Lustig, Taubes and others have singled out fructose as the rotten apple of the sugar family. When they talk about sugar as a toxin, they mean fructose specifically. In the last few years, however, prominent biochemists and nutrition experts have challenged the idea that fructose is a threat to our health and have argued that replacing fructose with glucose or other sugars would solve nothing. First, as fructose expert John White points out, fructose consumption has been declining for more than a decade, but rates of obesity continued to rise during the same period. Of course, coinciding trends alone do not definitively demonstrate anything. A more compelling criticism is that concern about fructose is based primarily on studies in which rodents and people consumed huge amounts of the molecule—up to 300 grams of fructose each day, which is nearly equivalent to the total sugar in eight cans of Coke—or a diet in which the vast majority of sugars were pure fructose. The reality is that most people consume far less fructose than used in such studies and rarely eat fructose without glucose.

On average, people in America and Europe eat between 100 and 150 grams of sugar each day, about half of which is fructose. It’s difficult to find a regional diet or individual food that contains only glucose or only fructose. Virtually all plants have glucose, fructose and sucrose—not just one or another of these sugars. Although some fruits, such as apples and pears, have three times as much fructose as glucose, most of the fruits and veggies we eat are more balanced. Pineapples, blueberries, peaches, carrots, corn and cabbage, for example, all have about a 1:1 ratio of the two sugars. In his New York Times Magazine article, Taubes claims that “fructose…is what distinguishes sugar from other carbohydrate-rich foods like bread or potatoes that break down upon digestion to glucose alone.” This is not really true. Although potatoes and white bread are full of starch—long chains of glucose molecules—they also have fructose and sucrose. Similarly, Lustig has claimed that the Japanese diet promotes weight loss because it is fructose-free, but the Japanese consume plenty of sugar—about 83 grams a day on average—including fructose in fruit, sweetened beverages and the country’s many meticulously crafted confectioneries. High-fructose corn syrup was developed and patented in part by Japanese researcher Yoshiyuki Takasaki in the 1960s and ’70s.

Not only do many worrying fructose studies use unrealistic doses of the sugar unaccompanied by glucose, it also turns out that the rodents researchers have studied metabolize fructose in a very different way than people do—a far greater difference than originally anticipated. Studies that have traced fructose’s fantastic voyage through the human body suggest that the liver converts as much as 50 percent of fructose into glucose, around 30 percent of fructose into lactate and less than one percent into fats. In contrast, mice and rats turn more than 50 percent of fructose into fats, so experiments with these animals would exaggerate the significance of fructose’s proposed detriments for humans, especially clogged arteries, fatty livers and insulin resistance.

In a series of meta-analyses examining dozens of human studies, John Sievenpiper of St. Michael’s Hospital in Toronto and his colleagues found no harmful effects of typical fructose consumption on body weight, blood pressure or uric acid production. In a 2011 study, Sam Sun—a nutrition scientist at Archer Daniels Midland, a major food processing corporation—and his colleagues analyzed data about sugar consumption collected from more than 25,000 Americans between 1999 and 2006. Their analysis confirmed that people almost never eat fructose by itself and that for more than 97 percent of people fructose contributes less daily energy than other sugars. They did not find any positive associations between fructose consumption and levels of triglycerides, cholesterol or uric acid, nor any significant link to waist circumference or body mass index (BMI). And in a recent BMC Biology Q&A, renowned sugar expert Luc Tappy of the University of Lausanne writes: “Given the substantial consumption of fructose in our diet, mainly from sweetened beverages, sweet snacks, and cereal products with added sugar, and the fact that fructose is an entirely dispensable nutrient, it appears sound to limit consumption of sugar as part of any weight loss program and in individuals at high risk of developing metabolic diseases. There is no evidence, however, that fructose is the sole, or even the main factor in the development of these diseases, nor that it is deleterious to everybody.”

To properly understand fructose metabolism, we must also consider in what form we consume the sugar, as explained in a recent paper by David Ludwig, Director of the New Balance Foundation Obesity Prevention Center of Boston Children’s Hospital and a professor at Harvard. Drinking a soda or binging on ice cream floods our intestines and liver with large amounts of loose fructose. In contrast, the fructose in an apple does not reach the liver all at once. All the fiber in the fruit—such as cellulose that only our gut bacteria can break down—considerably slows digestion. Our enzymes must first tear apart the apple’s cells to reach the sugars sequestered within. “It’s not just about the fiber in food, but also its very structure,” Ludwig says. “You could add Metamucil to Coca Cola and not get any benefit.” In a small but intriguing study, 17 adults in South Africa ate primarily fruit—about 20 servings with approximately 200 grams of total fructose each day—for 24 weeks and did not gain weight, develop high blood pressure or imbalance their insulin and lipid levels.

To strengthen his argument, Ludwig turns to the glycemic index, a measure of how quickly food raises levels of glucose in the blood. Pure glucose and starchy foods such as Taubes’s example of the potato have a high glycemic index; fructose has a very low one. If fructose is uniquely responsible for obesity and diabetes and glucose is benign, then high glycemic index diets should not be associated with metabolic disorders—yet they are. A small percentage of the world population may in fact consume so much fructose that they endanger their health because of the difficulties the body encounters in converting the molecule to energy. But the available evidence to date suggests that, for most people, typical amounts of dietary fructose are not toxic.

Even if Lustig is wrong to call fructose poisonous and saddle it with all the blame for obesity and diabetes, his most fundamental directive is sound: eat less sugar. Why? Because super sugary, energy-dense foods with little nutritional value are one of the main ways we consume more calories than we need, albeit not the only way. It might be hard to swallow, but the fact is that many of our favorite desserts, snacks, cereals and especially our beloved sweet beverages inundate the body with far more sugar than it can efficiently metabolize. Milkshakes, smoothies, sodas, energy drinks and even unsweetened fruit juices all contain large amounts of free-floating sugars instantly absorbed by our digestive system.

Avoiding sugar is not a panacea, though. A healthy diet is about so much more than refusing that second sugar cube and keeping the cookies out of reach or hidden in the cupboard. What about all the excess fat in our diet, so much of which is paired with sugar and contributes to heart disease? What about bad cholesterol and salt? “If someone is gaining weight, they should look to sugars as a place to cut back,” says Sievenpiper, “but there’s a misguided belief that if we just go after sugars we will fix obesity—obesity is more complex than that. Clinically, there are some people who come in drinking way too much soda and sweet beverages, but most people are just overconsuming in general.” Then there’s all the stuff we really should eat more of: whole grains; fruits and veggies; fish; lean protein. But wait, we can’t stop there: a balanced diet is only one component of a healthy lifestyle. We need to exercise too—to get our hearts pumping, strengthen our muscles and bones and maintain flexibility. Exercising, favoring whole foods over processed ones and eating less overall sounds too obvious, too simplistic, but it is actually a far more nuanced approach to good health than vilifying a single molecule in our diet—an approach that fits the data. Americans have continued to consume more and more total calories each year—average daily intake increased by 530 calories between 1970 and 2000—while simultaneously becoming less and less physically active. Here’s the true bitter truth: Yes, most of us should make an effort to eat less sugar—but if we are really committed to staying healthy, we’ll have to do a lot more than that.



US Appeals Court Upholds Warrantless Collection of Phone Location Data

The court has ruled that cell site information is business data collected by the service provider

By John Ribeiro

Wed, July 31, 2013

IDG News Service (Bangalore Bureau) — The U.S. government does not need a warrant to access historical cell site information, an appeals court has ruled.

The Fourth Amendment to the U.S. Constitution, which protects against unreasonable searches and seizures, safeguards only reasonable expectations of privacy, the U.S. Court of Appeals for the Fifth Circuit wrote in a 2-1 ruling on Tuesday.

“Because a cell phone user makes a choice to get a phone, to select a particular service provider, and to make a call, and because he knows that the call conveys cell site information, the provider retains this information, and the provider will turn it over to the police if they have a court order, he voluntarily conveys his cell site data each time he makes a call,” the court added.

Cell site information is clearly a business record, collected by the service provider for its own business purposes and without being asked to do so by the government, the court said in the order.

The dispute hinged on whether law enforcement agents can access cell site data with a relatively easy-to-obtain order under section 2703(d) of the Stored Communications Act, which is based on a showing of “specific and articulable facts,” instead of using a search warrant after showing probable cause.

Rights groups, including the American Civil Liberties Union and the Electronic Frontier Foundation, have argued that the government should be required to seek a warrant to access the location information, because it is sensitive and can reveal a great deal about a person. The groups argued in court that the SCA grants courts the discretion to require the government to obtain a warrant based upon probable cause before accessing historical cell phone location data.

A magistrate judge earlier denied a government request for historical cell site data, ruling that compelled warrantless disclosure of the data violates the Fourth Amendment. The request came in three applications, filed in October 2010 under the SCA, seeking evidence relevant to three separate criminal investigations. The judge did, however, allow the disclosure of subscriber information.

Following an appeal by the government, a district court held that data “disclosing the location of the telephone at the time of particular calls may be acquired only by a warrant issued on probable cause,” as the records would show the date, time called, number, and location of the telephone when the call was made, which is constitutionally protected.

The Fifth Circuit court clarified that its ruling only covered section 2703(d) orders to obtain historical cell site information, and did not address, for example, orders requesting data from all phones that use a tower during a particular interval or “situations where the Government surreptitiously installs spyware on a target’s phone or otherwise hijacks the phone’s GPS, with or without the service provider’s help.”

The Supreme Court of New Jersey ruled earlier this month that cellphone users have a reasonable expectation of privacy of their cellphone location information, and police are required to get a search warrant before accessing the information. People are not promoting the release of personal information to others when making disclosures to phone companies, the court said in a unanimous ruling.



Bradley Manning-WikiLeaks case turns to sentencing

San Francisco Chronicle

By DAVID DISHNEAU, Associated Press

Updated 6:40 am, Wednesday, July 31, 2013

FORT MEADE, Md. (AP) — Acquitted of the most serious charge against him, Army Pfc. Bradley Manning still faces up to 136 years in prison for leaking government secrets to the website WikiLeaks, and his fate rests with a judge who will begin hearing arguments Wednesday in the sentencing phase of the soldier’s court-martial.

The former intelligence analyst was convicted of 20 of 22 charges for sending hundreds of thousands of government and diplomatic secrets to WikiLeaks, but he was found not guilty of aiding the enemy, which alone could have meant life in prison without parole.

“We’re not celebrating,” defense attorney David Coombs said. “Ultimately, his sentence is all that really matters.”

Military prosecutors said they would call as many as 20 witnesses for the sentencing phase. The government said as many as half of the prosecution witnesses would testify about classified matters in closed court. They include experts on counterintelligence, strategic planning and terrorism.

The judge prohibited both sides from presenting evidence during trial about any actual damage the leaks caused to national security and troops in Afghanistan and Iraq, but lawyers will be allowed to bring that up at sentencing.

The release of diplomatic cables, warzone logs and videos embarrassed the U.S. and its allies. U.S. officials warned of dire consequences in the days immediately after the first disclosures in July 2010, but a Pentagon review later suggested those fears might have been overblown.

The judge also restricted evidence about Manning’s motives. Manning testified during a pre-trial hearing that he leaked the material to expose U.S. military “bloodlust” and diplomatic deceitfulness, but did not believe his actions would harm the country. He didn’t testify during the trial, but he could take the stand during the sentencing phase.

Lisa Windsor, a retired Army colonel and former judge advocate, said the punishment phase would focus on Manning’s motive and the harm that was done by the leak.

“You’re balancing that to determine what would be an appropriate sentence. I think it’s likely that he’s going to be in jail for a very long time,” said Windsor, now in private practice in Washington.

The judge, Army Col. Denise Lind, deliberated three days before reaching her verdict in a case involving the largest leak of documents in U.S. history. The case drew worldwide attention as supporters hailed Manning as a whistleblower and the U.S. government called him an anarchist computer hacker and attention-seeking traitor.

The verdict denied the government a precedent that freedom of press advocates had warned could have broad implications for leak cases and investigative journalism about national security issues.

Whistleblower advocates and legal experts had mixed opinions on the implications for the future of leak cases in the Internet age.

The advocacy group Reporters Without Borders said the verdict was a chilling warning to whistleblowers, “against whom the Obama administration has been waging an unprecedented offensive,” and threatens the future of investigative journalism because intimidated sources might fall quiet.

However, another advocate of less government secrecy, Steven Aftergood of the Federation of American Scientists, questioned whether the implications will be so dire, given the extraordinary nature of the Manning case.

“This was a massive hemorrhage of government records, and it’s not too surprising that it elicited a strong reaction from the government,” Aftergood said.

“Most journalists are not in the business of publishing classified documents, they’re in the business of reporting the news, which is not the same thing,” he said. “This is not good news for journalism, but it’s not the end of the world, either.”

Glenn Greenwald, the journalist, commentator and former civil rights lawyer who first reported Edward Snowden’s leaks of National Security Agency surveillance programs, said Manning’s acquittal on the charge of aiding the enemy represented a “tiny sliver of justice.”

But WikiLeaks founder Julian Assange, whose website exposed Manning’s spilled U.S. secrets to the world, saw nothing to cheer in the mixed verdict.

“It is a dangerous precedent and an example of national security extremism,” he told reporters at the Ecuadorean Embassy in London, which is sheltering him. “This has never been a fair trial.”

Federal authorities are looking into whether Assange can be prosecuted. He has been holed up there to avoid extradition to Sweden on sex-crimes allegations.

The material WikiLeaks began publishing in 2010 documented complaints of abuses against Iraqi detainees, a U.S. tally of civilian deaths in Iraq, and America’s weak support for the government of Tunisia — a disclosure Manning supporters said helped trigger the Middle Eastern pro-democracy uprisings known as the Arab Spring.

To prove aiding the enemy, prosecutors had to show Manning had “actual knowledge” the material he leaked would be seen by al-Qaida and that he had “general evil intent.” They presented evidence the material fell into the hands of the terrorist group and its former leader, Osama bin Laden, but struggled to prove their assertion that Manning was an anarchist computer hacker and attention-seeking traitor.



Secretary of Defense Chuck Hagel

Statement on Strategic Choices and Management Review


Good afternoon.

Earlier today, I briefed key congressional committee leaders on the findings of DoD’s Strategic Choices and Management Review. This afternoon, I want to discuss these findings and clarify the major options and difficult choices ahead.

I directed the Strategic Choices and Management Review four months ago to help ensure the Department of Defense is prepared in the face of unprecedented budget uncertainty. Although DoD strongly supports the President’s fiscal year 2014 request and long-term budget plan for the entire federal government, the deep and abrupt spending cuts under sequestration that began on March 1st this year are the law of the land. Sequestration will continue in the absence of an agreement that replaces the Budget Control Act.

The purpose of the Strategic Choices and Management Review – which was led by Deputy Secretary Ash Carter with the full participation of General Dempsey, Admiral Winnefeld, the Service Secretaries and Service Chiefs – was to understand the impact of further budget reductions on the Department, and develop options to deal with these additional cuts. It had three specific objectives:

• Help DoD prepare for how to deal with sequestration if it continues in FY 2014;

• Inform the fiscal guidance given to the military services for their FY 2015 through 2019 budget plans;

• Anchor the upcoming Quadrennial Defense Review, which will assess our defense strategy in light of new fiscal realities and the many threats, complexities and uncertainties of this new century.


The Strategic Choices and Management Review did not produce a detailed budget blueprint. That was not the purpose of this review. It generated a menu of options, not a set of decisions, built around three potential budget scenarios:

• The President’s FY 2014 budget, which incorporates a carefully calibrated and largely back-loaded $150 billion reduction in defense spending over the next ten years;

• The Budget Control Act’s sequester-level caps, which would cut another $52 billion from defense in fiscal year 2014, with $500 billion in reductions for the DoD over the next ten years;

• An “in-between” scenario that would reduce defense spending by about $250 billion over the next ten years, but would be largely back-loaded.

It is important to remember that all these cuts are in addition to the $487 billion reduction in defense spending over the next decade required by the initial caps in the Budget Control Act of 2011 which DoD has been implementing. If sequester-level cuts persist, DoD would experience nearly a trillion dollars in defense spending reductions over the next ten years.

To help DoD balance strategic ends, ways and means under these budget scenarios, the Strategic Choices and Management Review scrutinized every aspect of DoD’s budget, including: contingency planning, business practices, force structure, pay and benefits, acquisition practices, and modernization portfolios. Everything was on the table.

As I discussed last week at the VFW Convention in Louisville, four principles helped guide this review:

• Prioritizing DoD’s missions and capabilities around our core responsibility of defending our country;

• Maximizing the military’s combat power by looking to reduce every other category of spending first;

• Preserving and strengthening military readiness, and;

• Honoring the service and sacrifice of DoD’s people.

Those principles, and a rigorous review process, resulted in packages of options that included management efficiencies and overhead reductions, compensation reforms, and changes to force structure and modernization plans.

Allow me to share with you some of the options the review identified in each area I just mentioned.



A tenet of the review was that we need to maximize savings from reducing DoD’s overhead, administrative costs, and other institutional expenses.

For several years, DoD has been paring back overhead. About $150 billion in five-year efficiency reductions were proposed by Secretary Gates, an additional $60 billion in savings were identified by Secretary Panetta, and I submitted a $34 billion savings package in our latest budget. DoD is continuing to implement these efficiency campaigns. Despite much progress, as well as good efforts and intentions, not every proposal has generated the savings we expected, or gained the support of Congress – most notably, our request for a base realignment and closure round. The review showed that DoD will have to do more in this area, even though it is getting more difficult to find these cuts and it can take years for significant savings to be realized. After considering the results of the review, I determined that it is possible and prudent to begin implementing a new package of efficiency reforms now – ones that should be pursued regardless of fiscal circumstances.

Some of these management efficiencies and overhead reductions include:

• Reducing the department’s major headquarters budgets by 20 percent, beginning with the Office of the Secretary of Defense, the Joint Staff, Service Headquarters and Secretariats, Combatant Commands, and defense agencies and field activities. Although the 20 percent cut applies to budget dollars, organizations will strive for a goal of 20 percent reductions in government civilians and military personnel billets on headquarters staffs;

• Reducing the number of direct reports to the Secretary of Defense by further consolidating functions within OSD, as well as eliminating positions, and;

• Reducing intelligence analysis and production at Combatant Command intelligence and operations centers, which will also foster closer integration and reduce duplication across the defense enterprise.

These management reforms, consolidations, personnel cuts, and spending reductions will reduce the department’s overhead and operating costs by some $10 billion over the next five years and almost $40 billion over the next decade. They will make the Department more agile and versatile.

Past efficiency campaigns have shown that implementation can be very challenging, so effective follow-through is critical if savings targets are to be realized. This is especially true of OSD reductions. I have asked Deputy Secretary Carter to identify someone from outside DoD who is deeply knowledgeable about the defense enterprise and eminently qualified to direct implementation of the OSD reductions and report to the Deputy Secretary.

In addition to the measures I’ve described, the review identified additional consolidations and mission reductions that could be required if sequester-level caps are imposed over the long-term. These measures include consolidations of regional combatant commands, defense agency mission cuts, and further IT consolidation.

These changes would be far-reaching and require further analysis and consideration. Though defense bureaucracies are often derided, the fact is that these offices perform functions needed to manage, administer, and support a military of our size, complexity and global reach.

Even over the course of a decade, the cumulative savings of the most aggressive efficiency options identified by the review are $60 billion. That is a very small fraction of what is needed under sequester-level cuts. We will have to look elsewhere for savings.



The review confirmed that no serious attempt to achieve significant savings can avoid compensation costs, which consume roughly half of the DoD budget. If left unchecked, pay and benefits will continue to eat into readiness and modernization. That could result in a far less capable force that is well-compensated but poorly trained and poorly equipped.

Any discussion of compensation should acknowledge the following:

• No one in uniform is “overpaid” for what they do for this country;

• People are DoD’s most important asset – and we must sustain compensation packages that recruit and retain the finest military in the world;

• The significant military pay and benefit increases over the last decade reflected the need to sustain a force under considerable stress – especially the Army and Marines – during the height of the Iraq and Afghanistan campaigns;

• One post-9/11 war is over, and the second – our nation’s longest war – is coming to an end;

• Overall personnel costs have risen dramatically – some 40 percent above inflation since 2001.


The Department cannot afford to sustain this growth. Reflecting these realities, the President’s Fiscal Year 2014 budget included a package of modest compensation-related reforms that have the strong support of our uniformed leadership. Congress has signaled its opposition to some of these proposals, including modest increases in TRICARE fees for working age retirees. But given our current fiscal situation, DoD has no choice but to consider compensation changes of greater magnitude for military and civilian personnel.

The review developed compensation savings options that we believe would continue to allow the military to recruit and retain the high quality personnel we need. If we were to pursue these options, we would need Congress’ partnership to implement many of them. Examples include:

• Changing military health care for retirees to increase use of private-sector insurance when available;

• Changing how the basic allowance for housing is calculated so that individuals are asked to pay a little more of their housing costs;

• Reducing the overseas cost of living adjustment;

• Continuing to limit military and civilian pay raises.

Many will object to these ideas – and I want to be clear that we are not announcing any compensation changes today. Instead, I’ve asked Chairman Dempsey to lead an effort with the Service Chiefs and Senior Enlisted Advisors to develop a package of compensation proposals that meets savings targets identified in the review – almost $50 billion over the next decade – and still enable us to recruit and retain a high-quality force. We would begin implementing this package in the FY 2015 budget. Senior OSD staff will lead a similar review for civilian pay and benefits.

The review also identified more sweeping changes to meet sequester-level targets – such as eliminating civilian pensions for retired military personnel serving in civilian government service, ending subsidies for defense commissaries, and restricting the availability of unemployment benefits. This package would yield savings of almost $100 billion over the next decade, but would have a significant impact on our service members and our workforce. But a sequester-level scenario would compel us to consider these changes because there would be no realistic alternative that did not pose unacceptable risk to national security.



The efficiencies and compensation reforms identified in the review – even the most aggressive changes – still leave DoD some $350 to $400 billion short of the $500 billion in cuts required by sequestration over the next ten years. The review had to take a hard look at changes to our force structure and modernization plans.

The President’s Defense Strategic Guidance anchored this effort. The goal was to find savings that best preserved the tenets of the President’s strategy, such as strategic deterrence, homeland defense, and the rebalance to the Asia-Pacific. The review concluded we should not take reductions proportionally across the military services. Instead, the options we examined were informed by strategy, and they will guide the services as they build two sets of budgets for FY 2015 through 2019 – one at the President’s budget level and one at sequester-level caps.

While we want to preserve flexibility for each military service to develop the best force possible given reduced resources, the review identified areas where we have excess capacity to meet current and anticipated future needs. In particular, the analysis concluded that we can strategically reduce the size of our ground and tactical air forces – even beyond the current draw down.

I have not made any program or force structure decisions, and more analysis will be required before these decisions are made. But with the end of the war in Iraq, the drawdown in Afghanistan, and a changing requirement to conduct protracted, large-scale counterinsurgency operations, it makes sense to take another look at the Army’s force structure – which is currently planned to reach 490,000 in the active component and 555,000 in the reserves.

One option the review examined found that we could still execute the priority missions determined by our defense strategy while reducing Army end-strength to between 420,000 and 450,000 in the active component and between 490,000 and 530,000 in the Army reserves. Similarly, the Air Force could reduce tactical aircraft squadrons – potentially as many as five – and cut the size of the C-130 fleet with minimal risk.

In the months ahead I will work closely with Chairman Dempsey and each of the Service Chiefs to reach agreement on the proper size of our armed forces, taking into account real-world needs and requirements.

A modest reduction in force structure, when combined with management efficiencies and compensation reforms, would enable us to meet the $150 billion in savings required by the President’s budget proposal while still defending the country and fulfilling our global responsibilities. We can sustain our current defense strategy under the President’s budget request.

Significant reductions beyond the President’s plan would require many more dramatic cuts to force structure. The review showed that the “in-between” budget scenario we evaluated would “bend” our defense strategy in important ways, and sequester-level cuts would “break” some parts of the strategy no matter how the cuts were made. Under sequester-level cuts, our military options and flexibility will be severely constrained.

Given that reality, the review examined two strategic approaches to reducing force structure and modernization that will inform planning for sequester-level cuts. The basic trade-off is between capacity – measured in the number of Army brigades, Navy ships, Air Force squadrons and Marine battalions – and capability – our ability to modernize weapons systems to maintain our military’s technological edge.

In the first approach, we would trade away size for high-end capability. This would further shrink the active Army to between 380,000 and 450,000 troops, reduce the number of carrier strike groups from 11 to eight or nine, draw down the Marine Corps from 182,000 to between 150,000 and 175,000, and retire older Air Force bombers. We would protect investments to counter anti-access and area-denial threats, such as the long range strike family of systems, submarine cruise-missile upgrades, and the Joint Strike Fighter. And we would continue to make cyber capabilities and special operations forces a high priority.

This strategic choice would result in a force that would be technologically dominant, but would be much smaller and able to go fewer places and do fewer things, especially if crises occurred at the same time in different regions of the world.

The second approach would trade away high-end capability for size. We would look to sustain our capacity for regional power projection and presence by making more limited cuts to ground forces, ships and aircraft. But we would cancel or curtail many modernization programs, slow the growth of cyber enhancements, and reduce special operations forces.

Cuts on this scale would, in effect, be a decade-long modernization holiday. The military could find its equipment and weapons systems – many of which are already near the end of their service lives – less effective against more technologically advanced adversaries. We also have to consider how massive cuts to procurement, and research and development funding would impact the viability of America’s private sector industrial base.

These two approaches illustrate the difficult trade-offs and strategic choices that would face the department in a scenario where sequester-level cuts continue. Going forward, in the months ahead, DoD – and ultimately the President – will decide on a strategic course that best preserves our ability to defend our national security interests under this very daunting budget scenario.

The balance we strike between capability, capacity and readiness will determine the composition and size of the force for years to come. We could, in the end, make decisions that result in a very different force from the options I’ve described today. Our goal is to be able to give the President informed recommendations, not to pre-judge outcomes. Regardless, the decision-making process will benefit from the insights this review provided.

In closing, one of the most striking conclusions of the Strategic Choices and Management Review is that if DoD combines all the reduction options I’ve described – including significant cuts to the military’s size and capability – the savings fall well short of meeting sequester-level cuts, particularly during the first five years of these steep, decade-long reductions.

The reality is that cuts to overhead, compensation, and forces generate savings slowly. With dramatic reductions in each area, we do reach sequester-level savings – but only towards the end of a 10-year timeframe. Every scenario the review examined showed shortfalls in the early years of $30-35 billion.

These shortfalls will be even larger if Congress is unwilling to enact changes to compensation or adopt other management reforms and infrastructure cuts we proposed in our fiscal year 2014 budget. Opposition to these proposals must be engaged and overcome, or we will be forced to take even more draconian steps in the future.

A lot has been said about the impact of sequestration. Before this review, like many Americans, I wondered why a 10 percent budget cut was in fact so destructive. Families and businesses trim their costs by similar proportions. But this analysis showed in the starkest terms how a 10 percent defense spending reduction causes in reality a much higher reduction in military readiness and capability. Unlike the private sector, the federal government – and the Defense Department in particular – simply does not have the option of quickly shutting down excess facilities, eliminating entire organizations and operations, or shedding massive numbers of employees – at least not in a responsible, moral and legal way.

The fact is that half of our budget – including areas like compensation where we need to achieve savings – is essentially off limits for quick reductions. Given that reality, the only way to implement an additional, abrupt 10 percent reduction in the defense budget is to make senseless, non-strategic cuts that damage military readiness, disrupt operations, and erode our technological edge. We have already seen some of the significant effects of the $37 billion reduction occurring in this fiscal year – including halting all flying for some Air Force squadrons, cancelling ship deployments, ending Army Combat Training Center rotations for brigades not deploying to Afghanistan, and imposing furloughs for 650,000 DoD civilians.

In Fiscal Year 2014, this damage will continue if sequestration persists. DoD is now developing a contingency plan to accommodate the $52 billion sequester-level reduction in fiscal year 2014, which I outlined in a letter this month to Senate Armed Services Committee Chairman Levin and Ranking Member Inhofe. Congress will need to help us manage these deep and abrupt reductions responsibly and efficiently.

The bold management reforms, compensation changes and force structure reductions identified by the Strategic Choices and Management Review can help reduce the damage that would be caused by the persistence of sequestration in Fiscal Year 2014, but they won’t come close to avoiding it altogether.

The review demonstrated that making cuts strategically is only possible if they are “backloaded.” While no agency welcomes additional budget cuts, a scenario where we have additional time to implement reductions – such as in the President’s budget – would be far preferable to the deep cuts of sequestration. If these abrupt cuts remain, we risk fielding a force that over the next few years is unprepared due to a lack of training, maintenance, and the latest equipment.

As I mentioned last week at the VFW Convention, a top priority in future year budget plans is to build a ready force, even if that requires further reductions in force structure. No matter the size of our budget, we have a responsibility to defend the country and America’s vital interests around the world. That means crafting the strongest military possible under whatever level of resources we are provided.

DoD has a responsibility to give America’s elected leaders, and the American people, a clear-eyed assessment of what our military can and cannot do in the event of a major confrontation or crisis after several years of sequester-level cuts. In the months ahead, we will continue to provide our most honest and best assessment. And the inescapable conclusion is that letting sequester-level cuts persist would be a huge strategic miscalculation that would not be in our country’s best interests. While I’ve focused today on the impact to DoD, sequester-level cuts would equally harm other missions across government that support a strong economy and a strong national defense by providing support to our service members, veterans, and their families. DoD depends on a strong education system to maintain a pool of qualified recruits, we rely on domestic infrastructure that surrounds our bases and installations, and we count on scientific breakthroughs funded by research and development grants and a strong manufacturing base to maintain our decisive technological edge. All of these areas are threatened by sequestration.

It is the responsibility of our nation’s leadership to work together to replace the mindless and irresponsible policy of sequestration. It is unworthy of the service and sacrifice of our nation’s men and women in uniform and their families. And even as we confront tough fiscal realities, our decisions must always be worthy of the sacrifices we ask America’s sons and daughters to make for our country.


DOD strategy review paints bleak outlook

By Amber Corrin

Jul 31, 2013


In a July 31 press briefing discussing findings of a recent comprehensive strategy review, Defense Secretary Chuck Hagel underscored the tough choices Pentagon decision-makers face amid cuts that could reach nearly $1 trillion.

Hagel, accompanied by Joint Chiefs Vice Chairman Adm. James Winnefeld, said that the sweeping strategic choices and management review he directed earlier this year examined three central scenarios, and that all of them fell short of savings targets.

“To help DOD balance strategic ends, ways and means under these budget scenarios, the Strategic Choices and Management Review scrutinized every aspect of DOD’s budget, including: contingency planning, business practices, force structure, pay and benefits, acquisition practices, and modernization portfolios,” Hagel said. “Everything was on the table.”

The three different scenarios include President Barack Obama’s fiscal 2014 budget, which “backloads” $150 billion in defense cuts over a 10-year period; Budget Control Act sequester cap figures, which would cut $52 billion in fiscal 2014 and $500 billion over the next 10 years; and an “in-between” scenario, in which defense spending is reduced by $250 billion over 10 years.


The review was designed to help Pentagon leadership prepare for the possibility of continued sequestration cuts into 2014, as well as to inform out-year budget planning for the services and set the stage for the upcoming Quadrennial Defense Review. Hagel stressed that the review is not a blueprint but an evaluation that produced the three different options.

One area of focus is reducing DOD’s overhead costs; compensation alone consumes roughly half the Pentagon budget. Hagel said some options should be implemented immediately, regardless of the fate of sequestration.

The focus on overhead and management efficiencies includes measures such as reducing personnel at major military headquarters by 20 percent, starting with the Office of the Secretary of Defense, the Joint Staff, combatant commands and defense agencies. The measures would also consolidate functions in OSD and in intelligence analysis and production, saving up to $40 billion over the next decade.

To help overcome hurdles in executing the cuts, Hagel said he has directed Deputy Defense Secretary Ash Carter to find an expert from outside DOD to help direct implementation.

Force structure and modernization stand to take significant hits in all three scenarios; the cuts would “bend” and could possibly “break” national security strategy, the officials noted.

“Given that reality, the review examined two strategic approaches to reducing force structure and modernization that will inform planning for sequester-level cuts,” Hagel said. “The basic trade-off is between capacity – measured in the number of Army brigades, Navy ships, Air Force squadrons and Marine battalions – and capability – our ability to modernize weapons systems to maintain our military’s technological edge.”

Trading capacity for capability would result in a more technologically dominant military, but one that is smaller and able to perform fewer missions. Opting for size over capability would sustain power projection and presence, but would seriously slow or curtail modernization programs and cripple cyber development, Hagel said.

“Cuts on this scale would, in effect, be a decade-long modernization holiday,” he said. “The military could find its equipment and weapons systems – many of which are already near the end of their service lives – less effective against more technologically advanced adversaries. We also have to consider how massive cuts to procurement and research and development funding would impact the viability of America’s private sector industrial base.”

And even if DOD combined all the different options, it leaves the Pentagon well short of the targeted reduction levels under sequestration, Hagel and Winnefeld noted. They also warned that it will be even worse if Congress does not cooperate with proposed cuts – any proposal to freeze military pay, for example, could be expected to meet stiff resistance on the Hill.

“The reality is that cuts to overhead, compensation and forces generate savings slowly. With dramatic reductions in each area, we do reach sequester-level savings – but only towards the end of a 10-year timeframe. Every scenario the review examined showed shortfalls in the early years of $30-35 billion,” Hagel said. “These shortfalls will be even larger if Congress is unwilling to enact changes to compensation or adopt other management reforms and infrastructure cuts we proposed in our fiscal year 2014 budget. Opposition to these proposals must be engaged and overcome, or we will be forced to take even more draconian steps in the future.”


Army’s Problems go Deeper Than SCMR

Defense News

Aug. 1, 2013 – 05:29PM | By PAUL McLEARY


WASHINGTON — Few in the Pentagon or the defense industry liked what they heard Wednesday at Defense Secretary Chuck Hagel’s press conference announcing the findings of his four-month Strategic Choices and Management Review (SCMR).

But the Army probably didn’t like what it didn’t hear, either.

The secretary laid out two paths if Congress and the White House remain unable to reach a budgetary “grand bargain” that would reverse the $500 billion in budget cuts that the Pentagon will face over the next decade, beginning with a $52 billion hit in 2014.

One plan would prioritize high-end capabilities over end-strength numbers, while the other would keep end-strength while sacrificing modernization and research and development on next-generation platforms.

While Hagel was short on specifics when it came to platforms that would or wouldn’t be modernized, the secretary provided a hint when he said “we would protect investments to counter anti-access and area-denial threats, such as the long range strike family of systems, submarine cruise-missile upgrades and the Joint Strike Fighter. And we would continue to make cyber capabilities and special operations forces a high priority.”

Notice anything there? No Army platforms were mentioned, save those few presumably used by special operators.

That doesn’t mean that key Army modernization priorities like the ground combat vehicle or joint light tactical vehicle are doomed under Hagel’s scenario. But being left out of a roll call of the Pentagon’s highest priorities may make some people nervous.

Not only was the Army left out of the list of critical programs, but under either plan it would also take by far the largest hit in terms of end strength.

With the end of US involvement in Iraq and the ongoing drawdown in Afghanistan, the Pentagon has concluded that the service could drop to as low as 380,000 to 450,000 soldiers, numbers which haven’t been seen since before World War II.

The wartime high of 570,000 grunts was always seen as a temporary spike — save for some in the Army who wanted to keep what they had gained — but the service is working on culling 80,000 troops to reach 490,000 by the end of 2017.

When it comes to prioritizing modernization vs. end-strength cuts, “I would suspect the first impulse would be to protect as much combat power as you could,” said Maren Leed, a former adviser to the US Army chief of staff who is now at the Center for Strategic and International Studies.

Doing so would mean deeply reducing headquarters elements, combining major commands, and the “further pooling of enablers at higher echelons to get those as lean as they can before they start going after combat power.”

That said, “the Army’s going to have a difficult time holding on to as much [combat power] as they might like to” if the cuts are not reversed, she said.

The dirty little secret in the rush to gain some cost savings, however, is that even letting go of 80,000 soldiers won’t actually save the Army a dime.

All of the funds earmarked for paying those soldiers over the 490,000 threshold come from supplemental war accounts, and don’t count toward any sequestration savings — which means barring deeper and faster cuts, the service won’t save any money on force reductions until the fiscal 2018 budget.


And the service desperately needs those savings. The Army already spends 46 percent of its budget on compensation, a number that service chief Gen. Ray Odierno has warned will rise to 80 percent in a decade if compensation trends continue.

What’s more, even forcibly separating soldiers won’t reduce the strain on budgets all that much. Service contracts include provisions for unemployment and other benefits for about a year after a soldier leaves the force, so the service still has to pay for former soldiers months after they separate.

Taking soldiers out of the ranks is one thing. Resizing units to reflect those reductions while still retaining combat punch is another. The Army announced in June that it was reducing the number of brigade combat teams from 45 to 33 — while protecting its overall combat wallop by keeping 95 out of its 98 combat battalions.

The plan is to take the cuts in headquarters positions across the brigades while increasing the number of maneuver battalions in each brigade from two to three and adding engineering and fires capabilities to each unit.

Odierno called the moves “one of the largest organizational changes probably since World War II” for the service.

“If we go through full sequestration there’s going to be another reduction in brigades, there’s no way around it,” Odierno warned, adding that more cuts are likely coming in the heavy armor brigades, sequestration or not.

Fewer brigades, fewer soldiers, less money, and an uncertain modernization profile. With all of this in flux, what missions will the Army prioritize in the future?

“The most important thing that they’ve got to be concerned about is the Korean war plan since it doesn’t necessarily align that well with all the other things the Army believes it also needs to be doing,” Leed said. Those missions include things that Army leadership has spent a lot of time talking about in recent months, such as partner engagement in Asia and Africa, humanitarian response and training for contingencies spanning counterinsurgency to peer combat.

But the continuing instability on the Korean peninsula will mean that “they will be highly reluctant to take risk [in Korea] because of the criticality of it.”

The Army National Guard and reserve — much used in the past decade of conflict but largely spared from the current round of drawdowns in ground force end strength — would also be due for a haircut absent any grand bargain. They would fall from 555,000 soldiers to between 490,000 and 530,000 under the two scenarios.

One of the key questions in taking combat power out of the active force while trying to maintain capability in the Guard and reserve is to what degree the Guard can mitigate various kinds of risk. “Much of the Guard is not particularly well suited to meet the short-term risk in Korea,” for example, Leed said. But “when you’re talking about missions that align well with their competencies they can step in almost immediately.” Missions such as small-unit training and advising, medical support, engineering and partnership missions are things the Guard has traditionally performed well.

When it comes down to it, under any sequestration scenario “the whole Army would be the bill payer,” said Thomas Donnelly, a defense policy analyst at the American Enterprise Institute. Any cut to end strength or modernization would affect the other in serious ways, and would also impact the way the service could respond to contingencies.

The Pentagon has now laid out its thinking — absent any major change in national defense strategy — and Congress and the White House will have their say.


Senate Panel Approves $594.2B DoD Bill Despite Worries About Spending Caps

Defense News

Aug. 1, 2013 – 12:35PM | By JOHN T. BENNETT

The Senate Appropriations Committee on Thursday approved a $594.2 billion Pentagon spending bill, with some Republicans mounting a symbolic resistance because it surpasses federal spending caps.

The panel’s bill, approved 22-8, would give the Pentagon a $516.4 billion base budget and a $77.8 billion war-funding section. The base section’s topline aligns with the Obama administration’s request, while the Overseas Contingency Operations portion would be $8 billion smaller than the White House’s request.

The opening minutes of the full committee markup of the legislation focused on a back-and-forth over the Democratic-controlled panel’s decision to push a bill that’s nearly $20 billion larger than the national defense spending caps etched into law by the 2011 Budget Control Act.

“The defense bill alone would exceed the Budget Control Act caps for defense-related spending by nearly $19 billion,” committee ranking member Richard Shelby, R-Ala., said. “That’s not even counting the defense-related spending in the committee’s other bills, which comes to an additional $35.6 billion.” All told, the panel’s 2014 spending bills are $91 billion over defense and domestic spending limits, Shelby said.

Without a major fiscal deal that addresses the spending caps, another sequestration round of across-the-board cuts would be triggered. Instead, Shelby called for “deliberate [cuts] that reflect decisions by Congress about strategic priorities.”

Shelby and seven other Republicans voted against the defense bill, with several echoing the ranking member’s comments.

Chairwoman Barbara Mikulski, D-Md., said the reason her defense and other appropriations bills exceed those caps lies with the House and Senate Budget committees — particularly the lower chamber’s panel.


“We need a topline so we can get to the bottom line,” Mikulski said. “We have marked up our bills to a topline of $1.058 trillion, the level in the American Taxpayer Relief Act, which was approved by the Senate by a vote of 89-8.”

She noted the defense bill and others “assume that sequester will be replaced with a balanced solution of revenues and smart cuts.”

Mikulski criticized the House for, in legislation it has passed, building in “a moat around defense so that all $91 billion in cuts come out of domestic funding bills.”

President Obama on Wednesday, during a meeting with Senate Democrats, indicated he will not support placing such a “moat” around Pentagon spending when fiscal negotiations kick into full steam this fall, according to lawmakers who attended.

In a sign that rank-and-file lawmakers have moved little from ideological stances taken since 2010, Shelby called for “meaningful spending cuts in mandatory accounts,” meaning “entitlement reform.” Democrats largely have opposed deep cuts to such programs; Republicans oppose more new revenues that Democrats want.


Senate Budget Committee Chairwoman Patty Murray, D-Wash., acknowledged that the differences between the House and Senate budgets are a problem.

“Everyone knows there’s going to have to be a compromise at some point,” Murray said during the markup. “We’re not going to solve this by kicking the can to someone else — it’s up to us.”

Sparks also flew during the session over a provision offered by Sen. Tom Udall, D-N.M., then amended by Sen. Dianne Feinstein, D-Calif.

Udall’s measure would cut off funding for any US operation in Syria that violates the War Powers Act. After several senators raised concerns, Feinstein inserted language specifically pointing to the US armed services and a military operation.

The measure passed via voice vote, with several, including Sen. Lindsey Graham, R-S.C., audibly voting no.

Graham argued against both the initial and the revised amendment, arguing it would trample on the president’s constitutional powers as commander in chief.

“To my Republican colleagues who suddenly support the War Powers Act, where were you during the Bush years?” Graham said.

The 1973 law requires presidents to secure congressional approval for military operations within 60 days, or withdraw forces within the next 30. Since it was passed in the wake of the Vietnam War, Congress largely has looked the other way when presidents from both political parties have launched military operations that stretched into a 61st day and beyond.


Furlough appeals keep coming

Washington Post

By Lisa Rein, Published: August 2 at 6:00 am

The small agency receiving thousands of appeals from furloughed federal employees has delayed docketing and hearing most of them until the volume slows down, officials said Thursday.

The surge of challenges arriving at the Merit Systems Protection Board hit 13,090 on Thursday, with 1,900 of those arriving by fax, snail mail and electronically on Wednesday alone. That’s double the appeals that had come in a week ago.

With its support staff and administrative judges overwhelmed, the merit board has put a hold on all the cases from Defense Department civilians, the largest group of federal workers taking unpaid days to meet the budget cuts known as sequestration.

The board posted a message on its Web site this week that said: “Due to the unprecedented large volume of furlough appeals being received from employees of the military services and Department of Defense activities — and after much consideration — MSPB has decided to delay the processing and adjudication of these appeals.”

General Counsel Bryan Polisuk said that once the Defense appeals slow down after Aug. 8 — the 30-day filing deadline from the start of furloughs on July 8 — the staff will resume sorting through them.

“We’ll be in a better position [then] to see what we have, and start making decisions on how to adjudicate these cases,” he said. “Given the volume of furlough appeals, it’s going to be difficult for our regions to be as responsive in the near future as they normally are.” The merit board has eight offices across the country.

Appeals from employees furloughed from other agencies, including the Internal Revenue Service, the Environmental Protection Agency and the Department of Housing and Urban Development, are moving forward.

Administrative judges have ruled on 11 appeals so far, one from an EPA employee and 10 from the Federal Aviation Administration, which took one furlough day in April before Congress gave the agency authority to redirect money from other areas of its budget to shore up staffing and operations.

The rulings upheld the furloughs, meaning that the employees will not get back pay.

The high volume of appeals temporarily knocked out the merit board’s electronic “e-Appeal” service several times this week. Polisuk said it was quickly restored.

The appeals represent a new attack by federal employee unions on the Obama administration’s decision to furlough about 775,000 employees this fiscal year to help meet $85 billion in budget cuts.

The unions say federal agencies had options other than forcing employees to lose multiple days of pay. They are holding seminars with their locals across the country to instruct members on how to file appeals, which seek to cancel the furloughs and recover back pay.

Several unions have asked the merit board to consider the appeals as class-action cases, a request that is under consideration.

It’s unclear whether the merit board will meet its average turnaround time of 120 days to decide the cases, Polisuk said.

“It’s a very fluid situation.”



Do We Need A U.S. Department Of Technology?

A cabinet-level Department of Technology would help the U.S. retain its leadership position in the global high-technology sector, argues former Department of Transportation CIO Nitin Pradhan.

By Nitin Pradhan, InformationWeek

August 01, 2013



I came to the U.S. in the early 1990s on a fellowship from an American university. I was exploring several countries, including Australia, Canada and New Zealand, but I decided on the U.S. because of its leadership in technology. I have been fortunate to have a great career here, including being appointed by President Obama as the CIO of the Department of Transportation and the FAA in his first term.

Now my “iKids” are growing up, and I want them to thrive in the U.S. technology industry. But will this industry continue to flourish in the U.S.? What can we do today to ensure our continued technology leadership in the future?


Why Tech Industries Grow

Tech industries grow because of the availability of research and development dollars, a high-quality education system, a tech-savvy workforce, a large local technology marketplace and government incentives. The U.S. has no intrinsic advantage in the technology industry. Past wins have been a function of dollars invested, bipartisan leadership and lack of global competition.


However, now the global competition is heating up — just ask Apple, HP, Ericsson and Boeing, and they’ll tell you Samsung, Lenovo, Huawei and Airbus are tough global competitors. South Korea, China and the European Union governments are investing heavily in the growth of the tech industry. The U.S. government is investing in technology industry too, but its focus is mainly on defense, and with sequestration these investments are being reduced. I believe a more direct, consolidated, coordinated and planned approach toward technology investments is needed to keep our leadership strong.


Tech Remains the U.S.’s Future

According to TechAmerica Foundation’s 2013 Cyberstats report, U.S. high-tech employment totaled 5.95 million in 2012, with average wages of $93,800 — 98% more than the average private sector wages of $47,000. According to some additional recent data, U.S. high-tech jobs are growing at three times the rate of other private sectors, and each tech job creates more than four jobs in the wider economy, thanks to wages that are 17-27% higher than other sectors. If we want to create more of these jobs for our kids, we need a mechanism to support that future.

We need to create a new cabinet-level Department of Technology (USDoTech) now, while we are still leading in the technology world. The goal of the department should be to drive collaborative public-private technology innovations that maximize public value through private growth.

The notion of a cabinet-level technology department is not new. Harold Wilson, the former Prime Minister of the United Kingdom, established MinTech, the first Ministry of Technology, in 1964. China today has a Ministry of Science and Technology; India has a Ministry of Communications and Information Technology; and South Korea has MSIP, the Ministry of Science, ICT and Future Planning, created to “build a creative economy for the happiness of all people.”


How to Create USDoTech?

President Obama, to his credit, established the first chief information officer and chief technology officer positions in the federal government. However, neither has cabinet-level authority, and with few resources available, the impact on the growth of this important sector has been limited.

One way to create a cabinet-level department is by consolidating a number of technology-centric offices spread across various federal agencies that often work in an uncoordinated and sometimes even counterproductive way. Examples include the technology-focused sections of the Office of Management and Budget (OMB), the Office of Science and Technology Policy (OSTP), the National Science Foundation (NSF) and the National Institute of Standards and Technology (NIST), among many others.

Centralizing the technology leadership functions in these departments will give government a clearer, more holistic picture of the needs, challenges, opportunities, and threats for this fast-growing sector, allowing it to more effectively craft policies, legislation, and regulations and promote appropriate public-private investment strategies to keep U.S. leadership dynamic. We don’t need a humongous new department like Homeland Security – just a right-sized, efficient, tech-savvy group that will deliver results.


Congress: Technology Oversight Needs Change Too

As mentioned in my recent article in the Fast Government Report published by The IBM Center for the Business of Government, congressional governance of federal technology investments is fragmented and therefore not conducive to realizing the benefits of integrated approaches to technology.

Congress has 21 Senate committees, 22 House committees and many more subcommittees, which directly or indirectly have oversight over technology initiatives and investments in federal agencies. However, technology today is highly connected infrastructure, and a holistic view and investment strategy is key to future success. It is therefore essential that Congress establish a technology committee focused on maximizing transformative use of technology and effective involvement of private industry for the benefit of the country.


Next Steps: Get Involved

How do we create the “USDoTech” with this polarized Congress? Crowdsourcing, of course!

If you support the concept of a cabinet-level technology department, forward this article and talk to your friends and family today. “Like” the initiative on Facebook, and suggest roles and responsibilities for the new department now. Call your congressional representatives and senators and ask them to enlist the Congressional Research Service (CRS), the Government Accountability Office (GAO) and the Congressional Budget Office (CBO) to study the matter and work with the private sector to build a definitive bipartisan plan.

Finally, if your company has a government affairs department, ask it to lobby the federal government to support this worthy cause. Together, we can create this necessary department with no increased cost to taxpayers and keep the U.S. a leader in high-tech for years to come.



Rasmussen Reports

What They Told Us: Reviewing Last Week’s Key Polls

Saturday, August 03, 2013


In this 50-50 nation, Americans remain closely divided over whether government is the problem or the solution.

Just over half of voters still view the federal bailouts of the financial industry negatively, and 56% believe more competition and less government regulation are better for the U.S. financial system than more regulation and less competition.

Fifty-three percent (53%) don’t like President Obama’s national health care law, but 42% do.

The president in budget negotiations with congressional Republicans has proposed cuts in corporate tax rates in exchange for a new federal jobs program. Thirty-eight percent (38%) view Obama’s combination approach as the better way to create new jobs, but nearly as many (36%) think cutting corporate tax rates alone would be more effective. After all, 66% think decisions made by U.S. business leaders to help their own businesses grow will do more to create jobs than decisions made by government officials.

Voters are almost evenly divided these days when asked which party they trust more to handle the 15 important issues regularly tracked by Rasmussen Reports. They continue to trust Republicans most on the number one issue, the economy, and other money issues like taxes, job creation and government spending. They trust Democrats more in areas such as energy policy, the environment, health care and education.

This ambivalence is also reflected in the Generic Congressional Ballot where the two parties have been separated by two points or less every week since mid-April.

Obama’s total job approval held steady at 47% in July, tying the president’s lowest approval rating since December 2011. Fifty-one percent (51%) disapproved of the president’s performance last month.

The president’s daily job approval ratings now are more in line with findings during most of his first term in office.

Still, just 10% of voters now rate Congress’s performance as good or excellent, although this is the first time Congress’s positives have reached double digits this year. Sixty-three percent (63%) believe most members of Congress are willing to sell their vote for cash or campaign contributions, and 55% think it’s at least somewhat likely that their own representative has sold his or her vote for cash or a campaign contribution.

“To see where the country is heading, ignore Washington,” Scott Rasmussen says in his latest weekly newspaper column, “because most of what the [political] parties talk about is yesterday’s news and is largely irrelevant to the realities of the 21st century.”


And what are those realities?

For the second week in a row, 26% of voters say the country is heading in the right direction. This is the lowest level of confidence since mid-January 2012.

The Rasmussen Employment Index, which measures worker confidence, dropped 10 points in July to its lowest level since last November. As the Index projected, the government released another tepid jobs report on Friday.

Most consumers and investors believe the U.S. economy is currently in a recession.

Thirty-three percent (33%) of Americans believe the U.S. economy will be stronger in a year, but slightly more (39%) feel the economy will be weaker one year from now.

Following news of Detroit’s bankruptcy filing, 55% believe some states may soon be filing for bankruptcy, too.

Thirty-seven percent (37%) think now is a good time for someone in their area to sell a home.

Despite big gains made in the stock market in July, only 22% think the stock market will be higher than it is today a year from now.

Twenty-nine percent (29%) of Working Americans are now looking for work outside of their current company, the highest finding measured since March 2011. But they remain closely divided as to whether staying or leaving offers them the best career opportunities.

Seventy-seven percent (77%) of all Americans think the current minimum wage of $7.25 an hour is not a wage someone can live on, and 61% favor raising the minimum wage to $10.10 an hour by 2015, a proposal that is currently being considered by Congress.

But while Americans recognize that more minimum-wage fast-food jobs are now being held by workers who are over 20, 74% oppose making the minimum wage different for those in different age groups.


In other surveys last week:

– Even before a military judge handed down his guilty verdict against Bradley Manning on Tuesday, 46% of voters said the former Army intelligence analyst was a traitor for leaking government secrets. But just 33% now want Manning to spend the rest of his life in prison. That’s down from 41% two months ago.


– Thirty-two percent (32%) think Edward Snowden, the private contractor who exposed the National Security Agency’s spying on domestic phone calls, is a traitor, unchanged from last month.

– Forty-six percent (46%) favor putting greater restrictions on the NSA’s tracking of Americans’ telephone calls.

– A recent major study suggests that an increasing number of American women are now the primary breadwinners in married couples, but for most men and women, the man remains the chief earner.


August 10 2013




A Call for Cyber Diplomacy

By Joseph Marks

August 2, 2013


LAS VEGAS — If the United States’ goal is peace and security in cyberspace, officials should think less about cyberwar and more about cyber diplomacy, a scholar from the Atlantic Council said on Thursday.

For all the talk of lightning attacks that come out of nowhere and the often inscrutable language of experts, cyber conflicts at the national level tend to mirror traditional conflicts much more than you’d expect, Jason Healey, director of the council’s Cyber Statecraft Initiative, said during the Black Hat security convention here.

By the time a major attack, say, against the U.S. financial system is being dealt with at the top levels of government, the stakes and strategies are very similar to a traditional national security crisis, said Healey, who was director for cyber infrastructure protection at the White House from 2003 to 2005.

“What’s happening here is not that different from a coup in Pakistan,” he said. On a tactical level, government responders will be calling up bankers and trying to help secure their servers rather than securing embassy staff. But, at a strategic level, the response should be similar, he said.

That means “the president needs to get on the phone with Mr. Putin,” or whichever leader is likely behind the attack, and figure out a way to make it stop, he said.

Healey was editor of the Atlantic Council book A Fierce Domain: Conflict in Cyberspace, 1986 to 2012. While individual cyberattacks may happen with lightning speed, large cyber conflicts such as Russia’s 2007 attacks on Estonia and the Stuxnet attack, reportedly launched by the U.S. against Iranian nuclear facilities, tend to unfold over time and allow for thoughtful decision-making by top leaders.

Cyber diplomacy, he argued, can also be strategic and thoughtful. Some of this diplomacy has taken place in multilateral contexts, he said, such as recent discussions about whether the laws of war apply in cyberspace. More often, it will happen in a bilateral context such as the recent agreement between the U.S. and Russia to install a cyber hotline.


Surveillance scandal rips through hacker community

The good ol’ days of chummy games of “Spot the Fed” at Defcon are finished as hackers and security entrepreneurs plan next steps in the wake of government spying revelations.

by Seth Rosenblatt August 6, 2013 4:00 AM PDT



LAS VEGAS — It used to be that the playful Defcon contest of “Spot the Fed” gave hackers and the government agents tracking them a chance to interact in a less serious manner.

Hackers who found a government agent among the conference attendees would wear with pride T-shirts that read, “I spotted the Fed.” The agent would be given a shirt that read, “I am the Fed.” And by flipping the cat-and-mouse dynamic for at least one weekend a year, the two groups more or less came to a greater understanding of each other.

The relationship had gotten friendly enough that when Gen. Keith Alexander, the director of the National Security Agency, visited Defcon for the first time last year, the conference’s founder, Jeff Moss, told CNET, “It’s eye-opening to see the world from their view,” and noted that he had wanted to score an official visit from the NSA since Defcon began.

It would go too far to say that the uneasy marriage of friendship between the two groups now lies torn asunder in an acrimonious divorce. Hackers, information security professionals, and security experts looking to turn their knowledge into businesses won’t stop working or communicating with the U.S. government. But the response to the scandal has driven many of them back to their more skeptical roots.

“What we need to realize is that [Gen. Alexander] is asking us to destroy ourselves,” said Moxie Marlinspike, a well-known information security professional and entrepreneur who has had equipment seized and returned, and who has been detained but never charged, by the U.S. government.

“The types of programs that he’s developing are part of this trend of more efficient law enforcement, toward this world of 100 percent effective law enforcement,” said Marlinspike, who uses the alias to protect his legal name.

Marlinspike told CNET that he thinks the NSA is interested in hiring hackers because they tend to have an approach to critical thinking that produces an unusual mindset. Hackers are people, he said, who are “not always abiding by the letter of the law, but are not exactly harmful, either.”

“The problem is that he’s asking us to participate in the destruction of the zone where hackers exist,” Marlinspike said.


No single hacker voice on NSA

Information security professionals are not unified in their interpretation of Alexander’s attempt at a mea culpa at last week’s Black Hat conference here. Alex Stamos, a network infrastructure and security expert and the chief technical officer of Artemis, the company proposing the .secure top-level domain for a safer Internet, said that Alexander was actually aiming his talk not at independent security researchers but the security teams at security companies.

“If you’re a security researcher at a large cloud company, you have to include the NSA on your list of threats that you have to protect against,” he said. Stamos has done government security consulting in the past, although he told an audience at his Defcon session discussing professional ethics for “white hat” or “good guy” hackers that he would reconsider doing so in the future.

Christopher Soghoian, a senior policy analyst and principal technologist with the American Civil Liberties Union, agreed. “I think you’ve got an extra threat in your threat model, and that threat is the NSA.”

Marc Maiffret, a white hat hacker who narrowly avoided serious legal consequences for his teen hacking and has testified in front of Congress on security issues, said that the situation is more than a little ironic.

“We don’t want the NSA to monitor anything, but the whole goal of what [security professionals] do for the most part is to monitor everything. We should have the same safeguards to make sure that those abuses aren’t happening,” he said, referring to the recent surveillance revelations leaked by Edward Snowden.

The ACLU’s Soghoian said that the lack of public discussion is at the core of the problem and has impeded the government’s achieving its stated security-and-safety goals.

“The FBI has a unit now that does nothing but hack into people’s computers, extract documents, control Webcams,” he said. “The FBI’s role as an offensive cyber actor significantly undermines their cause. How can an agency warn people about malware when it’s using malware itself?”

One security start-up that had an encounter with the FBI was Wickr, a privacy-forward text messaging app for the iPhone with an Android version in private beta. Wickr’s co-founder Nico Sell told CNET at Defcon, “Wickr has been approached by the FBI and asked for a backdoor. We said, ‘No.’”

The mistrust runs deep. “Even if [the NSA] stood up tomorrow and said that [they] have eliminated these programs,” said Marlinspike, “How could we believe them? How can we believe that anything they say is true?”

Where does security innovation go next?

The immediate future of information security innovation most likely lies in software that provides an existing service but with heightened privacy protections, such as webmail that doesn’t mine you for personal data.


Wickr’s Sell thinks her company has hit upon a privacy innovation that a few others have also adopted and that many more will soon follow: the company itself doesn’t store user data.

“[The FBI] would have to force us to build a new app. With the current app there’s no way,” she said, that they could incorporate backdoor access to Wickr users’ texts or metadata.
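The architecture Sell describes, in which the service only relays opaque blobs and never holds keys or plaintext, can be sketched in a few lines. This is a toy illustration of the design, not Wickr’s actual protocol; the hash-based XOR keystream stands in for a real, audited cipher.

```python
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    # Derive a pseudorandom keystream by hashing key || counter.
    # Toy construction only; a real app would use a vetted AEAD cipher.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    return bytes(p ^ k for p, k in zip(plaintext, keystream(key, len(plaintext))))

decrypt = encrypt  # an XOR stream cipher is its own inverse

class RelayServer:
    """Stores and forwards opaque blobs; never sees keys or plaintext."""
    def __init__(self):
        self.mailbox = {}
    def put(self, user: str, blob: bytes):
        self.mailbox.setdefault(user, []).append(blob)
    def get(self, user: str):
        return self.mailbox.pop(user, [])

# Sender and recipient share a key; the relay only ever handles ciphertext,
# so there is nothing useful for it to hand over.
key = b"shared-secret"
server = RelayServer()
server.put("bob", encrypt(key, b"meet at noon"))
received = [decrypt(key, blob) for blob in server.get("bob")]
```

Because decryption happens only on the clients, a subpoena to the relay yields ciphertext at best, which is the sense in which a backdoor would require shipping a different app.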

“Even if you trust the NSA 100 percent that they’re going to use [your data] correctly,” Sell said, “Do you trust that they’re going to be able to keep it safe from hackers? What if somebody gets that database and posts it online?”

To that end, she said, people will start seeing privacy innovation for services that don’t currently provide it. Calling it “social networks 2.0,” she said that competing social networks will arise that do a better job of protecting their customers’ privacy, and predicted that some of those that succeed will do so because of their emphasis on privacy.

Abine’s recent MaskMe browser add-on and mobile app for creating disposable e-mail addresses, phone numbers, and credit cards is another example of a service that doesn’t have access to its own users’ data.

Stamos predicted changes in the services that companies with cloud storage offer, including giving customers the ability to store their data outside the U.S. “If they want to stay competitive, they’re going to have to,” he said. But, he cautioned, “It’s impossible to do a cloud-based, ad-supported service.”

Soghoian added, “The only way to keep a service running is to pay them money.” This, he said, is going to give rise to a new wave of ad-free, privacy protective subscription services.

The issue with balancing privacy and surveillance is that the wireless carriers are not interested in privacy, he said. “They’ve been providing wiretapping for 100 years. Apple may in the next year protect voice calls,” he said, adding that the best hope for ending widespread government surveillance lies with the makers of mobile operating systems like Apple and Google.

Not all upcoming security innovation will be focused on that kind of privacy protection. Security researcher Brandon Wiley showed off at Defcon a protocol he calls Dust that can obfuscate different kinds of network traffic, with the end goal of preventing censorship.

“I only make products about letting you say what you want to say anywhere in the world,” such as content critical of governments, he said. Encryption can hide the specifics of the traffic, but some governments have figured out that they can simply block all encrypted traffic, he said. The Dust protocol would change that, he said, making it hard to tell the difference between encrypted and unencrypted traffic.
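The article doesn’t describe Dust’s internals, but the detection problem it targets is easy to illustrate: ciphertext looks like uniformly random bytes, so its per-byte entropy stands out against chatty plaintext protocols, and a censor can threshold on that statistic. The sketch below is illustrative only, not Dust’s actual algorithm; the sample HTTP request and the byte-spread stand-in for ciphertext are invented for the demonstration.

```python
import math
from collections import Counter

def byte_entropy(data: bytes) -> float:
    """Shannon entropy of a byte stream, in bits per byte (max 8.0)."""
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Plaintext protocol traffic: a skewed byte distribution, low entropy.
http = b"GET /index.html HTTP/1.1\r\nHost: example.com\r\n\r\n" * 100

# Stand-in for ciphertext: every byte value equally represented.
cipher_like = bytes(range(256)) * 100

print(byte_entropy(http))         # well below 8 bits/byte
print(byte_entropy(cipher_like))  # 8.0 bits/byte
```

A censor thresholding on entropy can flag the second stream without decrypting anything; Dust’s stated goal is to reshape encrypted traffic so such statistics no longer separate it from innocuous protocols.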

It’s hard to build encryption into pre-existing products, Wiley said. “I think people are going to make easy-to-use, encrypted apps, and that’s going to be the future.”

Longer-term solutions

Right now, the intersection of individual information security experts, governments, and private companies large and small is at a crisis point. How they untangle and compromise their competing interests could have far-reaching ramifications for all.

Maiffret, the teen hacker turned expert respected both inside and outside the infosec community, thinks that the government is going to have to give up some ground.

“I think they know they need to say more. How do you say more, and not jeopardize things,” is the question, he said. “There is a better middle ground. It’s just like businesses accepting social media, it won’t happen overnight.”

Companies could face severe consequences from their security experts, said Stamos, if the in-house experts find out that they’ve been lied to about providing government access to customer data. You could see “lots of resignations and maybe publicly,” he said. “It wouldn’t hurt their reputations to go out in a blaze of glory.”

Perhaps not surprisingly, Marlinspike sounded a hopeful call for non-destructive activism on Defcon’s 21st anniversary. “As hackers, we don’t have a lot of influence on policy. I hope that’s something that we can focus our energy on,” he said.


One Kickstarter Could Revolutionize 3D Printing

By Rachel Feltman

August 2, 2013


One of the biggest obstacles to at-home 3D printing is the difficulty of designing objects to print, but the days of struggling to learn CAD or trawling for templates online may be over. Fuel3D, Inc. has just launched (and, in one day, successfully funded) a Kickstarter for a handheld, point-and-shoot, full-color 3D scanner. That means you can now print anything you can take a picture of—for about $1,000.

3D scanners aren’t new, but other handheld devices cost upwards of $15,000, according to Fuel3D (they can’t be too far off, since all our attempts to verify led to intimidating “call for pricing” pages). The tech of this particular model was first developed for medical imaging at Oxford University, and unlike a traditional camera, which captures just enough data to provide the illusion of depth, the Fuel3D actually records the geometry of an object. It uses geometric stereo (capturing an image from two different locations to perceive depth) and photometric stereo (capturing an image under different lights to perceive depth) simultaneously.
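Fuel3D hasn’t published its algorithms, but the photometric-stereo half of the approach has a classic closed form: with three images of a matte (Lambertian) surface under three known light directions, the per-pixel intensities satisfy I = L·(albedo·n), which can be inverted for the surface normal n. A minimal single-pixel sketch under those textbook assumptions (not Fuel3D’s code):

```python
import numpy as np

# Rows are known unit light directions, one per photo.
L = np.array([
    [0.0, 0.0, 1.0],
    [0.6, 0.0, 0.8],
    [0.0, 0.6, 0.8],
])

# True surface at this pixel: albedo 1.0, normal facing the camera.
true_normal = np.array([0.0, 0.0, 1.0])

# Lambertian shading: intensity = light direction . (albedo * normal).
I = L @ true_normal

# Invert the model: g = albedo * normal, so albedo = |g|, normal = g/|g|.
g = np.linalg.solve(L, I)
albedo = np.linalg.norm(g)
normal = g / albedo

print(albedo)  # 1.0
print(normal)  # [0. 0. 1.]
```

Per-pixel normals recovered this way are integrated into a depth map; the geometric-stereo pair supplies the absolute scale that normals alone cannot.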





States Consider Regulation of Drones in US Skies

Associated Press

By LISA CORNWELL Associated Press

CINCINNATI August 4, 2013 (AP)


Thousands of civilian drones are expected in U.S. skies within a few years, and concerns that they could be used to spy on Americans are fueling legislative efforts in several states to regulate the unmanned aircraft.

Varied legislation involving drones was introduced this year in more than 40 states, including Ohio. Many of those bills seek to regulate law enforcement’s use of information-gathering drones by requiring search warrants. Some bills have stalled or are still pending, but at least six states now require warrants, and Virginia has put a two-year moratorium on drone use by law enforcement to provide more time to develop guidelines.

Domestic drones often resemble the small radio-controlled model airplanes and helicopters flown by hobbyists and can help monitor floods and other emergencies, survey crops and assist search-and-rescue operations. But privacy advocates are worried because the aircraft can also carry cameras and other equipment to capture images of people and property.

“Right now police can’t come into your house without a search warrant,” said Ohio Rep. Rex Damschroder, who has proposed drone regulations. “But with drones, they can come right over your backyard and take pictures.”

Since 2006, the Federal Aviation Administration has approved more than 1,400 requests for drone use from government agencies and public universities wanting to operate the unmanned aircraft for purposes including research and public safety. Since 2008, approval has been granted to at least 80 law enforcement agencies.

But the FAA estimates that as many as 7,500 small commercial unmanned aircraft could be operating domestically within the next few years. A federal law enacted last year requires the FAA to develop a plan for safely integrating the aircraft into U.S. airspace by September 2015.

Damschroder’s proposed bill would prohibit law enforcement agencies from using drones to get evidence or other information without a search warrant. Exceptions would include credible risks of terrorist attacks or the need for swift action to prevent imminent harm to life or property or to prevent suspects from escaping or destroying evidence.

The Republican said he isn’t against drones but worries they could threaten constitutional protections against unreasonable searches and seizures.

“I don’t want the government just going up and down every street snooping,” Damschroder said.

The Ohio House speaker’s office says it’s too soon to comment on the chances for passage. But similar legislation has been enacted in Florida, Tennessee, Idaho, Montana, Texas and Oregon.

The sponsor of Tennessee’s bill said the law was necessary to ensure that residents can maintain their right to privacy.

“Abuses of privacy rights that we have been seeing from law enforcement recently show a need for this legislation,” said Republican Sen. Mae Beavers.

Beavers and Damschroder modeled their bills after one signed into law this year by Florida Gov. Rick Scott, who said then that “we shouldn’t have unwarranted surveillance.”

But the industry’s professional association says regulating law enforcement’s use of unmanned aircraft is unnecessary and shortsighted. It wants guidelines covering manned aircraft applied to unmanned aircraft.

“We don’t support rewriting existing search warrant requirements under the guise of privacy,” said Mario Mairena, government relations manager for the Arlington, Va.-based Association for Unmanned Vehicle Systems International.

The association predicts unmanned aircraft systems will generate billions of dollars in economic impact in the next few years and says privacy concerns are unwarranted.

In Maine, Gov. Paul LePage vetoed the state’s drone-regulating legislation, saying “this bill steps too far” and would lead to lawsuits and harm Maine’s opportunities for new aerospace jobs. He plans to establish guidelines allowing legitimate uses while protecting privacy.

The American Civil Liberties Union supports legislation to regulate drone use and require search warrants, but it would also like weapons banned from domestic drones and limits on how long drone-collected data could be kept, said Melissa Bilancini, an ACLU of Ohio staff attorney.

In North Dakota, Rep. Rick Becker’s bill to ban weapons from drones and require search warrants failed, but the Republican says he plans to try again because “we must address these privacy concerns.”

Democratic Rep. Ed Gruchalla, formerly in law enforcement, opposed Becker’s bill out of concern it would restrict police from effectively using drones.

“We are familiar with drones in North Dakota, and I don’t know of any abuses or complaints,” he said.

Drones can be as small as a bird or have a wingspan as large as a Boeing 737, but a program manager with the International Association of Chiefs of Police says most law enforcement agencies considering unmanned aircraft are looking at ones weighing around 2 pounds that only fly for about 15 minutes.

“They can be carried in the back of a car and put up quickly for an aerial view of a situation without putting humans at risk,” Mike Fergus said, adding that they aren’t suited for surveillance.

Medina County Sheriff Tom Miller in northeast Ohio says his office’s 2-pound drone is intended primarily for search-and-rescue operations and wouldn’t be used to collect evidence without a warrant.

Cincinnati resident Dwan Stone, 50, doesn’t have a problem with some limits.

“But I don’t oppose drones if there is a good reason for using them,” she said.

Chase Jeffries, 19, also of Cincinnati, opposes them.

“I don’t want the government being able to use drones to spy on people,” he said.



Senate Bill Could Further Delay UAS Integration


by Press • 5 August 2013

By Woodrow Bellamy III


FAA’s goal of integrating unmanned aircraft systems (UAS) into the National Airspace System by 2015 could be delayed by a provision in the Senate’s 2014 transportation bill.

The bill would require Transportation Secretary Anthony Foxx to submit to lawmakers a detailed report on the privacy impact of UAS integration before the FAA could issue final regulations on integrating UAS into the NAS. The provision adds yet another hurdle to the already time-consuming integration process, which requires complex technological analysis of how these aircraft will behave within civil airspace and fly alongside commercially operated passenger aircraft, among other measures. UAS industry experts, such as Ben Gielow, government relations manager and general counsel for the Association for Unmanned Vehicle Systems International (AUVSI), are concerned that the provision will further delay integration; most industry advocates already believe the FAA will miss its 2015 goal. Integration would allow UAS to be operated across a wide variety of applications, including law enforcement and border control, weather monitoring, oil and gas exploration, disaster management and aerial imaging.




Unmanned Aircraft Demonstration Highlighted at Agronomy Field Day

by Press • 7 August 2013


One of the most exciting recent developments in agriculture is the use of small Unmanned Aircraft Systems to evaluate crop conditions.  Producers, agronomists and the public can view a demonstration of this new technology at the 2013 Agronomy Field Day on Aug. 16 at Kansas State University’s Agronomy North Farm.

The use of these unmanned aircraft systems equipped with aerial optical sensing technology has gained national press attention in recent months.  Kevin Price, K-State professor of agronomy and geography, and Deon van der Merwe, associate professor of veterinary medicine and head of the diagnostic laboratory toxicology section, are among the leading researchers in the nation on this technology.  They are working to develop systems that can be used by the agricultural community.

“We’ve had an incredible reception among consultants, producers, plant breeders and others when we’ve shown them what this technology can do.  It has the potential to make their jobs much easier and will help them make better recommendations in a more timely manner,” Price said.

“At the field day, I will be showing some of the kinds of things we can accomplish with these small Unmanned Aircraft Systems in field demonstrations,” he added.

This is just one of several new technologies featured at the field day by agronomy researchers.  Other demonstrations will show the work of Dave Mengel, professor and soil fertility specialist, on optical sensors for nutrient recommendations; Vara Prasad, associate professor and crop physiologist, on stress tolerance research; and Peter Tomlinson, assistant professor and environmental quality specialist, and Chuck Rice, university distinguished professor of agronomy, on techniques used to measure greenhouse gases.

The field day will begin with registration at 9 a.m. and wrap up at 2 p.m.  There is no charge to attend, and a complimentary lunch will be available.  Preregistration is requested so that a lunch count can be made. Those interested in attending can preregister by calling Troy Lynn Eckart at 785-532-5776. To preregister online, see:

Sessions include two concurrent one-hour tours in the morning, starting at 9:45 and 11 a.m. After lunch, there will be demonstrations on sUAS flights; analyzing and interpreting images from sUAS technology; and field checking of optical sensing readings for crop nutrient status.

In addition, there will be displays from commercial companies and K-State researchers in the shed near the registration area, along with the crop garden, forage garden, and weed garden for browsing.  Extension specialists will be available to answer questions.

For more information, interested persons can contact Dallas Peterson at 785-532-0405 or


Is the unmanned grass truly greener on the civil side?


by Keven Gambold • 9 August 2013

It has been an ‘interesting’ couple of years for Unmanned Experts (UMEX), a small team of ex-military UAS/RPAS operators who had spread their wings into the civilian sector. Initially a US-based company, personnel were hand-picked from retiring USAF, US Army, British Army and Royal Air Force to offer consultancy, training and managed services based on their substantial operational UAS expertise. The time seemed right to move into ‘civi street’: the rumors swirling around the uniformed crewrooms were ones of endless contracts, too few companies and a yearning need. This article is designed to give a ‘peek behind the curtain’ into the current civil UAS/RPAS scene.


From a CONUS perspective, the culture shock was considerable: the far-reaching International Traffic in Arms Regulations (ITAR) effectively prevent US-based firms from even discussing most UAS-based topics with non-US persons, especially when outside of the country. An enduring lack of commitment by the Federal Aviation Administration (FAA), despite a Congressional mandate to the contrary, has brought all commercial UAS operations in US airspace to a halt. If you read about a real-estate firm using unmanned aircraft to photograph properties, that activity is likely illegal, and the FAA has prosecuted a number of such enterprises. This FAA reticence has been exacerbated by a wave of knee-jerk state privacy legislation (40 out of 50 states have attempted to enact some form of restrictive regulation) that severely hamstrings law enforcement use of the platforms, and a number of UAVs currently sit on the shelves at sheriffs’ offices across the nation. The still-substantial DoD market is justifiably competitive, especially with Big Government suffering its own ‘credit crunch’. Not much to rejoice about here.


So ‘change or die’ became the watchword, and UMEX started new companies in both England and Australia, where the respective CAA and CASA aerospace agencies have positively embraced UAS integration efforts (the UK currently has nearly 300 registered civilian commercial operators). Outside of the Missile Technology Control Regime (the MTCR limits data exchange on larger platforms, i.e. those able to carry a 500 kg payload over 300 km), there are few restrictions on providing international UAS services. European and international airspace integration also lags woefully behind, but some countries have ‘gone it alone’ with considerable success.


Unmanned Experts UK Ltd, drawing on over 25,000 hours of cutting-edge UAS operations and instruction, produced a comprehensive set of ground school courses ranging from one-day Career Workshops to 4-week UAS Maintenance Courses. Accreditation is an oft-used but little-understood concept; UMEX courses are approved by George Mason University for CEUs, and we are tied to Southampton and Lincoln Universities in the UK. Since then, UMEX has run courses in the UK, US (for ASPRS amongst others) and Singapore, with more planned for Turkey, the Middle East, India and Africa.


There is a growing demand for Consultancy and Managed Services in far-flung places as the utility of inexpensive SUAS comes to bear on a number of industries: open-cast mine mapping in Australia, anti-poacher patrols in Namibia, oil platform security in Iraq and search-and-rescue operations in Canada, to name a few. An undercurrent of ‘good’ stories is emerging across the globe to show the true potential that UAS could bring to everyday lives. UMEX is fielding requests from an ever-diversifying client base, and the future looks much brighter.


Bottom Line: the civil market is coming to life and the grass is sprouting, but in the US there’s more watering required.


Keven Gambold

Chief Operations Officer

Unmanned Experts




Washington Post to be sold to Jeff Bezos

Washington Post

By Paul Farhi, Monday, August 5, 4:33 PM


The Washington Post Co. has agreed to sell its flagship newspaper to Amazon founder and chief executive Jeffrey P. Bezos, ending the Graham family’s stewardship of one of America’s leading news organizations after four generations.

Bezos, whose entrepreneurship has made him one of the world’s richest men, will pay $250 million in cash for The Post and affiliated publications to the Washington Post Co., which owns the newspaper and other businesses.

Seattle-based Amazon will have no role in the purchase; Bezos himself will buy the news organization and become its sole owner when the sale is completed, probably within 60 days. The Post Co. will change to a new, still-undecided name and continue as a publicly traded company without The Post thereafter.

The deal represents a sudden and stunning turn of events for The Post, Washington’s leading newspaper for decades and a powerful force in shaping the nation’s politics and policy. Few people were aware that a sale was in the works for the paper, whose reporters have broken such stories as the Pentagon Papers, the Watergate scandals and disclosures about the National Security Agency’s surveillance program in May.

For much of the past decade, however, the paper has been unable to escape the financial turmoil that has engulfed newspapers and other “legacy” media organizations. The rise of the Internet and the epochal change from print to digital technology have created a massive wave of competition for traditional news companies, scattering readers and advertisers across a radically altered news and information landscape and triggering mergers, bankruptcies and consolidation among the owners of print and broadcasting properties.

“Every member of my family started out with the same emotion—shock—in even thinking about” selling The Post, said Donald Graham, the Post Co.’s chief executive, in an interview Monday. “But when the idea of a transaction with Jeff Bezos came up, it altered my feelings.”


Added Graham, “The Post could have survived under the company’s ownership and been profitable for the foreseeable future. But we wanted to do more than survive. I’m not saying this guarantees success but it gives us a much greater chance of success.”

The Washington Post Co.’s newspaper division, of which The Post newspaper is the most prominent part, has suffered a 44 percent decline in operating revenue over the past six years. Although the paper is one of the most popular news sources online, print circulation has dwindled, too, falling another 7 percent daily and Sundays during the first half of this year.

Ultimately, the paper’s financial challenges prompted the company’s board to consider a sale, a step once regarded as unthinkable by insiders and the Graham family itself.

With extraordinary secrecy, Graham hired the investment firm Allen & Co. to shop the paper, company executives said. Allen’s representatives spoke with a half-dozen potential suitors before the Post Co.’s board settled on Bezos, 49, a legendary tech innovator who has never operated a newspaper.

Bezos, in an interview, called The Post “an important institution” and expressed optimism about its future. “I don’t want to imply that I have a worked-out plan,” he said. “This will be uncharted terrain and it will require experimentation.”

He said, “There would be change with or without new ownership. But the key thing I hope people will take away from this is that the values of The Post do not need changing. The duty of the paper is to the readers, not the owners.”

Despite the end of the Graham family’s control of the newspaper after 80 years, Graham and Bezos said management and operations of the newspaper would continue without disruption after the sale.

Post publisher Katharine Weymouth—Graham’s niece and the fourth generation of her family involved in the newspaper—will remain as publisher and chief executive of the Bezos-owned Post; executive editor Martin Baron will continue in his job. No layoffs among the paper’s 2,000 employees are contemplated as a result of the transaction; staff will be told of the sale at a company-wide meeting Monday afternoon.

Bezos said he would maintain his home in Seattle and would delegate the paper’s daily operations to its existing management. “I have a fantastic day job that I love,” he said.

In a note to Post employees on Monday, Weymouth wrote, “This is a day that my family and I never expected to come. The Washington Post Company is selling the newspaper that it has owned and nurtured for eight decades.”

The new owner of The Post may be as much a surprise as the decision to sell the paper in the first place.

Throughout his storied business career, Bezos has been an empire builder but hasn’t shown any evident interest in the newspaper business. He has, however, maintained a long friendship with Graham, and each has informally advised the other over the years. Graham, for example, advised Bezos about how to feature newspapers on the Kindle, Amazon’s popular e-reader.

A computer science and electrical engineering student at Princeton, Bezos used his tech savvy to rise rapidly at a New York hedge-fund company, becoming its youngest senior vice president.

He founded Amazon at 30 with a $300,000 loan from his parents, working out of the garage in his rented home in Bellevue, Wash. He called his creation Amazon in part to convey the breadth of its offerings; early promotions called the site “Earth’s Biggest Bookstore.”

Since Amazon’s founding, Bezos has devoted himself to building it into a retail behemoth that sells everything from diapers to garden equipment to data storage at rock-bottom prices with a click of a mouse. It rang up $61 billion in sales last year.


In the process, Amazon has wreaked havoc on traditional brick-and-mortar stores. Many retailers have expressed dismay, and resentment, at Amazon’s ability to sell the same products at a lower price, in part because of its efficiency but also because it wasn’t collecting sales tax in most states.

For long periods, however, Bezos frustrated investors and analysts who wanted Amazon to turn profits more quickly, or more regularly. Because of heavy investments in warehouses and new businesses, Amazon didn’t deliver a profit until the company’s ninth year of operation, and seven years after selling shares to the public.

At times, Bezos has been openly disdainful of Wall Street’s demands for ever-rising quarterly profits. He told Fortune magazine last year, “The three big ideas at Amazon are long-term thinking, customer obsession, and willingness to invent.”

Under Bezos, the company’s drive into new businesses has been relentless. To supplement its line of Kindle readers and tablets, for example, Bezos pushed Amazon into book publishing itself, upsetting rivals such as Barnes & Noble as well as book agents. (Bezos himself is an avid newspaper reader; in addition to The Post, he said he reads the New York Times and Wall Street Journal.)

But Amazon’s breakneck growth has also come with a few stumbles. Among other investments, Bezos bought a majority stake in in 1999 and paid $60 million for a portion of, a delivery service. Both companies went out of business. An attempt to compete with eBay in online auctions wasn’t successful.

As such, an investment in Amazon comes with the likelihood of erratic earnings—and sometimes no earnings at all. The company lost $39 million last year.

Ultimately, however, Amazon has rewarded patient believers. Amazon’s sales have increased almost tenfold since 2004 and its stock price has quadrupled in the past five years. “We believe in the long term,” Bezos told Fortune, “but the long term also has to come.”

Friends and competitors have described Bezos as cerebral, demanding, curious, and given to asking challenging questions. He shows little tolerance for those who are poorly prepared, but can be charming and quick to laugh. “If Jeff is unhappy, wait five minutes,” his wife has said of him.

Bezos’ personal ventures have also given no hint of any interest in the news business. He started a private company called Blue Origin in 2000 to develop a space vehicle, and has acquired land in west Texas as a rocket launch site, both part of a lifelong passion for space travel. He is also reportedly spending $42 million to develop a clock inside a mountain in Texas that is designed to last 10,000 years—a symbol of Bezos’ business philosophy of thinking long-term.

In naming Bezos its “Businessperson of the Year” in 2012, Fortune called him “the ultimate disrupter…[who] has upended the book industry and displaced electronic merchants” while pushing into new businesses, such as TV and feature film production.

His drive and business creativity have earned him favorable comparisons to the late Steve Jobs, Apple’s co-founder and a confidant of Don Graham and his late mother, Post Co. chairman Katharine Graham. Earlier this year, Harvard Business Review ranked Bezos as the second best-performing chief executive in the world during the past decade, following only Jobs, who died in 2011.

In a message to employees on Monday, Don Graham quoted billionaire investor Warren Buffett, a longtime advisor to the Post Co., calling Bezos “the ablest CEO in America.”

Bezos’ reputation and smarts made him attractive as a buyer of The Post, said Weymouth. “He’s everything we were looking for—a business leader with a track record of entrepreneurship who believes in our values and cares about journalism, and someone who was willing to pay a fair price to our shareholders,” she said.


Weymouth said the decision to sell The Post sprang from annual budget discussions she had with Graham, her uncle, late last year. “We talked about whether [the Washington Post Co.] was the right place to house The Post,” she said. “If journalism is the mission, given the pressures to cut costs and make profits, maybe [a publicly traded company] is not the best place for The Post.”

Any buyer, she said, “had to share our values and commitment to journalism or we wouldn’t sell it.”

The sale to Bezos involves The Post and its website (, along with the Express newspaper, the Gazette Newspapers and Southern Maryland Newspapers in suburban Washington, the Fairfax County Times, the Spanish-language El Tiempo Latino newspaper, and the Robinson Terminal production plant in Springfield. Bezos will also purchase the Comprint printing operation in Gaithersburg, which publishes several military publications.

The deal does not include the company’s headquarters on 15th St. NW in Washington (the building has been for sale since February), or Foreign Policy magazine, Slate,, the WaPo Labs digital-development operation or Post-owned land along the Potomac River in Alexandria.

The Post, founded in 1877, has been controlled since 1933 by the heirs of Eugene Meyer, a Wall Street financier and former Federal Reserve official. Meyer bought the paper for $825,000 at a bankruptcy auction during the depth of the Depression.

After years of financial struggle, Meyer and his successor as publisher of The Post, son-in-law Philip L. Graham, steered the paper into a leading position among Washington’s morning newspapers. They began enlarging the company, notably by acquiring TV stations and Newsweek magazine in 1963 (the company sold the magazine for a nominal fee to the late billionaire Sidney Harman in 2010 after years of losses). In later years, the company added cable TV systems and the Kaplan educational division, currently the company’s largest by revenue.

Upon Graham’s death in 1963, his widow (and Meyer’s daughter) Katharine Graham took over management of the company. Despite her inexperience as a corporate executive, Mrs. Graham ably led the company through a colorful and expansive period.

The newspaper rose to national stature under Benjamin C. Bradlee, whom Katharine Graham had hired from Newsweek in 1965 as a deputy managing editor and promoted to editor in 1968. Bradlee oversaw the opening of new reporting bureaus around the nation and the world, started the Style section, and ignited the paper’s long run of Pulitzer Prize-winning reporting.

The Post’s and New York Times’ publication in 1971 of stories based on the Pentagon Papers—a secret government study of American military and political involvement in Vietnam—led to a landmark legal case in which the Supreme Court prohibited the government from exercising “prior restraint,” or pre-publication censorship, against the newspapers.

The arrest of seven men accused of breaking into the Democratic National Committee’s headquarters at the Watergate office complex in 1972 triggered the newspaper’s unearthing of a series of illegal activities orchestrated by President Nixon and his closest advisers. The revelations eventually led to Nixon’s resignation. The events were memorialized by the movie “All the President’s Men,” which turned The Post—as well as Bradlee and reporters Bob Woodward and Carl Bernstein—into household names.

Seven years after Nixon’s resignation, however, the paper suffered one of its darkest hours. It was forced to give back a Pulitzer Prize awarded to reporter Janet Cooke in 1981 after she admitted that her story about an eight-year-old heroin addict in Washington named Jimmy was a fabrication.

Katharine Graham, who died in 2001, was succeeded as Post publisher by her son, Donald, in 1979. He also succeeded her as chief executive of the Washington Post Co. in 1991.

During the 1990s and into the new century, under Bradlee’s successor, Leonard Downie Jr., the paper enjoyed arguably its most successful run in terms of profits, circulation and journalism. With little direct competition, the newspaper division’s revenue and profit soared. The Post won 25 Pulitzers under Downie, including six in 2008, the year he retired and was succeeded by Marcus Brauchli as editor.

The Grahams are among the last of a dwindling number of multigenerational family owners of metropolitan newspapers. Most major newspapers were once owned by local families with decades-long ties to their town or city, but that ownership profile has faded with succeeding generations and has largely disappeared in the Internet era.

Many of the heirs to great newspaper fortunes have sold their holdings to corporations or wealthy investors with little connection to the regions that the newspapers helped shape or, in some instances lately, to local businesspeople whose wealth was more recently acquired.

Over the past 20 years, the list of family-owned companies that have sold their newspaper holdings includes the Chandlers (owners of the Los Angeles Times, among others), the Cowles (Minneapolis Star Tribune), the Copleys (San Diego Union-Tribune) and the Bancrofts (Wall Street Journal).

The New York Times, controlled by the Sulzberger family, is among the last major dailies still operated by descendants of its early proprietor. The Times acquired The Boston Globe from members of the Taylor family in 1993 for $1.1 billion; last week it announced it was selling the paper for a mere $70 million to John W. Henry, a Boston businessman who owns the Boston Red Sox.

Following the sale to Bezos, the Graham family will continue to control the renamed Washington Post Co. through its closely held stock, known as Class A shares. The A shares can’t be sold on the open market, but out-vote a second class of public stock, called Class B shares. The New York Times Co. has a similar stock structure, ensuring the Sulzbergers’ control.

Bezos, who ranks 11th on the Forbes 400 list of wealthiest individuals in America with a net worth of $23.2 billion, has given little indication of his ideological leanings over the years. He hasn’t been a heavy contributor to political campaigns, although he and his wife have regularly donated to the campaign of Sen. Patty Murray (D-Wash). In years past, they had given modest contributions to a handful of Republican and Democratic senators.

Bezos’ political profile rose suddenly and sharply when he and his wife agreed last year to donate $2.5 million to help pass a referendum that would legalize same-sex marriage in Washington State, catapulting them to the top ranks of financial backers of gay rights in the country. The donation doubled the money available to the initiative, which was approved last November and made Washington among the first states to pass same-sex marriage by popular vote.

Perhaps the single biggest item on Amazon’s legislative agenda is a bill that would empower all states to collect sales tax from online retailers.

Amazon is only required to collect sales taxes in states where it maintains a physical presence such as a warehouse. But Amazon now is supporting the bill, which has passed the Senate and is pending in the House. State sales taxes no longer pose a real threat to Amazon: With an emphasis on same-day shipping, the company is building distribution warehouses across the country and would have to pay the tax anyway. Last month, the company announced it would hire 5,000 employees at these warehouses, an ambitious growth strategy that is hurting profits in the short run.

Bezos’ most notable charitable donations have been twin $10 million contributions to two Seattle-based institutions, the Museum of History and Innovation and the Fred Hutchinson Cancer Research Center. The gift to the museum was for the creation of a center for innovation that would be situated a few blocks from a new Amazon headquarters campus.

Baron, the former editor of the Boston Globe who joined The Post as its editor in January, said he was surprised to learn last week that the newspaper was being sold.

But he added, “I’m encouraged that the paper will be in the hands of a successful business person who understands the world of technology as well as anyone. He’s expressed his commitment to the organization and to its continued independence…I came here because I wanted to join a great news organization, and it will continue to be one.”



OPM delays goal to eliminate pension claims backlog

Federal Times

Aug. 5, 2013 – 05:20PM | By SEAN REILLY


The Office of Personnel Management missed its goal last month of eliminating a backlog of new retirement claims and of processing most new retirees’ first full pension checks within 60 days.

The 7,724 new retirement claims processed last month marked an 11 percent drop from the June total and were down by almost half from February, when OPM processed more than 15,300 new retirement claims, according to statistics released by OPM Monday. The backlog of pending claims rose slightly last month to 25,601, up from 25,542 the previous month.

The drop reflects the continuing repercussions of a sequester-related budget crunch that forced OPM’s retirement services office to end employee overtime at the end of April.

In January 2012, then-OPM Director John Berry set a target for eliminating the existing claims backlog by last month and processing 90 percent of retirees’ first full pension checks within 60 days of leaving government service. At that point, OPM was taking more than 150 days to process the average new claim; that time has since fallen to about 90 days.

On Monday, OPM revised its schedule for achieving those goals. It now says it will eliminate the backlog by March and hit the 60-day processing target by next May on the assumption that the agency will be able to restart overtime for retirement services employees on Oct. 1 when the new fiscal year begins.

At present, however, it is far from certain whether Congress will approve the administration’s request for an 8 percent funding increase for OPM’s retirement services division in fiscal 2014.

In an interview Friday, Ken Zawodny, associate director for retirement services, could not say how the agency’s latest timetable to improve retirement claims processing will be affected if overtime cannot resume in October.

“I have to be optimistic that we will obtain our budget,” Zawodny said. He also said OPM has taken some steps on its own to improve performance, such as reorganizing retirement services employees, in order to cut the average claims processing time by 40 percent since the beginning of last year. In the updated plan, OPM cited U.S. Postal Service early-out programs that added more than 20,000 claims to the agency’s workload as another reason for its failure to hit the 60-day processing goal.




Why Insiders, Not Hackers, Are the Biggest Threat to Cybersecurity



By Brian Fung

National Journal

June 10, 2013

The National Security Agency leaks by Edward Snowden will easily go down as one of the biggest revelations of the year, if not the decade. But the episode also raises new questions about the risk that insiders pose to government and corporate cybersecurity, in spite of the attention lavished on foreign hackers.

Snowden’s case is unique in that it uncovered a previously unknown surveillance apparatus that’s massive in size and scope. The way the whistle-blower did his deed, however, is not unique. Two-thirds of all reported data breaches involve internal actors wittingly or unwittingly bringing sensitive information to outsiders, according to industry analysts.

“It’s not an either-or proposition,” said Mike DuBose, a former Justice Department official who led the agency’s efforts on trade-secret theft. “But amidst all the concern and discussion over foreign hacking, what gets lost is the fact that the vast majority of serious breaches involving trade secrets or other proprietary or classified information are still being committed by insiders.”

DuBose is now the head of the cyber investigations unit at the risk-management firm Kroll Advisory Solutions. In February, his team authored a report warning that contractors, information-technology personnel, and disgruntled employees—all descriptors that fit Snowden pretty well—pose a greater threat than hackers, “both in frequency and in damage caused.”

Not everyone agrees. Even though insiders generally play an outsized role across all reported data breaches, their role in confirmed data breaches is rather small, according to an annual study by Verizon. In 2012, specifically, internal actors accounted for 14 percent of confirmed data breaches. Of those, system administrators were responsible for 16 percent.

“Our findings consistently show,” the Verizon report read, “that external actors rule.”

However common they are, cases like Snowden’s show how devastating one insider can be. The extent of the damage depends on what’s being exfiltrated and from where, and there aren’t many standards for calculating losses. Most companies estimate the value of their trade secrets based on how much money they sank into the research and development of that knowledge. But for the government, it’s the potential security impact that takes precedence—and that turns the question into a matter of subjective debate.

Last month, The Washington Post reported that Chinese spies compromised the designs for some of the Pentagon’s most sensitive weapons systems, including the F-35 Joint Strike Fighter, the V-22 Osprey tiltrotor aircraft, and the Navy’s new Littoral Combat Ship.

If true, the report could have major consequences for national security. But Snowden’s case is equally consequential, if for different reasons, and it bolsters DuBose’s point about the relevance of insiders. Snowden may have rightfully uncovered evidence of government overreach, but if a mid-level contractor can steal top-secret information about the NSA and give it to the public in a gesture of self-sacrifice, someone else could do the same—but hand the intelligence to more nefarious actors.



DHS Set to Tap New Cybersecurity Leader

Phyllis Schneck’s Road to Deputy Undersecretary

By Eric Chabrow, August 7, 2013. Follow Eric @GovInfoSecurity


Phyllis Schneck, the next deputy undersecretary for cybersecurity at the Department of Homeland Security, comes to the job with a different set of experiences than her predecessors – and that could prove valuable.

Schneck will leave her job as chief technology officer for the public sector at security provider McAfee to become the top cybersecurity policymaker at DHS, a senior DHS official confirms.

Unlike her predecessors, Schneck has a strong academic and research background, having earned a Ph.D. in computer science from Georgia Tech. She holds three patents in high-performance and adaptive information security and has six research publications in the areas of information security, real-time systems, telecom and software engineering. Before becoming McAfee’s public sector CTO, Schneck served as the company’s vice president of threat intelligence.

“Her underpinning background is in security, which gives her credibility,” says Chris Buse, chief information security officer for the state of Minnesota. “She is very poised and is an excellent communicator [who] has experience dealing with the political processes, something that is vital to this job.”

Schneck will take over the job Bruce McConnell has held on an acting basis since April, when Mark Weatherford resigned to join the consultancy The Chertoff Group [see DHS's Mark Weatherford Resigns]. McConnell is leaving DHS at week’s end after four years as the department’s senior counselor on cybersecurity [see Another Senior Cybersecurity Leader to Exit DHS].


Awaiting Official Announcement

DHS has not officially announced Schneck’s appointment, first reported by The Hill newspaper. Schneck has not commented publicly on her new job. A spokeswoman for Schneck says she will not say “anything about any potential job change unless it were to be official.”

The appointment does not require Senate confirmation.

Schneck’s academic and research background reflects out-of-the-box thinking about IT security. She champions, for example, developing IT security technology that emulates a human body with a strong immune system battling infections.

“Just as your body defends against thousands of colds every year and you maybe only get one, that’s what these systems are designed to do: push off the enemy and push off malicious traffic, without it having to have a name, and certainly without it having to have a signature – just understanding what is good or legitimate and what is not well intended or not needed and being able to defend against that and get smarter as you do so,” Schneck said in an interview with Information Security Media Group [see Infosec Evolution Mimics Immune Systems].


Hefty Curriculum Vitae

Her resume is filled with mostly private-sector jobs, but early in her career she worked at the NASA Goddard Space Flight Center and the University of Maryland’s Department of Meteorology. Her corporate life has been consistently intertwined with government. As McAfee’s public sector CTO, she oversees the company’s technical vision for public-sector applications of security and global threat intelligence.

Schneck served as vice chair of National Institute of Standards and Technology’s Information Security and Privacy Advisory Board, a panel charged with identifying emerging managerial, technical, administrative and physical safeguard issues as they relate to cybersecurity and privacy. She was the working group co-chair for public-private partnership for the Commission on Cybersecurity for the 44th Presidency, a panel that produced a report that served as the outline for President Obama’s cyberspace policy.


“She has experience dealing with the political processes, something that is vital to this job,” Minnesota’s Buse says.

Dwayne Melancon, chief technology officer for risk-based security and compliance management solutions provider Tripwire, says Schneck’s experience should prove vital as DHS reconfigures itself. “Her technology background will be well-received by those in the information security industry – a kindred spirit is always more welcome,” he says.

And that spirit will be challenged as DHS implements Obama’s cybersecurity framework being developed by a public-private partnership under NIST’s auspices [see NIST Unveils Draft of Cybersecurity Framework].


Major Challenge

Melancon says the public-private partnership is not fully realized, and that presents Schneck with one of her biggest challenges as she assumes her new role.

“This effort hasn’t yet delivered on its promise, and I believe our national security will pay the price if this isn’t solved,” Melancon says. “The problem isn’t the lack of capability – it’s our inability to actually get that capability delivered to the places where it’s most needed, which is securing our critical cyber-infrastructure in both the public and private sectors. … Dr. Schneck can’t just commission new efforts; she must also take a critical eye to existing programs and decommission those which aren’t delivering.”

Schneck comes aboard as the leadership within cybersecurity and infrastructure protection at DHS is in flux. Secretary Janet Napolitano, one of the administration’s chief spokespersons on cybersecurity, will leave in September, and the deputy secretary, Jane Holl Lute, resigned this spring. President Obama nominated Alejandro Mayorkas, director of DHS’s Citizenship and Immigration Services, to be deputy secretary, but his nomination is stalled in the Senate.

Rand Beers, undersecretary for the National Protection and Programs Directorate, is serving as DHS acting deputy secretary, and could become acting secretary if neither Mayorkas nor Napolitano’s successor is confirmed by the time she leaves. Beers will leave DHS when either the new secretary or deputy secretary is confirmed. Suzanne Spaulding, deputy undersecretary for infrastructure protection, has been nominated as undersecretary to replace Beers; she’s now serving as acting undersecretary. Schneck will report to Spaulding.


Making the Transition

Schneck’s presence at DHS could provide some stability to a department experiencing much tumult in its higher ranks in recent months [see Another Leadership Shakeup at DHS]. But Eugene Spafford, executive director of the Center for Education and Research in Information Assurance and Security at Purdue University, wonders about Schneck’s adjustment to government bureaucracy, coming from McAfee, which he describes as a more diverse and security-focused environment than DHS.

“I’m not sure how much exposure she will have had to the kinds of pressures and trade-offs likely to be found in DHS, especially given some of the churn there in recent months,” Spafford says.

“Phyllis does like to get things done, so that could be a problem when inside DHS,” Spafford says. “Or, it could be a good thing – we will have to wait and see. If she is frustrated about getting things done, she’s not likely to stick around for a long time.”

If the Chinese army is trying to hack a Missouri water plant, what else is it infiltrating?


By Gwynn Guilford @sinoceros August 6, 2013


The question of whether the Chinese military is on a hacking offensive has largely been answered—and, despite Chinese government protestations, it sure looks like a pretty big “yes.” However, beyond the widely reported infiltration of foreign companies, the question of what else it’s hacking remains hazy.

But new research confirms one of the scarier possibilities: that the Chinese army is going after critical US infrastructure.

This all came out of a project by Kyle Wilhoit, a researcher at a security company called Trend Micro, who set up “honeypots,” as decoy infrastructure systems are known, in 12 different countries. Wilhoit’s rural Missouri water plant honeypot attracted the notorious APT1, a crackerjack Chinese military hacking team sometimes known as Comment Crew, according to research he presented at the Black Hat security conference in Las Vegas. The group gained access in December 2012 through a Microsoft Word document containing malware unique to APT1.

“I actually watched the attacker interface with the machine,” he told the MIT Technology Review, referring to APT1. “It was 100% clear they knew what they were doing.”

Wilhoit noted that systems like power grids and water plants are inherently vulnerable to hacking threats because they’re using outdated technology and don’t necessarily prioritize security. Of the “critical” honeypot attacks—meaning those that could have taken control of the system—half came from China. Examples include turning the pump’s water temperature to 130˚ Fahrenheit and shutting down the system.


This suggests that Chinese hackers are taking control of critical infrastructure. “These attacks are happening and the engineers likely don’t know,” Wilhoit told MIT Technology Review.



To date, the only publicly disclosed cyberattacks on US control systems have been on a factory and a state government building, both in New Jersey, reports MIT Technology Review. It’s unclear where those attacks originated.


Advice from the CIA: Keep your eye on your laptop…

Wed, 2013-08-07 05:14 PM

By: Jacob Goodwin

John Mullen, a longtime senior operations officer with the CIA, caught the attention of his audience at the SINET Innovation Summit in New York City on August 6 as he matter-of-factly recited the ways in which foreign intelligence services routinely steal industrial secrets and intellectual property from naïve traveling U.S. business people visiting their countries.

Mullen noted that advanced technologies have certainly helped these foreign intelligence services steal valuable information from Americans (and other visiting travelers), but that their true advantage is what he dubbed “human frailty.” The naïve willingness of an ill-informed visitor to allow himself to be physically separated from his laptop, to willingly accept a computer file from his host, or to avail himself of computer services in the local market: these are the true Achilles’ heels, Mullen suggested.

“Human frailty is often the weakest link in the chain,” he declared.

Mullen emphasized that unlike the U.S. Government, many foreign governments — he declined to name them — have identified as part of their national economic strategy a concerted effort to steal whatever intellectual property they can lay their hands on. To that end, these assertive governments “will manipulate your relationships and your friendships” to achieve their goals, he advised.

“When you’re on their turf,” said Mullen, “they own you.”

He said some foreign governments — as well as some foreign businesses, which work extremely closely with their own governments — will use mobile technologies to keep you under constant surveillance. “They’ll hot mic your cell phone,” said Mullen, “and they’ll track your movements.”

Some governments are not above blackmail to achieve their goals, he added. He told the group of cybersecurity professionals who gathered for a one-day conclave at Columbia University about one such instance he recalled. An employee of a U.S. software company was traveling on business overseas when he was seduced by a “femme fatale,” working for a foreign intelligence service. The U.S. traveler was filmed during his exploits and later blackmailed into sharing some of his proprietary information, for fear that the incriminating film would otherwise be made public.

Of course, said Mullen, people are still naively clicking on attachments to incoming email messages (some of which contain dangerous malware), and still beginning online relationships with people they’ve never met face-to-face. “Intelligence services don’t have to meet people to recruit them,” he warned. “They can meet them online.”

Mullen suggested that most of these techniques — and more — are practiced by many of the more-aggressive foreign governments. How does he know? Because he has practiced some of the same data gathering techniques himself over the years…on behalf of Uncle Sam.


Rasmussen Reports

What They Told Us: Reviewing Last Week’s Key Polls

Saturday, August 10, 2013


Who’ll be the next in line?

Hillary Clinton is the overwhelming favorite among Democratic voters for their party’s presidential nomination in 2016. New Jersey Governor Chris Christie holds a narrow lead among Republicans for the 2016 GOP nomination. 

But not so fast. Clinton was also the clear favorite in August 2005, three years before the 2008 Democratic National Convention. She subsequently lost in the primaries to Illinois Senator Barack Obama. Meanwhile, GOP voters who say Christie is the candidate they least want to see nominated outnumber those who support him by 10 percentage points.

History has shown us time and again that 2016 is not likely to end up as a matchup of the current front-runners, Clinton and Christie, Scott Rasmussen says in his latest weekly newspaper column. 

Democrats’ fortunes in 2016 – and in next year’s mid-term elections – are likely to be tied in large part to perceptions of how President Obama’s national health care law is working out.

Roughly half (48%) of all voters now want their governor to oppose implementation of the health care law in their state. Forty percent (40%) want their governor to support implementation. That’s a complete reversal from January when 47% wanted their governor to support implementation of the law and 39% were opposed.

The economy is sure to be in play in the upcoming national elections as well. Despite the continuing bad national job numbers, 52% of Americans believe it is still possible for anyone who really wants to work to find a job, but only a plurality (44%) now thinks it’s possible for just about anyone in the United States to work their way out of poverty.

Sixty-five percent (65%) of working Americans consider themselves members of the middle class, but only a third of all Americans believe the economy is fair to the middle class.

At the same time, consumer and investor confidence remain near their highest levels in several years. 

But just 27% of voters say the country is heading in the right direction.

Job approval ratings for the president himself have settled back to the levels seen during most of his first term in office after a surge just before – and for three or four months after – Election Day last November. 

Gun control was one of the top issues on the president’s agenda at the first of the year, but support for tougher anti-gun laws is down from just after the elementary school massacre in December. Americans now are evenly divided when asked if the United States needs stricter gun laws. 

When the National Rifle Association proposed armed guards in schools rather than more gun control, it was mocked by many in the media. But 62% of Americans with children of elementary or secondary school age would feel safer if their child attended a school with an armed security guard.

The president faced a couple of foreign policy flare-ups this past week, so we asked voters what they thought about them.

Following Obama’s decision to snub Russian President Vladimir Putin during an upcoming trip to Europe, U.S. voters’ views of Putin have worsened, but most think it’s unlikely the United States and Russia will reenter a Cold War period. Views of Russia are unchanged.

The Obama administration opted to temporarily close embassies and consulates in several Middle Eastern countries because of terrorist threats. Fifty percent (50%) of voters think U.S. involvement in Middle East politics is bad for America.

Yet while 49% now believe most Muslims around the world view the United States as an enemy, just 19% think American Muslims living in this country are treated unfairly because of their religion and ethnicity.

Fifty-eight percent (58%) think the Muslim-American U.S. Army officer now being tried for killing 13 of his fellow soldiers at Fort Hood, Texas should be sentenced to death if convicted, but that’s consistent with attitudes toward other suspected mass killers in recent years.

Republicans now hold a three-point lead over Democrats on the Generic Congressional Ballot.  This is the largest gap between the two parties since mid-April.

In other surveys last week: 

– Most voters continue to think gaining control of the border is more important than legalizing the status of undocumented workers already living in the United States. Only 32% believe that if a woman comes to the country illegally and gives birth to a child here, that child should automatically become a citizen. That’s the lowest finding measured since November 2011.

– One-in-four Americans (25%) think mass transportation in this country is not as safe as it was 10 years ago. Most Americans seldom, if ever, use mass transit, but they still tend to believe the government should back mass transit projects as long as they don’t lose money.

– Sixty-three percent (63%) of voters believe global warming is at least a somewhat serious problem, including 35% who consider it a Very Serious one.

– Fifty-eight percent (58%) of Americans, when given the choice, still prefer to read a printed version of a newspaper over the online version. But the number of adults who prefer print newspapers is the lowest measured in several years of tracking. Just 24% say they buy a print copy of their local paper every day or nearly every day, down from 31% four years ago.  

– Detroit Tigers slugger Miguel Cabrera and veteran catcher Yadier Molina of the St. Louis Cardinals are the early fan favorites for MVP in the American and National leagues, respectively.

– With the second half of the Major League Baseball season underway, the Tigers are now the favorite to win the World Series. 


July 27 2013




Pentagon chief can’t offer hope in budget cuts

Updated: 3:45 a.m. Monday, July 22, 2013 | Posted: 3:44 a.m. Monday, July 22, 2013


The Associated Press


The audience gasped in surprise and gave a few low whistles as Defense Secretary Chuck Hagel delivered the news that furloughs, which have forced a 20 percent pay cut on most of the military’s civilian workforce, probably will continue next year, and it might get worse.

“Those are the facts of life,” Hagel told about 300 Defense Department employees, most of them middle-aged civilians, last week at an Air Force reception hall on a military base in Charleston.

Future layoffs also are possible for the department’s civilian workforce of more than 800,000 employees, Hagel said, if Congress fails to stem the cuts in the next budget year, which starts Oct. 1.

On the heels of the department’s first furlough day, and in three days of visits with members of the Army, Navy, Air Force and Marine Corps, Hagel played the unenviable role of messenger to a frustrated and fearful workforce coping with the inevitability of a spending squeeze at the end of more than a decade of constant and costly war.

The fiscal crunch also lays bare the politically unpopular, if perhaps necessary, need to bring runaway military costs in line with most of the rest of the American public that has struggled economically for years.

“Everybody’s bracing for the impact,” Army Master Sgt. Trey Corrales said after Hagel spoke with soldiers during a quick stop at Fort Bragg, N.C.

Corrales’ wife, a military civilian employee, is among those furloughed, and they have cancelled their cable TV and started carpooling to work to save money.

“The effects of the economy have started to hit the military,” Corrales said. “It was late in coming to us.”

The furloughs have hit about 650,000 civilian employees but also have slowed health care and other services for the uniformed military, which has stopped some training missions and faces equipment shortages due to the budget shortfalls. Troops were told this month they will no longer receive extra pay for deployments to 18 former global hot spots no longer considered danger zones.

Troops already are facing force reductions, and the Army alone has announced plans to trim its ranks by 80,000 over the next five years.

Officials agree that the military has undergone cycles of expansion and contraction over generations. Hagel said this time is different, and worse, however, because of what he described as a “very dark cloud” of uncertainty hanging over the Pentagon as Congress considers whether to reverse $52 billion in spending cuts that are set to go into effect in 2014.

At the Naval Air Station in Jacksonville, Fla., Hagel told an estimated 100 civilians gathered in a bustling jet maintenance hangar that the military had not been prepared for the $37 billion in cuts that took effect this year, forcing the furloughs. While he said he was deeply sorry for the strain the crunch has put on families, he said he would not slash troops’ training or other readiness budgets any further to prevent huge gaps in national security.

“I’m sure you realize how disruptive the furlough is to our productivity. So I’m hoping that we’re not going to do it again next year,” Elizabeth Nealin, a research and engineering manager at the navy base’s fleet readiness center, told Hagel.

“Have you planned for a reduction in force?” Nealin asked bluntly.

Hagel said if the $52 billion cut remains in place, “there will be further cuts in personnel, make no mistake about that.”

“I don’t have any choice,” he said.

The spending cuts this year may feel more dramatic than in times past because of a vast growth in Defense Department personnel and equipment costs over the past decade, said Todd Harrison, a senior fellow at the Center for Strategic and Budgetary Assessments in Washington. But current spending levels are close to what they were in 2007, when the war in Iraq was at its peak.

“So we’re not even back to a pre-9/11 level,” he said.

Since 2000, the number of U.S. troops has grown by about 3 percent to fight the wars in Iraq and Afghanistan, Harrison said. But the number of civilian Defense employees hired to support the troops has far surpassed that, growing by 14 percent in the same time.

Hagel said he is taking a hard look at where fat can be trimmed from the Pentagon and said the military has been “guilty of wasting a lot of money on a lot of things.” But he also said he “can’t lead this institution based on hope, based on I think, or based on maybe” — and predicted more dollar cuts ahead.

In Charleston, where the hopeful crowd quickly turned worried, Sandra Walker pointedly asked Hagel what might be in store for her job security, retirement benefits and security clearances if the shortfalls continue.

“I’ve taken a second job to compensate, because I have several children at home,” said Walker, who works in education and training at a medical clinic on base. “And if we are going to have future furloughs, will those things be taken into consideration for the future of our jobs?”

Sticking to his message, and stopping short of directly answering her question, Hagel offered little hope.

“There’s no good news,” he said.


Sandy-ravaged regions will never get landlines back

By Katie Lobosco @KatieLobosco July 22, 2013: 6:06 AM ET


Last fall, Superstorm Sandy wiped out landline telephone service for thousands of people. Many of them are never getting those landlines back.

Verizon (VZ, Fortune 500) is still in the process of repairing the telephone infrastructure that was damaged by the massive storm in late October. But in many cases, the telecom giant is replacing the old copper-based systems with new technologies — including wireless.

Those changes are coming for the industry as a whole, whether or not telecom giants like Verizon and AT&T (T, Fortune 500) want them to. And they were coming long before Sandy struck. The parts needed to repair the old landline technology are hard to find, sending companies to some odd places to purchase equipment, such as eBay (EBAY, Fortune 500).

“It can’t be that our critical infrastructure is relying on eBay for replacement parts,” said Bob Quinn, head of AT&T’s regulatory affairs.

Manufacturers that once made the required components, such as Nortel and Lucent, have gone out of business or been bought out, noted Danielle Coffey, a vice president at the Telecommunications Industry Association.

“It’s not only eBay, there’s a whole secondary market for these parts,” she said.

Related story: Are landlines doomed?

That’s because landlines are a dying business. Many customers have switched to cell phones or VoIP services like Microsoft’s (MSFT, Fortune 500) Skype to make calls. More than 36% of Americans use cell phones as their only telephone service, about ten times the rate from a decade ago, according to a Centers for Disease Control study.

Still, many telephone customers in Sandy-ravaged areas are displeased about the prospect of losing their landlines.

On Fire Island, N.Y., off the southern coast of Long Island, Verizon is replacing its copper landlines with a wireless telephone system called Voice Link. The new system consists of a small modem-sized device that plugs into an electrical outlet and a standard telephone jack in your wall at home. That device connects to Verizon’s wireless cellular network, which brings phone service and a dial tone to the existing cord or cordless phones in the home. Customers can use it to make calls, and it offers services like call waiting, caller ID and voice mail.

But, at least for now, Voice Link can’t connect customers to the Internet. That means medical alert services often used by senior citizens will not work. Those systems let a customer press a medical alert button that immediately contacts a monitoring center. Alarm services, fax machines and DSL Internet won’t work either.

Related story: AT&T isn’t nearly as bad as you think

Hundreds of Fire Island residents have filed complaints with the New York Public Service Commission about the service.

“It’s not quite ready for prime time,” said Harold Feld, the senior vice president of Public Knowledge, an advocacy group that opposes the all-wireless Voice Link system. “If we do switch to wireless as an alternative, then we want this wireless alternative to be as good or better than what we have now.”

Verizon offers the only telephone service on the island, so the hundreds of residents of the popular vacation spot have little choice but to accept Verizon’s Voice Link plan. Verizon said it intends to improve the system as time goes on.

Fire Island is a “unique situation,” said Tom Maguire, the senior vice president for national operations at Verizon. Wireless is not the only path forward for swapping out copper lines for new technology. AT&T and Verizon in many cases are replacing copper with fiber optic cable and upgrading their networks from a series of routers and switches to a modern digital network.

Related story: Femtocell hack reveals mobile phones’ calls, texts and photos

But Fire Island isn’t the only place where Verizon is installing Voice Link. The company began working on the Voice Link system well before the storm, testing it in places like Florida and Virginia as a way to connect customers without having to repair existing copper lines.

In areas other than Sandy-ravaged communities, Maguire said, the Voice Link system will be available as an option — and not every customer is a suitable candidate. It is intended for people who do not want DSL Internet service and do not rely on services like alarm and life-support systems. And if a Verizon technician goes to a home to install Voice Link and finds a weak cellular signal, Verizon will repair the copper wire instead of installing the wireless system.

Verizon has deployed Voice Link in Mantoloking, N.J., which was also heavily damaged by Hurricane Sandy, and the company has plans to deploy it in areas like the Catskills where the copper infrastructure is badly damaged.


Future Carriers Built to Carry Drone Fleets


by KRIS OSBORN on JULY 19, 2013


Navy planners anticipated the service’s recent historic steps toward outfitting carrier decks with fleets of unmanned drones, designing current and future carriers to support the technological advances these aircraft will bring, officials said.

The U.S. Navy’s new Ford-class aircraft carriers are engineered with the ability to accommodate more carrier-launched unmanned aircraft systems similar to the X-47B that landed on the deck of the USS George H.W. Bush July 10, service officials explained.

The Ford-class carriers are built with a series of technological advances over their predecessors, including a slightly larger flight deck, upgraded nuclear power plants, dual-band radar, improved landing gear and vastly increased on-board electrical capacity, as well as a new electromagnetic launch system for aircraft taking off from the deck, said Rear Adm. Thomas J. Moore, Program Executive Officer, Carriers.

“The Ford Class will be around until about 2110. The flight deck has been designed to be bigger and have a higher sortie generation rate. The ship itself is built with three times the electrical generating capacity that the Nimitz [the Ford’s predecessor] class has – so it is not hard to envision that we are going to be flying unmanned aircraft off that ship,” said Moore.

Citing the recent historic touchdown of the X-47B demonstrator aircraft aboard the USS George H.W. Bush, Moore said the Ford-class carriers are engineered with a specific mind to next-generation aviation and ship-based technologies.

The Ford-class carriers are being built with emerging technological trends in mind, chief among them the expected growth in unmanned systems and electrically powered weapons.

Moore said that if you look at the kind of aircraft which initially flew on a Nimitz-class carrier when they first emerged in the 1970s, they are very different than what is flying on those carriers today. In fact, the Joint Strike Fighter aircraft will fly on the Nimitz-class carriers before they retire, he said.

“Unmanned aircraft will certainly be part of our portfolio moving forward – they will not replace manned aircraft but will play an important role.”

The USS Ford is slated to enter the water at a christening ceremony in November of this year and begin formal service by late 2016. It is the first-in-class in a planned series of next-generation Ford-class aircraft carriers designed to replace the current Nimitz-class carriers on a one-for-one basis over roughly the next 50 years.

The USS Gerald R. Ford (CVN 78), now nearing completion, will be followed by the second and third Ford-class carriers: the USS John F. Kennedy (CVN 79), slated to enter service by 2025, and the USS Enterprise (CVN 80), slated to enter service by 2027.

The Ford-class carriers will have four 26 megawatt electrical turbine generators, designed in part to power key systems on the ship, including dual-band phased array radar and the Electro-Magnetic Aircraft Launch System, or EMALS – put on carriers to replace the current steam catapults for aircraft on the flight deck.
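As a rough sanity check on those figures, the four 26-megawatt generators work out to about 104 megawatts of total turbine-generator capacity. The per-generator number is the article’s; no Nimitz-class baseline is given here, so the earlier “three times” comparison is left as quoted:

```python
# Back-of-the-envelope total for the Ford class's quoted generating capacity.
# The 26 MW-per-generator figure comes from the article above.
GENERATORS = 4
MW_EACH = 26

total_mw = GENERATORS * MW_EACH
print(f"Total turbine-generator capacity: {total_mw} MW")  # 104 MW
```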

“On a Nimitz-class carrier, outside of the propulsion plant we use a lot of steam to run the catapults and heat the water. We made a conscious decision on the Ford class to better electrify the ship,” Moore explained.

Moore also emphasized that the electrical backbone and capacity of the Ford-class carriers will better equip the ships to accommodate directed energy weapons in the future, should they be added to the ship.

For example, it is conceivable that directed energy or laser weapons might complement the defense systems currently in place to defend the ship, such as the Phalanx Close-In Weapon System, Rolling Airframe Missile and NATO Sea Sparrow, Moore explained.

“The Ford has huge margins of ability to generate electrical power that no other ship has,” he said.

In fact, increased automation, computer technology and electrical capacity will reduce manpower requirements aboard the ship, dramatically increasing capability and lowering life-cycle costs, said Mike Petters, President and Chief Operating Officer, Huntington Ingalls Industries.

Moore explained that the technology-driven manpower reductions will yield as much as $4 billion in savings over the 50-year life cycle of the ship.

Petters and Moore explained how the Ford-class carriers are designed with a slightly smaller island to allow for more deck space, thus increasing the ship’s ability to launch and recover larger numbers of aircraft.

“You have created an electrical distribution system that is going to allow for lower maintenance cost – then you have advanced arresting gear and the radars. You are really talking about a ship that has substantially more capability. The flight deck was all part of how do we get more sorties – changing the location of the footprint is all about how do you get the flight deck more efficient,” Petters said.

One analyst said that increasing the ability to project power at greater distances through the expanded use of unmanned aircraft on carriers is exactly how the Navy should be thinking about its future.

“The aircraft carrier is relevant today and it will be relevant for decades,” said Bryan McGrath, managing director at FerryBridge Group LLC, a defense consulting firm based in Easton, Md.

In particular, longer reach, meaning greater operating ranges for strike and intelligence, surveillance and reconnaissance missions, is likely to grow in importance in light of what Pentagon strategists refer to as Anti-Access/Area-Denial, or A2/AD, challenges, he argued.

A2/AD is a strategic approach to current and future conflict based upon the premise that potential adversaries have increased technological capability to challenge the U.S. military’s ability to operate in certain areas in an uncontested manner – such as closer to shore.

“Sortie generation rate as a virtue will decline in importance in the years to come largely because any opponent of worth will hold us a little further off shore. Sortie generation rose in importance as we came to dominate the oceans. Now we may have a near-peer competitor, so what we really need is range from the wing – the ability to operate from further away and bring strike power,” McGrath said.

The increased sortie-generation rate capability with the Ford-class carriers is designed to increase the flexibility to launch manned and unmanned systems with greater ease and frequency, a Navy official said.

“The deck has been built to provide the air wing of the future with greater flexibility,” the official said.

Nevertheless, McGrath said he would like to see more effort to configure aircraft carriers to house and launch longer-range platforms, both manned and unmanned.

“The Navy should begin thinking about designing an aircraft carrier that is devoted to the launch and recovery of unmanned aviation,” McGrath said. “It will need to do its job for 50 years, so you have to think about what you get. You get a very powerful symbol and the means for the delivery of American power. There is no substitute in our arsenal.”

Moore said the Ford-class carriers are being built with long-term service in mind; by design, the ships have the growth potential to accommodate emerging technologies as they arise.

“Big-deck carriers are by far what we need in terms of power projection and presence. For a lot of the missions we want and the presence we have around the world, there’s nothing like it. Why are the Russians, Indians and Chinese building a carrier? Countries know that carriers bring something to the table that nothing else can bring in terms of an instrument of national power. There is a reason we build these things,” Moore added.


DOE study: Fracking chemicals didn’t taint water

Jul 19, 5:48 PM EDT


Associated Press

PITTSBURGH (AP) — A landmark federal study on hydraulic fracturing, or fracking, shows no evidence that chemicals from the natural gas drilling process moved up to contaminate drinking water aquifers at a western Pennsylvania drilling site, the Department of Energy told The Associated Press.

After a year of monitoring, the researchers found that the chemical-laced fluids used to free gas trapped deep below the surface stayed thousands of feet below the shallower areas that supply drinking water, geologist Richard Hammack said.

Although the results are preliminary – the study is still ongoing – they are the first independent look at whether the potentially toxic chemicals pose a threat to people during normal drilling operations. But DOE researchers view the study as just one part of ongoing efforts to examine the impacts of a recent boom in oil and gas exploration, not a final answer about the risks.

Drilling fluids tagged with unique markers were injected more than 8,000 feet below the surface at the gas well bore but weren’t detected in a monitoring zone at a depth of 5,000 feet. The researchers also tracked the maximum extent of the man-made fractures, and all were at least 6,000 feet below the surface.

That means the potentially dangerous substances stayed about a mile away from surface drinking water supplies, which are usually at depths of less than 500 feet.
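The “about a mile” figure follows directly from the depths reported above. A quick check, using only the article’s numbers:

```python
# Minimum vertical separation between the shallowest recorded fracture
# and a typical drinking-water aquifer, per the figures in the article.
FEET_PER_MILE = 5280
shallowest_fracture_ft = 6000   # all fractures were at least this deep
aquifer_ft = 500                # drinking water is usually shallower than this

separation_miles = (shallowest_fracture_ft - aquifer_ft) / FEET_PER_MILE
print(f"Minimum separation: ~{separation_miles:.2f} miles")  # ~1.04 miles
```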

“This is good news,” said Duke University scientist Rob Jackson, who was not involved with the study. He called it a “useful and important approach” to monitoring fracking, but he cautioned that the single study doesn’t prove that fracking can’t pollute, since geology and industry practices vary widely in Pennsylvania and across the nation.

The boom in gas drilling has led to tens of thousands of new wells being drilled in recent years, many in the Marcellus Shale formation that lies under parts of Pennsylvania, New York, Ohio and West Virginia. That’s led to major economic benefits but also fears that the chemicals used in the drilling process could spread to water supplies.

The mix of chemicals varies by company and region, and while some are openly listed the industry has complained that disclosing special formulas could violate trade secrets. Some of the chemicals are toxic and could cause health problems in significant doses, so the lack of full transparency has worried landowners and public health experts.

Over the last four years the debate over fracking chemicals has attracted tremendous attention from state and federal agencies, public health experts, and opponents of fracking. Yet while many people have focused on the potential threat from the chemicals, experts have come to believe that more routine aspects of the drilling process are more likely to cause problems. Poor well construction that allows excess gas to escape, spills of chemicals or other fluids that take place at the surface, and disposal of wastewater are all issues of concern.

Jackson said most of the problems that the Duke researchers have seen have been related to well construction, not fracking chemicals.

The study done by the National Energy Technology Laboratory in Pittsburgh marked the first time that a drilling company let government scientists inject special tracers into the fracking fluid and then continue regular monitoring to see whether it spread toward drinking water sources. The research is being done at a drilling site in Greene County, which is southwest of Pittsburgh and adjacent to West Virginia.

Eight Marcellus Shale wells were monitored seismically and one was injected with four different man-made tracers at different stages of the fracking process, which involves setting off small explosions to break the rock apart. The scientists also monitored a separate series of older gas wells that are about 3,000 feet above the Marcellus to see if the fracking fluid reached up to them.

The industry and many state and federal regulators have long contended that fracking itself won’t contaminate surface drinking water because of the extreme depth of the gas wells. Most are more than a mile underground, while drinking water aquifers are usually close to the surface.

Kathryn Klaber, CEO of the industry-led Marcellus Shale Coalition, called the study “great news.”

“It’s important that we continue to seek partnerships that can study these issues and inform the public of the findings,” Klaber said.

While the lack of contamination is encouraging, Jackson said he wondered whether the unidentified drilling company might have consciously or unconsciously taken extra care with the research site, since it was being watched. He also noted that other aspects of the drilling process can cause pollution, such as poor well construction, surface spills of chemicals and wastewater.

Jackson and his colleagues at Duke have done numerous studies over the last few years that looked at whether gas drilling is contaminating nearby drinking water, with mixed results. None has found chemical contamination but they did find evidence that natural gas escaped from some wells near the surface and polluted drinking water in northeastern Pennsylvania.

Scott Anderson, a drilling expert with the Environmental Defense Fund, said the results sound very interesting.

“Very few people think that fracking at significant depths routinely leads to water contamination. But the jury is still out on what the odds are that this might happen in special situations,” Anderson said.

One finding surprised the researchers: Seismic monitoring determined one hydraulic fracture traveled 1,800 feet out from the well bore; most traveled just a few hundred feet. That’s significant because some environmental groups have questioned whether the fractures could go all the way to the surface.

The researchers believe that fracture may have hit naturally occurring faults, and that’s something both industry and regulators don’t want.

“We would like to be able to predict those areas” with natural faults and avoid them, Hammack said.

Jackson said the 1,800-foot fracture was interesting but noted it is still a mile from the surface.

The DOE team will start to publish full results of the tests over the next few months, said Hammack, who called the large amount of field data from the study “the real deal.”

“People probably will be looking at the data for years to come,” he said.

On Friday, DOE spokesman David Anna added that while nothing of concern has been found thus far, “the results are far too preliminary to make any firm claims.”



Draft of a House Bill Restricting FAA Licensing of UAVs Unveiled

by Press • 25 July 2013


A House member from Vermont unveiled the draft of a bill that would require those seeking to operate an unmanned aerial vehicle in U.S. airspace to provide detailed information about the use of the UAV before receiving a federal license to fly the drone.

Rep. Peter Welch, D-Vt., unveiled the draft document, which would prohibit the Federal Aviation Administration from issuing a drone license unless the application includes a statement that lists who will operate the drone, where the drone will be flown, what kind of data will be collected, how that data will be used, whether the information will be sold to third parties and the period for which the information will be retained.

The draft would also require police applicants, as well as their contractors and subcontractors, to provide a “data minimization statement” describing how data collected by the UAV that is unrelated to the investigation of a crime will be minimized. The draft also requires the FAA to create a website listing all approved UAV licenses, data collection and data minimization statements, any data security breaches suffered by a licensee, and the times and locations of flights.

In addition, under the draft bill, even if a police agency had a license to operate a drone, the agency would still have to obtain a warrant to fly the UAV except in special circumstances such as search and rescue operations and the imminent danger of death or serious injury.

“As drones emerge in the airspace over the United States, I am deeply concerned that Americans’ constitutional right to privacy and their right to know are protected,” Welch said. “Drones can be a valuable tool for law enforcement, public safety and other commercial uses, but it is essential that the law be updated to cover this technology and to ensure individual privacy and transparency in its operation and use,” he said.

Welch made the draft of the bill public in Chittenden County, Vt., on June 12, but he has not yet introduced the bill in the House.


The Creepy, Long-Standing Practice of Undersea Cable Tapping

The Atlantic

July 10 2013

By Olga Khazan


In the early 1970s, the U.S. government learned that an undersea cable ran parallel to the Kuril Islands off the eastern coast of Russia, providing a vital communications link between two major Soviet naval bases. The problem? The Soviet Navy had completely blocked foreign ships from entering the region.

Not to be deterred, the National Security Agency launched Operation Ivy Bells, deploying fast-attack submarines and combat divers to drop waterproof recording pods on the lines. Every few weeks, the divers would return to gather the tapes and deliver them to the NSA, which would then binge-listen to their juicy disclosures.

The project ended in 1981, when NSA employee Ronald Pelton sold information about the program to the KGB for $35,000. He’s still serving his life prison term.

The operation might have ended, but for the NSA, this underwater strategy clearly stuck around.

In addition to gaining access to web companies’ servers and requesting phone metadata, we’ve now learned that both the U.S. and U.K. spy agencies are tapping directly into the Internet’s backbone — the undersea fiber optic cables that shuttle online communications between countries and servers. For some privacy activists, this process is even more worrisome than monitoring call metadata because it allows governments to make copies of everything that traverses these cables, if they want to.

The British surveillance programs have fittingly sinister titles: “Mastering the Internet” and “Global Telecoms Exploitation,” according to The Guardian.

A subsidiary program for these operations — Tempora — sucks up around 21 million gigabytes per day and stores the data for a month. The data is shared with the NSA, and there are reportedly 550 NSA and GCHQ analysts poring over the information gathered from at least 200 fiber optic cables so far.
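To put those reported figures in perspective, the 21-million-gigabyte daily intake and 30-day retention window translate into a sustained data rate and buffer size as follows (a rough conversion using the article’s numbers and decimal gigabytes):

```python
# Rough scale of the reported Tempora intake.
GB_PER_DAY = 21_000_000          # ~21 million gigabytes per day, per the article
SECONDS_PER_DAY = 86_400
RETENTION_DAYS = 30              # data is stored "for a month"

bytes_per_day = GB_PER_DAY * 10**9
rate_tbps = bytes_per_day * 8 / SECONDS_PER_DAY / 10**12   # terabits per second
buffer_pb = GB_PER_DAY * RETENTION_DAYS / 10**6            # petabytes retained

print(f"Sustained intake: ~{rate_tbps:.1f} terabits/second")
print(f"30-day buffer:   ~{buffer_pb:.0f} petabytes")
```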

The scale of the resulting data harvest is tremendous. From The Guardian:


This includes recordings of phone calls, the content of email messages, entries on Facebook and the history of any internet user’s access to websites — all of which is deemed legal, even though the warrant system was supposed to limit interception to a specified range of targets.

In an interview with online security analyst Jacob Appelbaum, NSA leaker Edward Snowden called the British spy agency GCHQ “worse than” the NSA, saying it represents the first “full take” system, in which surveillance networks catch all Internet traffic regardless of its content. Appelbaum asked Snowden if “anyone could escape” Tempora:

“Well, if you had the choice, you should never send information over British lines or British servers,” Snowden said. “Even the Queen’s selfies with her lifeguards would be recorded, if they existed.”

The U.S.’s own cable-tapping program, known by the names OAKSTAR, STORMBREW, BLARNEY and FAIRVIEW, as revealed in an NSA PowerPoint slide, apparently functions similarly to Tempora, accessing “communications on fiber cables and infrastructure as data flows past,” according to The Washington Post. The slide indicates that Prism and these so-called “upstream” programs work together somehow, with an arrow saying “You Should Use Both” pointing to the two operations.


So how does one tap into an underwater cable?

The process is extremely secretive, but it seems similar to tapping an old-fashioned, pre-digital telephone line — the eavesdropper gathers up all the data that flows past, then deciphers it later.


A map of undersea cables. (TeleGeography)

More than 550,000 miles of flexible undersea cables about the size of garden watering hoses carry all the world’s emails, searches, and tweets. Together, they shoot the equivalent of several hundred Libraries of Congress worth of information back and forth every day.

In 2005, the Associated Press reported that a submarine called the USS Jimmy Carter had been repurposed to carry crews of technicians to the bottom of the sea so they could tap fiber optic lines. The easiest place to get into the cables is at the regeneration points — spots where their signals are amplified and pushed forward on their long, circuitous journeys. “At these spots, the fiber optics can be more easily tapped, because they are no longer bundled together, rather laid out individually,” Deutsche Welle reported.

But such aquatic endeavors may no longer even be necessary. The cables make landfall at coastal stations in various countries, where their data is sent on to domestic networks, and it’s easier to tap them on land than underwater. Britain is, geographically, in an ideal position to access the cables as they emerge from the Atlantic, so the cooperation between the NSA and GCHQ has been key. Beyond that partnership, there are the other members of the “Five Eyes” — the Australians, the New Zealanders, and the Canadians — that also collaborate with the U.S., Snowden said.

The tapping process apparently involves using so-called “intercept probes.” According to two analysts I spoke to, the intelligence agencies likely gain access to the landing stations, usually with the permission of the host countries or operating companies, and use these small devices to capture the light being sent across the cable. The probe bounces the light through a prism, makes a copy of it, and turns it into binary data without disrupting the flow of the original Internet traffic.

“We believe our 3D MEMS technology — as used by governments and various agencies — is involved in the collection of intelligence from … undersea fibers,” said a director of business development at Glimmerglass, a government contractor that appeared, at least according to a 2010 Aviation Week article, to conduct similar types of interceptions, though it’s unclear whether they took part in the British Tempora or the U.S. upstream programs. In a PowerPoint presentation, Glimmerglass once boasted that it provided “optical cyber solutions” to the intelligence community, offering the ability to monitor everything from Gmail to Facebook. “We are deployed in several countries that are using it for lawful interception. They’ve passed laws, publicly known, that they will monitor all international traffic for interdiction of any kind of terrorist activity.”

The British publication PC Pro presented another theory: that slightly bending the cables could allow a receiver to capture their contents.

One method is to bend the cable and extract enough light to sniff out the data. “You can get these little cylindrical devices off eBay for about $1,000. You run the cable around the cylinder, causing a slight bend in cable. It will emit a certain amount of light, one or two decibels. That goes into the receiver and all that data is stolen in one or two decibels of light. Without interrupting transfer flow, you can read everything going on on an optical network,” said Everett.

The loss is so small, said Everett, that anyone who notices it might attribute it to a loose connection somewhere along the line. “They wouldn’t even register someone’s tapping into their network,” he added.
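For context on the decibel figures quoted above, dB losses convert to power fractions by the standard relation 10^(−dB/10). The sketch below applies that formula to the “one or two decibel” losses Everett describes; it says nothing about real tap hardware, only the arithmetic:

```python
# Convert an optical loss in decibels to the fraction of power diverted,
# using the standard relation: remaining_power = 10 ** (-dB / 10).
def fraction_diverted(loss_db):
    """Fraction of optical power removed for a given dB loss."""
    return 1 - 10 ** (-loss_db / 10)

for db in (0.5, 1.0, 2.0):
    print(f"{db} dB loss -> {fraction_diverted(db):.1%} of the light diverted")
```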

Once it’s gathered, the data gets sifted. Most of it is discarded, but the filters pull out material that touches on one of the 40,000 search terms chosen by the NSA and GCHQ — that’s the content the two agencies inspect more closely.
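Nothing public describes how the agencies’ filters are actually implemented. Purely as an illustration of the filter-then-discard idea described above, keyword-style selector matching might look like this (the terms and records below are invented):

```python
# Toy illustration of selector-based filtering: discard most traffic,
# retain only items that touch a watch list. The search terms and
# intercepted records here are made up for the example.
SEARCH_TERMS = {"example-selector", "another-selector"}

def matches_watchlist(record, terms):
    """Return True if any watch-list term appears in the record."""
    text = record.lower()
    return any(term in text for term in terms)

intercepted = [
    "routine traffic with nothing of note",
    "message mentioning example-selector in passing",
]
retained = [r for r in intercepted if matches_watchlist(r, SEARCH_TERMS)]
print(retained)  # only the matching record survives the filter
```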

The British anti-surveillance group Privacy International has filed a lawsuit against the U.K. government, arguing that such practices amount to “blanket surveillance” and saying that British courts do “not provide sufficiently specific or clear authorization for such wide-ranging and universal interception of communications.” Their argument is that the existing surveillance laws are from the phone-tapping days and can’t be applied to modern, large-scale electronic data collection.

“If their motivation is to catch terrorists, then are there less intrusive methods than spying on everyone whose traffic happens to traverse the U.K.?” said Eric King, head of research at Privacy International.

Meanwhile, GCHQ has defended its practices by saying that it is merely looking for a few suspicious “needles” in a giant haystack of data, and that its techniques have allowed it to uncover terrorist plots.


If groups like Privacy International are successful, it may put an end to the capture of domestic Internet data within the U.K., but as NSA expert Matthew Aid recently told me, since 80 percent of the fiber optic data flows through the U.S., it wouldn’t stop the massive surveillance operations here or in other countries — even if the person on the sending end was British.

It’s also worth noting that this type of tapping has been going on for years — it’s just that we’re now newly getting worked up about it. In 2007, the New York Times thus described President Bush’s expansion of electronic surveillance: “the new law allows the government to eavesdrop on those conversations without warrants — latching on to those giant switches — as long as the target of the government’s surveillance is ‘reasonably believed’ to be overseas.”

Want to avoid being a “target” of this “switch-latching”? A site called “Prism-break” recently released a smorgasbord of encrypted browsing, chat, and email services that supposedly allow the user to evade government scrutiny.

The only platform for which there is no encrypted alternative is Apple’s iOS, proprietary software for which the site had this warning:

“You should not entrust neither your communications nor your data to a closed source device.”


Pentagon Factors Operational Energy into Acquisition


by KRIS OSBORN on JULY 25, 2013

The Pentagon has taken steps to formally integrate “energy efficiency” as a metric in the requirements process it uses for procurement and weapons development, senior officials said.

Energy performance has now become a mandatory key performance parameter in a part of the Pentagon’s developmental process known as Joint Capabilities Integration Development System, or JCIDS, said Sharon E. Burke, Assistant Secretary of Defense for Operational Energy Plans and Programs.

The JCIDS process, a collaborative analytical look at acquisition and development, is used to establish and refine requirements for procurement programs, often paving the way forward to next-step milestones in the process.

“We’ve become a force that uses a lot more energy than it used to,” she said. “We’re building energy efficiency into modernization. We have a long way to go because with a lot of the platforms that are entering the acquisition process — their fuel consumption is quite a bit higher. We’re increasing our fuel demand.”

In particular, Burke mentioned increasing power demands for next-generation electronics, ships, aircraft, weaponry and developmental items such as the Navy’s prototype Laser Weapon System that depends upon large “pulses” of energy to operate.

There are more than 300 operational energy initiatives across the Department of Defense, Burke explained. About $1.6 billion was spent on these programs in fiscal year 2013 and, depending upon budget developments, as much as $2 billion is expected for fiscal year 2014.

The Pentagon office for Operational Energy Plans and Programs was stood up by the National Defense Authorization Act of 2009 in response to requests from fielded forces, members of Congress and Pentagon leaders.

One analyst described this effort as something of a double-edged sword, saying it may bring great rewards but also runs the risk of adding too many requirements to an already taxed procurement process.

“On one hand, you are looking across the entire force and doing an in-depth analysis. This effort can bring lower costs, better performance, improved operational flexibility and a reduced logistics tail — which can save lives,” said Daniel Goure, vice president of the Lexington Institute, a Va.-based think tank.

“At the same time, are we now adding more requirements, more oversight and more reviews to a process that many believe is already too long and too cumbersome? Will this add complexity to getting stuff out the door?”

Some members of Congress have at times criticized the military’s operational energy platform, pushing back on various “green” efforts on the grounds that some of them may raise costs. Many members have also raised questions about the cost of specific programs, such as the Navy’s effort to power the fleet with biofuels.

Overall, the drawdown in Afghanistan, which leaves forces and Forward Operating Bases more distributed and dispersed, and the “re-balance” to the Asia-Pacific underscore the unyielding appetite for greater energy efficiency in combat circumstances and across increasingly greater distances, Burke explained.

While saving money by increasing energy efficiency remains a huge part of the calculus in today’s budget environment, the tactical and logistical advantages provide an edge on the battlefield, Burke explained.

“Anti-access/Area denial means that the supply chain is fully in play in the battlefield. That is going to be true going forward,” she said. “How do you build energy performance into the future force, which will have much bigger fuel requirements and much more sophisticated anti-access challenges? What are your options for making energy an advantage rather than a limiting factor?”

Considering these dynamics and the need for longer-dwell intelligence, surveillance and reconnaissance and longer-range strike capability necessitated by A2/AD, energy considerations are a key part of the equation when it comes to Pacific re-balance and the stepped up development of unmanned systems across the services.

“Unmanned systems give you a totally different way of looking at energy security. Not only is there much lower fuel costs but you can be a little more experimental with the way you power them,” Burke said.

Meanwhile, supply lines, fuel and energy efficiency have proven to be of paramount importance during the land wars in Iraq and Afghanistan, Burke added.

For example, when increased efficiency means fewer convoys are needed to deliver fuel to remote outposts in Afghanistan, both expenditures and the logistical footprint shrink.
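The logistics arithmetic behind that point can be sketched quickly. All of the figures below are invented for illustration; they are not drawn from the article:

```python
import math

# Back-of-the-envelope sketch of how a fuel-efficiency gain shrinks the
# convoy requirement. All numbers here are hypothetical.

def convoys_per_month(daily_fuel_gal, truck_capacity_gal, trucks_per_convoy):
    """Convoys needed each month to keep an outpost supplied with fuel."""
    monthly_demand = daily_fuel_gal * 30
    truckloads = math.ceil(monthly_demand / truck_capacity_gal)
    return math.ceil(truckloads / trucks_per_convoy)

baseline = convoys_per_month(20_000, 5_000, 10)   # current fuel demand
improved = convoys_per_month(16_000, 5_000, 10)   # 20% efficiency gain

print(baseline, improved)  # 12 10 -> two fewer convoys on the road each month
```

Even with these made-up numbers, the point generalizes: every convoy removed from the schedule is fuel money saved and drivers kept off vulnerable supply routes.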

At the same time, greater fuel efficiency for ships, UAS and aircraft will help offset what many refer to as the “tyranny of distance” – the vast geographical expanse characteristic of the Pacific theater.

Building energy efficiency parameters more formally into the process will help weapons and program developers perform the needed integration earlier in the developmental process, thus reducing costs and risks typically associated with the acquisition process.

As a result, Burke and other senior Pentagon officials have been working with combatant commanders and service acquisition personnel on this integration effort.

“You want to be able to put a value on better energy performance, so you need to know the value of a technology in operation. What does it cost you to support that technology? What is the unit cost? The only way you can know this is if you have the right analysis to bring to the process,” Burke said.

The Pentagon has already had some successes with the development and implementation of energy-efficient emerging technologies across the services. The effort spans a wide range of technologies from small, portable solar-powered blankets and lightweight batteries for the Army to hybrid-electric Navy amphibious assault ships and much more in between, Burke explained.

In addition, one key example of the approach to build energy efficiency more formally into the acquisition process is found in the ongoing procurement of the Air Force’s Combat Rescue Helicopter, a new helicopter program designed to replace the service’s currently serving HH-60.

“In the RFP [Request for Proposal] we were looking for better energy performance. It will be a criteria in the contract,” Burke explained.



B-52 CONECT: A reboot for the Digital Age

Posted 7/26/2013 Updated 7/25/2013

by Airman 1st Class Joseph Raatz

Air Force Global Strike Command Public Affairs


7/26/2013 – BARKSDALE AIR FORCE BASE, La. (AFNS) — One thing is certain: it’s not your father’s B-52.

The B-52 Stratofortress has been the Air Force’s star long-range strategic heavy bomber and premier standoff weapon for more than 50 years. For generations, the B-52 has successfully executed missions all over the globe.

But in the 21st century, the pace of things has accelerated beyond the wildest dreams of the original designers who first put plans for the aircraft on the drawing board more than 60 years ago.

“Things change so quickly now, that you simply can’t take 20- to 30-hour-old data into the fight with you any longer,” said Alan Williams, the deputy program element monitor at Air Force Global Strike Command.

With digital display screens, computer network servers and real-time communication uplinks, the B-52 of the future will be far removed from the final batch that was delivered to Minot Air Force Base, N.D., in 1962.

The Combat Network Communications Technology, or CONECT, program will help thrust the B-52 into the modern era.

“Now the crews will be able to do final mission planning en route,” Williams said. “They will be able to get targeting updates; they will be able to get intelligence updates, all while they are en route so that they can get the most current data.”

The beyond line of sight, or BLOS, communications ability introduced in the CONECT upgrades will allow for a central air operations center to pass along updated threat and targeting data to the aircraft for rapid machine-to-machine retargeting, rather than having the crew and mission be dependent solely upon information that was available at take-off.

“The aircraft will be much more effective and safer for the crew because of being able to receive those threat and target updates,” Williams said, adding that CONECT will also allow the aircrew to receive last-minute updates so that they are able to strike the most current or necessary targets and do it rapidly because of the new machine-to-machine targeting capability.

CONECT also brings an unprecedented networking ability to the B-52.

“It provides us with a digital backbone so that we can pass data all the way around the aircraft,” Williams said, explaining that with the upgrades, any data available to one crew member will now be available to every other member instantaneously via the new digital displays at each workstation.

These new upgrades will provide a foundation that may help guarantee the aircraft’s viability through the remainder of its life span, which is currently projected to extend beyond 2040.

“Now when we add additional systems to the aircraft at some future date, we will be going from a digital component, across our new digital backbone, to another digital component elsewhere in the aircraft,” Williams said. “In the future, it will make upgrades easier to do because we’ll already have that digital infrastructure in the aircraft.”

Williams summed up the CONECT upgrades by saying they would help convert the B-52 from an analog aircraft to a digital platform for today’s warfighter.

“It is taking the B-52 from a rotary-dial phone to a smartphone,” Williams said.

With the CONECT upgrades in place, the B-52 will be well-equipped to enter the Digital Age. In doing so, the aircraft will continue to be an adaptable powerhouse for decades to come.


Carter, Winnefeld to Brief House Panel on SCMR Findings


Jul 26, 2013

By Marcus Weisgerber


Two top Pentagon officials are set to appear before House lawmakers next week to discuss the findings of the Strategic Choices and Management Review, an effort that will help determine how the US Defense Department operates amid budget cuts.

Deputy Defense Secretary Ashton Carter and Adm. Sandy Winnefeld, vice chairman of the Joint Chiefs of Staff, are scheduled to appear before the House Armed Services Committee on Aug. 1. Pentagon officials have yet to give a detailed look inside SCMR, which defense insiders have nicknamed the “skimmer” or “scammer.”

Lawmakers have been pressing DoD for several months to give them insight into the review, which began in March. The project is supposed to tee up decisions for programs, missions and troop level changes depending on the level of spending cut levied on the Pentagon.

Defense Secretary Chuck Hagel in July told senators that the SCMR “seeks to adhere … to the tenets” of DoD’s Pacific-focused Defense Strategic Guidance.


S&T Community Evaluating Funding Gaps For FY-15 POM Planning

The office of the Air Force’s chief scientist has been working through a budget exercise focused on identifying gaps between funding and priority science and technology investment areas.

The Air Force’s former chief scientist, Mark Maybury, released a forward-looking report earlier this summer titled “Global Horizons” that looks at opportunities, threats and emerging capabilities that the service’s science and technology community needs to address now and over the next two decades. Mica Endsley, the service’s new chief scientist who assumed her role in June, told Inside the Air Force after an Air Force Association event in Arlington, Va., on July 11 that her office has been using the work compiled in “Global Horizons” to identify which priorities are funded and where the service may need to shift its focus to invest in these capabilities.

“One thing we’ve just gone through and done is an analysis of where the gaps are for the [fiscal year 2015 program objective memorandum] to identify where we need to make some strategic changes to try and address these high-priority items,” Endsley said. “It’s really trying to direct our current efforts to make sure that we’re making these trade-offs.”

Endsley would not discuss where some of the shortfalls lie but said that some of the service’s plans laid out in the “Global Horizons” report and identified as near-term priorities — to be accomplished between FY-13 and FY-17 — are not currently budgeted for.

“Some of these things were already slated and in the budget, so they’re already marching ahead,” Endsley said. “There are other areas where we may want to redirect funds.”

One priority investment area involves the service rethinking its approach to and design of autonomous systems. Endsley said during the event that today’s automated systems lack human-machine teaming — an approach to automation that plays on the strengths of the machine to process and deliver data and the ability of the human operator to take that data and use it to make decisions.

“One of the things we’ve found in the past is that you can’t just throw automation at a problem and solve it,” Endsley said. “In fact, you can make a problem worse if you’re not very careful about how you do that. So what we really need to focus on are effective, synergistic teamings of people and automated agents in order to make this more successful.”

Endsley said that traditional automated systems have left humans out of the loop with very little understanding of the processes a machine is performing, which makes the operator less likely to recognize when a problem arises and respond to it in a timely manner. In many cases, she said, the reason for inaction is that the operator lacks full awareness of the situation because the processes aren’t transparent or are hard to understand — even for people who are actively trying.

“We’ve found that this type of automation puts people at a disadvantage because it makes you a passive processor of information as opposed to actively processing,” Endsley said. “And when you’re a passive processor, you don’t really understand what’s going on.”

The solution to this, she said, is an automated system that is trustworthy and serves as an aid to a process. Endsley told ITAF that one area where the service needs to improve its automation is data exploitation. The service’s command-and-control systems collect large amounts of very detailed information, she said, but its processes for integrating and distributing that data need to be improved.

“We have to be able to integrate it. We have to be able to process video data, for example, and recognize what key information is happening,” she said. “That’s rapidly going to increase the ability to have detailed manual analysis.”

This new approach to automation will require both modernizing existing systems and creating new autonomous systems. She told ITAF that, even in the near term, the service is going to need to take a dual-minded approach to the issue.

She added that one problem the science and technology community faces in this and other research areas is that the Air Force invests so much of its budget in operations and maintenance. It will take an initial investment to reshape and expand the service’s use of automated systems, she said, but these systems will save the service money and time in the long term.

“If we can improve the efficiency of some of these systems, then we can do a lot to reduce these costs,” Endsley told ITAF. “We can do it with two airmen instead of 10 airmen to get a particular job done. I think that’s something we have to take a good look at.” — Courtney Albon



What They Told Us: Reviewing Last Week’s Key Polls

Rasmussen Reports


Saturday, July 27, 2013


President Obama attempted this week to shift the nation’s attention away from what he calls “phony” scandals and back to the economy.

He is again pushing Congress to prime the pump with more spending, but most voters (62%) continue to think the government should cut spending in reaction to the nation’s economic problems. Not that there’s much optimism: During budget negotiations in the spring, just 15% thought any plan agreed to by the president and Congress would really cut federal spending.

The economy continues to send off mixed signals. The Rasmussen Consumer and Investor Indexes, which measure daily confidence, are both down from recent highs but still well ahead of where they’ve been in the years since the Wall Street meltdown.

Homeowners continue to express optimism about the housing market. Only 12% now think the value of their home will go down over the next year. That’s the lowest finding since Rasmussen Reports began regular tracking on this question in April 2009.  Just over half of U.S. homeowners still say their home is worth more than what they owe on it.

Yet while 55% of Americans think the institution of marriage is Very Important to U.S. society, they see that institution at risk in the current economy. Only 29% believe that in a family with children, it is good for both parents to work full-time.

A federal judge on Wednesday cleared the way for Detroit’s bankruptcy to move ahead, and 74% of Americans think other major cities may soon be following the same path to bankruptcy. Twenty-five percent (25%) think the federal government should provide bailout funding for cities with serious financial problems.

Just 26% of Likely U.S. Voters now say the country is heading in the right direction. That’s down from a high of 43% the week just before Election Day and the lowest level of confidence since mid-January 2012.

Only 17% think U.S. public schools now provide “a world-class education,” down from 26% in August 2011 when the president first set this as the goal to achieve. But then just 25% think most high school graduates have the skills needed for college. Only 22% believe most of these graduates have the necessary skills to go into the workforce.

The president’s daily job approval ratings have tumbled from the euphoric highs surrounding his reelection to the levels found for most of his first term.

As the rollout of the president’s national health care law stumbles along, voters continue to give high marks to the health care they now receive but are more pessimistic than ever about the short-term future of the health care system in this country. Just 24% now expect the U.S. health care system to get better over the next couple of years.

Obama and his supporters may characterize the Internal Revenue Service’s targeting of Tea Party and conservative groups as a “phony” scandal, but 59% of voters still think it’s likely the president or his top aides were aware of what the IRS was up to. Sixty-six percent (66%) believe the decision to target the groups came out of Washington, DC, with 26% who think it was made by someone at IRS headquarters and 40% who think it was decided by someone at the White House. Belief in the Washington connection is unchanged from a month ago.

In an era when there is increasing skepticism about crony capitalism, voters are more critical of the president’s policies toward small business and continue to believe he favors big business instead.

Like responses to the George Zimmerman verdict itself, voters give the Obama administration mixed reviews for its reaction to the outcome of the Trayvon Martin case. But voters share the president’s concern about racial profiling. 

Still, only 24% of Americans believe Zimmerman’s actions which led to the shooting death of Martin were motivated primarily by racism. Just 21% believe he should now be charged with a hate crime by the U.S. Department of Justice.

Voters overall continue to believe the U.S. justice system is fairer to blacks and Hispanics than it is to Americans in general. Forty-six percent (46%) of all voters think the U.S. justice system is generally fair to black and Hispanic Americans, but 87% of black voters disagree. Voters are almost evenly divided when asked if the U.S. justice system is fair to most Americans: 43% say yes, 41% no, and 16% are not sure. Just 36% say the system of justice in this country is fair to poor Americans.

Forty-four percent (44%) of Americans agree with the jury’s verdict that Zimmerman was not guilty in the shooting death of Martin. Thirty-five percent (35%) disagree, including 80% of black Americans.

But 52% of all voters think if all of the other facts were the same as those in the Zimmerman/Martin case, the jury would have found a black shooter not guilty of murder if the victim was white.  Thirty-one percent (31%) disagree and believe the jury would have found the black shooter guilty of murder. Among blacks, however, 79% think the jury would have found the black shooter guilty. Sixty percent (60%) of whites believe the jury would have reached the same not guilty verdict. 

Just 19% of all voters believe it’s a good idea for states, cities, organizations and individuals to boycott Florida over the Zimmerman verdict.

Most voters (51%) do not know if their state is one of the many that have a “stand your ground” self-defense law. Voters favor such laws by a 45% to 32% margin, but given the lack of awareness on the subject, opinion is unlikely to be settled at this time.

In other surveys last week:

– Democrats have regained their lead over Republicans on the Generic Congressional Ballot.

– Eighty-one percent (81%) of voters consider a candidate’s personal life at least somewhat important to how they will vote. Fifty-seven percent (57%) say they are less likely to vote for a candidate who has been involved in a sexual scandal.

Even before the latest news of sexual “texting” by New York City mayoral candidate Anthony Weiner broke this week, half of the city’s registered voters had an unfavorable opinion of him. Fifty-one percent (51%) have an unfavorable view of Eliot Spitzer who resigned as governor of New York in 2008 following exposure of his regular use of prostitutes. Spitzer is now running for city comptroller, New York’s chief financial officer.

– Forty-eight percent (48%) of Likely U.S. Voters believe research by private businesses seeking a profit will do more to meet the U.S. need for affordable energy sources than research by the government. Thirty-eight percent (38%) believe government research is the better way to go.

– Following outrage over Rolling Stone magazine’s decision to put a sympathetic picture of the Boston Marathon bomber on its cover, 62% of Americans think the media pay too much attention to the personal lives of violent criminals.

Most Americans see Rolling Stone’s decision to put the Boston suspect on its cover as a publicity stunt, but they recognize that the magazine has a right to publish what it chooses.

July 20 2013




DHS’s Napolitano Resigns: The Impact

Secretary Leaving to Head University of California System

By Eric Chabrow, July 12, 2013. Follow Eric @GovInfoSecurity


The resignation of Janet Napolitano as homeland security secretary could have an adverse impact on the nation’s cybersecurity policy, at least temporarily, considering the posts of deputy secretary and deputy undersecretary for cybersecurity remain vacant.

“It is never good when leadership positions are vacant, as there is a loss of momentum for any initiatives under way,” says Karen Evans, who served as de facto federal chief information officer in the George W. Bush White House.

Napolitano announced on July 12 that she is leaving as homeland security secretary to become president of the University of California system. She plans to leave the department by early September, according to a letter she sent to staff.

Deputy Secretary Jane Holl Lute and Deputy Undersecretary for Cybersecurity Mark Weatherford resigned this spring [see DHS's Mark Weatherford Resigning]. Unlike the secretary and deputy secretary positions, Weatherford’s former job does not require Senate confirmation. That means a candidate for that job could be named before Napolitano departs.

“They have candidates for all the positions and are very close to announcing selections,” says James Lewis, senior fellow at the Center for Strategic and International Studies, a Washington think tank. “All the selections are good.”


Meeting Milestones, Remaining Calm

Patricia Titus, the onetime chief information security officer at DHS’s Transportation Security Administration, says the acting officials are performing excellently in their temporary roles. “The career employees have been meeting milestones, remaining calm and carrying on with the work at hand,” Titus says. “Filling these key roles will be imperative, however, there certainly are still great people working hard.”

Rand Beers, undersecretary for national protection and programs, is acting deputy secretary; Bruce McConnell, senior counselor for cybersecurity, is filling Weatherford’s old job temporarily; and Bobbie Stempfley, deputy assistant secretary, is the interim assistant secretary for cybersecurity and communications, replacing Michael Locatis, who resigned in January [see DHS Losing a Senior Cybersecurity Leader].

Jay Carney, the president’s press secretary, says he has no names to float for a Napolitano replacement. “The president will be very deliberate about looking at potential successors for that very important position,” he says.


Cybersecurity Focus

During Napolitano’s tenure, cybersecurity became a national priority and a core mission at DHS. Here’s how Napolitano put it in her third annual address on the state of homeland security delivered in February:

“There’s perhaps no better example of how DHS has evolved to address new and evolving threats than the issue of cybersecurity. The cyber realm wasn’t even a major focus of the early department. Now, it is one of our five core mission areas.”

Sen. Jay Rockefeller, the West Virginia Democrat who has cosponsored comprehensive cybersecurity legislation, worked closely with Napolitano, “particularly when few people were thinking about this issue. It took no time for her to understand the enormous consequences of inaction.”

Cybersecurity should play a significant role during confirmation hearings for Napolitano’s and Lute’s successors. Questions from senators could focus on the role of government and DHS in protecting the mostly privately owned national critical infrastructure.

“However, I don’t expect that any serious nominee will have trouble with these questions, in light of the conventional wisdom about the importance of the private sector leading the way,” says Allan Friedman, research director of the Brookings Institution’s Center for Technology Innovation.

Another line of questioning could focus on recent disclosures about National Security Agency programs collecting information on American citizens. “We can expect questions about how they will work with the defense and intelligence community, but I would be surprised if this became a serious political obstacle,” Friedman says.


A Top Spokesperson

Since her confirmation as secretary in 2009, Napolitano had become one of the administration’s top spokespersons on cybersecurity. That’s especially true in testimony before Congress, where, because of executive privilege, the White House’s cybersecurity coordinator and other Obama advisers will not testify.

But being a face for administration cybersecurity policy doesn’t mean the policy is where it should be. “The big problems at DHS are lack of authorities and lack of a clearly defined mission,” CSIS’s Lewis says.

Brookings’ Friedman picks up on that theme, saying the absence of any clear strategic program in cybersecurity could be seen as a failure or a success.

“On one hand, there is no clear sign or accomplishment to demonstrate progress made in the last four years,” Friedman says. “On the other hand, the failure of large initiatives was largely a failure of legislation, and DHS has been instrumental at coordinating numerous initiatives across the federal government. In retrospect, small improvements and programs might be seen as better than a failed comprehensive plan.”


New Blood

Larry Clinton, chief executive of the industry group Internet Security Alliance, points out that Napolitano endorsed the administration’s earlier position that the government could regulate the cybersecurity of critical private businesses in certain instances, a position groups like his opposed and the White House eventually abandoned.

The administration’s new position, reflected in President Obama’s cybersecurity executive order and backed by Napolitano, would have the federal government, working with industry, develop cybersecurity best practices that critical infrastructure operators could voluntarily adopt, an initiative being carried out by the National Institute of Standards and Technology [see Man Behind the Cybersecurity Framework ].

“It may be a good thing to make the change now so a new person who is more committed to the new model can be brought on and help it through its current formative stage and remain in place as it is implemented,” Clinton says.

Day-to-day activities are not likely to be adversely affected by the vacancies, but high-level relations could be, says Evans, who now heads the U.S. Cyber Challenge. “There are good career staff in place who will continue the administration’s priorities,” she says. “But you do need the leadership there in the meetings with the DoD and intelligence community to ensure the balance of all the agencies as well as the private sector.”


Web industry officials balk at domain expansion plan

By Craig Timberg, Published: July 14


A plan to dramatically expand the number of possible Web addresses — by adding more than 1,000 new domains such as “.buy,” “.casino” and “.gay” — could cause widespread disruption to Internet operations, say some industry officials.

Efforts to augment existing domains such as “.com” and “.gov” have been underway for several years and are entering a critical new phase as industry officials meet at an international conference that began Sunday in Durban, South Africa. By summer’s end, the new domains could be going live at a pace of 20 or more each week.

The plan has touched off a scramble among investors eager to gain control of the virgin Internet real estate, potentially worth billions of dollars in annual licensing fees. But a vocal group of critics is calling the speed and scale of the expansion reckless, given its possible impact on the Internet’s global infrastructure, which relies on interactions among computer networks owned by companies, universities and individual users.

Particularly troubling is the possibility of widespread “name collisions” that could happen when domains used by internal corporate computer systems — such as “.corp” or “.home” — get assigned to the Web more broadly. This could cause systems to fail, blocking access to e-mail or other internal programs, and also could open sensitive information to theft, some experts say.

“This could affect a million enterprises,” said Danny McPherson, chief security officer for Verisign, which is based in Reston and manages several of the most popular existing domains. “It could absolutely break things.”

McPherson and other security experts say the nonprofit group that oversees the designation of Web addresses, the Internet Corporation for Assigned Names and Numbers (usually known by its acronym, ICANN), has not done enough study on the impact of the new domain names and does not have procedures in place to respond quickly if systems malfunction. Among those posing risk could be domains such as “.med” or “.center” that might be critical to the functioning of medical systems or emergency-response networks.
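The collision mechanism the critics describe can be reduced to a toy model. Real resolvers are far more complicated (search lists, caching, negative answers), and every host name and address below is invented for illustration:

```python
# Toy model of a DNS "name collision" (all names and addresses invented).
# Enterprises often use private suffixes such as ".corp"; if such a suffix
# is later delegated as a public TLD, a resolver that consults the public
# DNS first can hand internal traffic to an outside party.

INTERNAL_ZONE = {"mail.corp": "10.0.0.5"}   # company's private DNS records

def resolve(name, public_tlds):
    """Prefer a public answer whenever the name's TLD exists publicly."""
    tld = name.rsplit(".", 1)[-1]
    if tld in public_tlds:            # collision: the public delegation wins
        return "203.0.113.9"          # whoever registered the name publicly
    return INTERNAL_ZONE.get(name)    # otherwise fall back to internal DNS

# Before the expansion, ".corp" is not a public TLD: internal answer is used.
print(resolve("mail.corp", {"com", "net"}))           # 10.0.0.5

# After ".corp" is delegated, the identical query leaks outside the company.
print(resolve("mail.corp", {"com", "net", "corp"}))   # 203.0.113.9
```

In this simplified picture, nothing inside the company changes; the same query that used to stay on the internal network is silently answered by an external registrant, which is why experts warn of both outages and data exposure.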

Similar concerns have been expressed by the Association of National Advertisers, which represents hundreds of major companies, and the Internet commerce site PayPal, which issued a letter in March saying, “The potential for malicious abuse is extraordinary, [and] the incidental damage will be large even in the absence of malicious intent.”


Defenders of the plan have called such fears overblown, arguing that the potential problems have been long understood and will be resolved before new domains are approved. Because the new domains will be released gradually, over the course of months, there will be time to manage problems as they arise, said Jeffrey Moss, chief security officer for ICANN.

“It’s not like it’s a runaway train without recourse,” Moss said. “We’re not going to do anything that harms the security or stability of the Internet.”

U.S. officials who oversee Web security issues through the Commerce Department’s National Telecommunications and Information Administration expressed confidence in the management of the domain program, issuing a statement saying, “We would expect these issues to be discussed and resolved within the ICANN multistakeholder process.”

Whoever wins control of the new domains will be allowed to sell licensing rights for the resulting new Web addresses, typically for annual fees, with a portion going to fund ICANN, which is based in Southern California. Just bidding for a domain costs $185,000.

Donuts Inc., an investment group that made the largest number of bids, with 307, said Verisign’s criticism of the process for launching the new domains was a result of self-interest. Verisign controls the popular “.com” and “.net” domains, giving it a degree of market power that could be diluted if new ones gain widespread acceptance.

“ICANN was created in large part to break Verisign’s monopoly over domain names,” Donuts spokesman Mason Cole said in a statement. “Now that the organization is on the verge of achieving that goal, it’s not surprising that Verisign is uncomfortable.”

Verisign officials say they support the program for adding new domains but believe the rollout should proceed more cautiously than currently planned.

The stakes are high in an era when a large and growing share of the world’s economic activity happens over the Internet. Even traditional brick-and-mortar businesses use online systems to communicate, manage inventories and interact with customers. Many also count on the security of networked computer systems to protect lucrative intellectual property and other valuable strategic information.

Moss, the ICANN security chief, acknowledged that some internal corporate systems will malfunction as new domains are created, and he said it would be the responsibility of company officials to resolve these problems.

“We want everything to work, and we’re going to try to make everything work, but we can’t control everybody’s networks on the planet,” he said.

Moss said the number of domains likely to cause problems is a “really, really small number.”

But critics have said it is irresponsible for ICANN to approve new domains before it knows the extent of the problems they would create and has plans in place to fix them. The cost of repairing systems — or the loss of security — would be borne by private companies that in most cases have little to gain from the hundreds of new Internet domains.

In addition to expressing such security concerns, corporate leaders have been complaining that the sheer number of new domains will cause a sharp rise in fraud and abuse as criminals buy up Web addresses intended to deceive consumers. Already, many companies attempt to defend against this by acquiring multiple Web addresses that include their corporate names. But that will become far more difficult, they say, with hundreds of new domains, including “.corp,” “.inc” and “.sucks.”

“If everything ran perfectly, this would extraordinarily transform the Internet,” said Dan Jaffe, executive vice president of the Association of National Advertisers. “There is every reason to believe that, as of now, there could be serious problems.”





New Data Link Enables Stealthy Comms    

Defense News

Jul. 14, 2013 – 04:36PM

By AARON MEHTA         


WASHINGTON — Pentagon officials have long identified the F-35 joint strike fighter as key to the future of America’s defense, in large part due to stealth capabilities that should allow the plane to travel in contested environments that older fighters would struggle to penetrate.

The problem is, these planes need to talk to each other without sacrificing stealth. To tackle that problem, the F-35 has incorporated Northrop Grumman’s Multifunction Advanced Data Link (MADL), a system that’s undergoing testing in the California desert.

MADL is a digital waveform designed for secure transmission of voice and data between F-35s, with the potential of linking F-35s to ground stations or other aircraft, Northrop said.

Think of the system as a computer. The communications, navigation and identification (CNI) system on an F-35 can manage 27 different waveforms, including MADL. The data comes through the antenna, is turned into digitized bits, and is crunched by the on-board systems to get the relevant information to the pilots.

The system will be included in the 2B software package that the US Marine Corps’ F-35B jump-jet variant and the US Air Force’s F-35A conventional version will use when they reach initial operating capability in 2015 and 2016, respectively. It also will be included in all international versions of the jet. The US Navy’s F-35C carrier variant is expected to reach IOC in 2019 with the block 3F software, which will incorporate MADL and other capabilities.

What makes MADL more than just a communications tool is its ability to connect with other planes and automatically share situational awareness data between fighters. The more planes in the network, the more data is shared and the more comprehensive the resulting picture.

Picture a group of jets flying in formation. The pilot farthest to the right will have a different situational awareness picture than the pilot on the left. But once they’re networked, all the information is automatically shared among the pilots.

Prior to takeoff, planes would be designated with partners to form the network. When a plane gets within range, the network is automatically created.
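The behavior described above, pre-designated partners automatically merging their situational pictures once they come within range, can be illustrated with a toy simulation. This is purely illustrative: the range threshold, class names and data structures here are invented for the sketch and have nothing to do with Northrop's actual design.

```python
import math

class Jet:
    """Toy model of a fighter that shares data with pre-designated partners."""
    def __init__(self, name, x, y):
        self.name = name
        self.pos = (x, y)
        self.partners = set()   # designated before takeoff
        self.picture = {name}   # contacts this jet currently knows about

LINK_RANGE = 50.0  # invented threshold; real acquisition range is not public

def distance(a, b):
    return math.hypot(a.pos[0] - b.pos[0], a.pos[1] - b.pos[1])

def update_network(jets):
    """Merge situational pictures for any designated partners within range."""
    for a in jets:
        for b in jets:
            if b.name in a.partners and distance(a, b) <= LINK_RANGE:
                shared = a.picture | b.picture
                a.picture = b.picture = shared

lead = Jet("lead", 0, 0)
wing = Jet("wing", 30, 10)
lead.partners.add("wing")
wing.partners.add("lead")
lead.picture.add("bogey-east")   # only lead sees this contact at first
update_network([lead, wing])
print(sorted(wing.picture))      # ['bogey-east', 'lead', 'wing']
```

Once the two toy jets are within the link range, the wingman automatically inherits the lead's contact, mirroring the "network in the sky" idea Gough describes below.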

“Like on your computer, your network into the local area, we’re building that network in the sky and it’s keeping up with all the dynamics and spatial changes,” said Bob Gough, director of CNI technology and programs at Northrop. “MADL has the smarts to keep up with all of that and keep the network in place so they can share the same data.”

Gough declined to say how close jets need to be to trigger the network link, but did say tests have shown “very fast” acquisition times once within range.

Live flight system tests at Edwards Air Force Base, Calif., began late last year and have continued throughout this year. Initially, the tests involved networking a pair of planes, but recently, test pilots began regularly flying four-plane networks. Those tests are proceeding smoothly, said Joe DellaVedova, a spokesman for the Pentagon’s F-35 Joint Program Office.

“MADL testing is performing as planned,” DellaVedova wrote in an email. “Development of the advanced data link is currently tracking to deliver the phased capability expected by the end of development.”

The system is designed for plane-to-plane communications only, something Gough expects to continue in the near term. But he did not rule out experimenting with data transfer to other terminals.

“We have postulated MADL terminals on ships and we have built a MADL test ground station, so it could be done,” he said. “But it’s more about the logistics of where F-35s will be flying and how close to the ground they would be. It would be mission-scenario dependent, but it’s all technically possible.”

In the long term, Northrop hopes to expand the technology to other fifth-generation planes. That’s not a new idea; in 2008, MADL was slated to go on the F-22 Raptor fighter and B-2 bomber. But it never went on those jets, something the former Air Force chief of staff, Gen. Norton Schwartz, blamed on the technology’s lack of maturity during congressional testimony in 2009.

“We believe as the flight test program matures, it will be more likely” to end up on other platforms, Gough said.

That could include using MADL to communicate between fifth-generation fighters like the JSF and fourth-generation fighters, such as an F-16. Gough said he hopes to begin research on fifth-to-fourth generation data transfers “as soon as” next year.



Where’s the Best Place to Put a Wind Farm?

Scientific American

Sunday, July 14, 2013


If you want to generate electricity from the sun or wind, it makes sense to go where it’s sunniest and windiest. But part of the reason to generate electricity from such renewable resources is to cut back on the pollution that comes from burning fossil fuels. And if you take that into account, the best places change.

Researchers at Carnegie Mellon University totaled up the health and environmental benefits of renewable electricity across the U.S. Such benefits range from decreased respiratory disease thanks to less soot all the way to mitigation of climate change.

And when you consider all those factors, a solar panel in New Jersey delivers more overall benefits than photovoltaics in far sunnier Arizona.

The difference comes down to what the renewable power replaces. For example, a wind turbine in West Virginia that cuts down on coal burning avoids 33 times as much health and environmental damage as would the same wind turbine in California. The analysis is in the Proceedings of the National Academy of Sciences.

So where’s the best place to put renewable power? According to this study, we need to be building wind farms in Indiana. They’ll fit in nicely among the corn.

—David Biello


Climate change will disrupt energy supplies, DOE warns

Wendy Koch, USA TODAY 8:42 a.m. EDT July 12, 2013


U.S. energy supplies will likely face more severe disruptions because of climate change and extreme weather, which have already caused blackouts and lowered production at power plants, a government report warned Thursday.

What’s driving these vulnerabilities? Rising temperatures, up 1.5 degrees Fahrenheit in the last century, and the resulting sea level rise, which are accompanied by drought, heat waves, storms and wildfires, according to the U.S. Department of Energy.

“It (climate change) is a very serious problem and it will get worse,” says Jonathan Pershing, who oversaw the report’s development. While impacts will vary by region, “no part of the country is immune,” he says. He adds that climate change is exacerbating extreme events.

“Sea level rise made Sandy worse,” Pershing says, noting that it intensified flooding. When the superstorm slammed the East Coast last year, it took down power lines, damaged power plants and left millions of people in the dark.

The report comes one week after President Obama, describing climate change as a threat to future generations, called for action to address the problem “before it’s too late.” He said he aims to cut heat-trapping greenhouse gas emissions from new and existing power plants.

Echoing other research, the DOE report makes the case for why such reductions are needed. It says coastal power plants are at risk from sea level rise and power lines operate less efficiently in higher temperatures.

“The report accurately outlines the risks to the energy sector in the United States” and should serve as a “wake-up call,” says Jennifer Morgan, deputy director of climate and energy at the World Resources Institute, a non-profit that advocates for sustainability.

The report cites prior climate-related energy disruptions. Last year in Connecticut, the Millstone Nuclear Power Station shut down one reactor because the temperature of water needed to cool the facility — taken from the Long Island Sound — was too high. A similar problem caused power reductions in 2010 at the Hope Creek Nuclear Generating Station in New Jersey and the Limerick Generating Station in Pennsylvania.

Reduced snowpack in the Sierra Nevada mountains last year cut California’s hydroelectric power generation 8%, while drought caused the U.S. Army Corps of Engineers to stop the transport of oil and coal along the Mississippi River, where water levels were too low, according to the report. Also, in September 2010, water levels in Nevada’s Lake Mead fell to a 54-year low, prompting a 23% loss in the Hoover Dam’s generation.

While climate change is not the sole cause of drought, climate scientists say rising temperatures can exacerbate it by causing more moisture to evaporate from the soil. They say those temperatures, which the third federal National Climate Assessment says could rise 3 degrees to 10 degrees Fahrenheit by 2100, will contribute more to drought in the future.

In Texas, which is suffering a three-year drought that now affects 87% of its land, conflicts are arising over the water-intensive process of extracting oil or natural gas from shale deposits, known as hydraulic fracturing, or fracking. In 2011, Grand Prairie became the first city in the state to ban the use of city water for fracking. Other municipalities have restricted water use for that purpose.

Nationwide, 47% of fracking wells are in water-stressed areas, according to a report in May by Ceres, a Boston-based non-profit that promotes corporate sustainability.

The DOE report cites research indicating that nearly 60% of current thermoelectric power plants, which need water cooling to operate, are located in water-stressed areas.

It says higher temperatures will boost the demand for air conditioning, which could threaten energy security by forcing the nation’s power system to operate beyond ranges for which it was designed. It cites a study by DOE’s Argonne National Laboratory that found such peak demand, given current population levels, will require additional electricity equal to 100 new power plants.

The dire tone of the DOE report, while warranted, can “give a reader a sense of fatigue,” says Joe Casola, a senior scientist at C2ES, formerly the Pew Center on Global Climate Change. Yet he says it also points to solutions such as water-efficient technologies and protection for energy infrastructure.

“It’s technologically within our means to address some of these issues now,” Casola says. “There are a lot of things we can do.”

DOE’s Pershing agrees. “It’s a problem we need to work on,” he says. He notes that the billions of dollars in losses already incurred from climate-related disasters show the need for additional measures.



Combat air forces to resume flying

from Air Combat Command Public Affairs


7/15/2013 – JOINT BASE LANGLEY-EUSTIS, Va. — Combat Air Forces units from multiple commands will begin flying again July 15 after many stopped flying in April of this year due to sequestration.

The restored flying hour program represents $208 million of the $1.8 billion reprogramming allocation authorized by Congress. The money re-instates critical training and test operations for the CAF fleet across the Air Force for the remainder of FY13. This impacts not just Air Combat Command units, but also CAF units assigned to United States Air Forces Europe and Pacific Air Forces.

For ACC, the restored flying hours will be allocated to combat aircraft and crews across the command’s operational and test units, including the Air Warfare Center’s Weapons School, Aggressors and the Thunderbirds aerial demonstration team.


While the return to the skies means a return to crucial training and development for pilots, navigators, flight crews, mission crews and maintainers, the leader of the Air Force’s CAF fleet cautions that this is the beginning of the process, not the end.

“Since April we’ve been in a precipitous decline with regard to combat readiness,” said Gen. Mike Hostage, commander of Air Combat Command. “Returning to flying is an important first step but what we have ahead of us is a measured climb to recovery.”

“Our country counts on the U.S. Air Force to be there when needed–in hours or days, not weeks or months,” General Hostage said. “A fire department doesn’t have time to ‘spin up’ when a fire breaks out, and we don’t know where or when the next crisis will break out that will require an immediate Air Force response.”

The restoration of flying hours only addresses the next two and a half months of flying, up until October 1.

“This decision gets us through the next several months but not the next several years,” the general said. “While this paints a clearer picture for the remainder of FY13, important questions remain about FY14 and beyond. Budget uncertainty makes it difficult to determine whether we’ll be able to sustain a fully combat-ready force.”

Additionally, the restoration comes at a cost to future capability, including reduced investment in the recapitalization and modernization of the combat fleet.

“We are using investment dollars to pay current operational bills, and that approach is not without risk to our long-term effectiveness,” General Hostage said. “We can’t mortgage our future. America relies on the combat airpower we provide, and we need to be able to continue to deliver it.”


Why the enterprise can’t shake its email addiction

Forget new (and better) technologies — email is as entrenched in the business world as it’s ever been. Here’s why we can’t break free.

Howard Baldwin

July 15, 2013 (Computerworld)


Atos CEO Thierry Breton caught a lot of flak last year when he announced he wanted his employees to give up email, but he may have been onto something.

Kids these days don’t use email — digital market research company comScore found that use of Web-based email dropped 31% among 12- to 17-year-olds and 34% among 18- to 24-year-olds in the period between December 2010 and December 2011.

And consumers in general are also off email. The Radicati Group, which tracks use of email and other messaging media, projects the number of consumer emails will decrease by 3% to 4% each year between 2012 and 2016 (see chart, below right).

Then again, there was a reason Breton came in for so much derision: Enterprise email isn’t going anywhere. Or, more precisely, enterprise email usage isn’t going anywhere but up. Radicati is projecting the number of business emails to increase by 13% every single year between now and 2016.
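Radicati's two projections compound in opposite directions, and a quick back-of-the-envelope calculation shows how far they diverge over the forecast window. The starting volumes below are normalized to 100 purely for illustration; only the growth rates come from the figures above.

```python
def project(volume, annual_rate, years):
    """Apply a constant annual growth rate (negative for decline)."""
    for _ in range(years):
        volume *= 1 + annual_rate
    return volume

base = 100.0  # normalized starting volume, not a real message count

# Business email at +13% per year over four years:
print(round(project(base, 0.13, 4), 1))    # 163.0

# Consumer email at roughly -3.5% per year over the same span:
print(round(project(base, -0.035, 4), 1))  # 86.7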

For businesspeople, that means more time scrolling through the inbox (not only on PCs and laptops but now on tablets and smartphones) clicking past newsletters, social media notifications and spam in search of the messages they truly need to do their jobs, and then later filing, archiving and retrieving those messages.

For IT, that means more complaints from users about storage limits being too low (especially when Google lets them keep everything), as well as worries about security, archiving, retention, e-discovery, deletion and syncing mail between mobile devices. And then there’s the cost: In 2010, Gartner estimated that the various costs tied to email add up to $192 per user per year.

Why do we subject ourselves to this madness? Because for all its aggravations, email works. “It’s still an efficient way of communicating, almost in real time,” says Phil Bertolini, CIO of Michigan’s Oakland County, who’s responsible for 10,000 email boxes.

“It does what it’s designed to do quite well, which is allow us to securely communicate on a one-to-one or one-to-few basis,” says Rob Koplowitz, an analyst at Forrester Research.

Simply put, we may hate email, but we can’t work without it. Still, CIOs and messaging experts agree that something must change if enterprise email volume is going to boom the way Radicati’s numbers indicate. Email is going to have to get more sophisticated and, at the same time, easier to use. And the people doing the using, who often make life harder for themselves, need to evolve, too.


Why We Love Email

We love email because it’s useful and ubiquitous. It keeps us connected and updated without requiring sender and recipients to be online at the same time, thanks to its asynchronous nature. Everyone doing business today can reasonably be expected to have an email address, whereas only some people use alternative tools like chat, videoconferencing or SMS texting.

Beyond that, email creates a de facto audit trail with a record of who sent what to whom when. And, barring space limitations, that trail is readily available on one’s computer.

The result of this success? “Nobody can live without it for more than two minutes,” says Sara Radicati, president and CEO of The Radicati Group.

From Unix mail (b. 1972), IBM PROFS (b. 1981) and DEC All-In-1 (b. 1982) to email clients, integrated email (think Lotus Notes) and Web-based mail to today’s cloud-based options, email has evolved because we have needed it.

Bertolini is a big fan of email — since the public sector is still heavily paper-based, email still counts as a big technological step forward. “We can chase new technologies, but I need something that’s trusted and used by the masses. Even though there are people clamoring for newer ways to communicate, email is our main form of communication,” he says.


Why We Hate Email

Unfortunately, email’s positives — its utility and ubiquity — have become its negatives as well.

Consider this complaint: “It doesn’t matter if the message comes from a spammer hawking Viagra, your wife asking you to pick up some wine, your boss telling the company that Monday is a holiday, or a client asking for a meeting at his office at 11 a.m. In today’s inboxes, all email messages are equal,” journalist Om Malik wrote six years ago, in 2007. If anything, the situation has only gotten worse.

The problem, says Koplowitz, is that “we use email for things it wasn’t designed to do.” Hooked on email, users default to it for scheduling, workflow planning, resource management, archiving, document management, project management and even knowledge management. Often, ideas that should be shared widely are locked up in an email chain among a narrow list of recipients. “The things it does poorly have become problematic,” Koplowitz sums up.

Over the years, developers have tried to break through users’ dependence on email with software that’s more sophisticated and better suited to certain enterprise tasks — often with only limited success.

Knowledge management systems, touted in the 1990s as the next big thing, failed to catch on, while collaboration systems such as Lotus Notes and Microsoft SharePoint have met with mixed success; Salesforce’s inclusion of Chatter, for instance, serves the specific needs of salespeople.

But typically these systems have failed to become as widespread as email because, while they offered a solution that may indeed have been superior to email, they did so only for a narrow population of users.

“There’s a high correlation in the success of these tools when they’re aligned with recognizable business value,” says Koplowitz. Unfortunately, he adds, there’s frequently an organizational mismatch. The tools that work for one department (e.g., sales) may not work for another (e.g., customer service).

And when a new communication tool like Yammer or Chatter does take hold throughout the enterprise, what happens? Users route their notifications to the one place they’re most likely to see them first — the omnipresent email inbox.


IT’s Email Burden

For IT, email is an ongoing headache. Niraj Jetly, CIO at Edenred USA, the Newton, Mass.-based U.S. division of a global developer of employee benefits and incentive solutions, cites a quartet of hassles: the sheer volume of messages; compliance and security concerns; the risks that arise when users access corporate email on their personal devices; and international routing problems.

“No one can support ever-increasing mailbox sizes,” he says. “At the same time, we have to ensure the safety and security of sensitive data being transmitted. We have to ensure the availability of emails archived by users on their laptops or desktops.”

As a divisional CIO within a multinational organization, Jetly also says getting email from continent to continent is a challenge. “It gets very tricky when different government [regulations] and private-sector contracts restrict email routing,” he explains. For instance, certain Payment Card Industry Data Security Standard mandates require that emails originating in the U.S. stay in the U.S.

The bring-your-own-device trend also worries him. “If an organization needs encrypted email but also supports BYOD, supporting access to corporate email on personal devices becomes a never-ending challenge,” Jetly says. “And if a user loses a personal device, who has liability for the loss of data?”

Pete Kardiasmenos, a systems architect at SBLI USA, manages the New York-based insurance company’s Exchange servers and gets involved with “anything relating to email.” His biggest issue: users turning to free external email systems, such as Yahoo Mail and Gmail, to circumvent corporate storage limits.

“They don’t have bad intentions. They want to know why they’re limited to 500 megabytes when Gmail is unlimited. It’s because the more space you have, the more time backup takes, the more complicated disaster recovery is. We have to constantly communicate our policies,” he says. Like a lot of big enterprises, SBLI USA has had to block access to public email systems from company-owned computers as a security measure, and it has had to limit space in Exchange for most users because of the cost of storage.

Even then, he says, email is still a headache. “People keep email in their inbox the same way they keep files on their desktop, to keep them handy. They send the same file back and forth as an attachment until you have 10 versions that you have to store.”

For Oakland County’s Bertolini, management is the challenge — managing passwords, and managing Outlook’s .pst backup files when they get too big. At least, he says, when those files get too large, they start to generate error messages. “We find out about it when [users] have a problem,” Bertolini says with a sigh.

“In one case, we discovered thousands of emails dating back to 2001,” Bertolini recalls. “And the real problem is that most of them dealt with trivia like meeting for lunch. There’s a cost to maintaining and managing email over time.”

IT’s biggest email-related burden is simply uptime, says Radicati. “The overriding concern for IT is making sure that it’s up and running and available,” she says.


Human Roadblock

Email’s People Problem


Is the enterprise’s email addiction rooted in technology or in user behavior? Both, analysts say. “Email is only as good as the person who organizes it,” observes Sara Radicati, president and CEO of The Radicati Group, which tracks use of email and other messaging media.

Over the years, enterprise email systems have added an ever-increasing number of sophisticated organizational tools, but “users still have to train the system, which is where it breaks down,” Radicati explains. “Users forget how they set it up a certain way, and why. Somebody who is highly organized and structured will do well with these tools, and someone who is naturally chaotic will be chaotic.”

Adam Glick, Microsoft’s product manager for Exchange and Outlook, acknowledges that “you can change the tools, but you can’t change the people.” Citing one example of how the tools are changing, he notes that the current version of Office 2013 has an option that lets users ignore any email with a particular subject line if that thread has become irrelevant to the recipient. On a grander scale, Exchange and Outlook are becoming more of a communication hub, with greater integration of chat and unified communications, Glick says.

But all those advances will be meaningless if people don’t take advantage of the new functionality — and IT must help them do that.

“IT needs to explain how and when to use these features,” says Radicati, “and people need to learn to improve their efficiency.”


— Howard Baldwin


Email in the Cloud

So what’s IT supposed to do? Certainly, the cloud offers one of several ways to view email differently. Radicati is optimistic about email in the cloud. “It’s absolutely the way to go,” she says. “A lot of cloud-based email providers have archiving and compliance capabilities in place, and if you want more features, you can purchase them as an additional capability.”

In Oakland County, Bertolini is investigating using Microsoft Office 365 in the cloud. “There’s still a cost associated with storage, but part of our ROI analysis will be comparing the cost of storage in the cloud versus letting people keep more email,” he says, adding that he’s worried that if “you give them more storage, they will fill it up.”

But he also sees other advantages. “If I can host email externally and still have the safety and security the county government needs, I can save millions in the long term. We’d need two to three people to manage Microsoft Exchange, but if I go to the cloud, I don’t need those people. And in three or four years, I’m not replacing my mail servers.”

Still, questions remain. “A lot of IT departments are investigating moving email to the cloud,” Radicati says, “but there is still concern about whether it will be private enough, secure enough and reliable enough.”


Merging Communications Tools

Like many systems IT has to deal with, email’s boundaries are expanding, which means IT needs to begin thinking about email less as a silo and more as one component of a multimodal communications system.

Bertolini notes that the new generation of employees clamors for instant messaging — and he’s not against it. “They use it to collaborate. When they have chat, they can get things done in real time.” He’s also looking at more videoconferencing, first on a one-to-one basis from desktop to desktop, and then from conference room to conference room, and then into a multipoint video arraignment system for the public safety team, because it saves having to transport the county’s prisoners among facilities.

Fortunately, these communication mechanisms will start to merge, analysts predict. Two to five years from now, email won’t look tremendously different, but we won’t talk about it as a stand-alone tool as much as we do today, says Radicati. Instead, we’ll have a communications dashboard that includes email, instant messaging and social media.

These hubs will come about thanks to new open APIs, not only for social media applications like Facebook and LinkedIn, but also for unified communications protocols like Session Initiation Protocol (SIP) and Extensible Messaging and Presence Protocol (XMPP).

Forrester’s Koplowitz concurs. “Over the next few years, we’ll see greater integration across these tools. Think about how messaging is integrated into Gmail — you don’t have to switch back and forth because they’re all integrated together,” he says, citing similar functionality in systems from IBM (with Connections and Notes), Microsoft (with SharePoint and Yammer) and Facebook.

“We’ll have a new environment with new aspects of communication,” Koplowitz predicts. “Today they’re different tools, but in the next three to five years, they’ll be integrated.”

A Silicon Valley-based freelance writer, Baldwin is a frequent Computerworld contributor.

This version of this story was originally published in Computerworld’s print edition. It was adapted from an article that appeared earlier on



The tech that will make drones safe for civilian skies

By Patrick Marshall

Jul 12, 2013

Unmanned aircraft have proven their capabilities in Afghanistan and Pakistan, undertaking reconnaissance and combat missions without putting the lives of pilots at risk.  And now they’re coming home. “We are not darkening the skies yet,” said Richard Christiansen, vice-president of NASA contractor Sierra Lobo Inc., “but we are poised.”

Unmanned aerial vehicles (UAVs) are already used in domestic airspace. Police departments have tested them for surveillance operations, for example, and state environmental departments have used them to survey forests and wildlife. And the Department of Homeland Security has a fleet of eight drones tasked to monitor activity at borders.

So far, however, these deployments have only been permitted under carefully monitored exemptions to Federal Aviation Administration rules. But the FAA plans to integrate unmanned aircraft into civilian airspace by 2015, and it is currently in the process of selecting six locations around the country to explore the potential extent of such integration.

The major barrier to wider deployment is that current FAA rules require the pilot of a UAV to maintain line-of-sight contact with the aircraft. If that limitation is removed, and UAVs are integrated into civilian airspace, analysts expect the market to grow rapidly. An industry trade group, the Association for Unmanned Systems International, projects sales of $90 billion over the next decade.

In the meantime, the rush is on to develop “sense-and-avoid” systems that will allow untethered flights.

Researchers agree that the basic technologies are already available to deliver effective collision warning and avoidance systems.  The challenges, they say, are primarily in engineering and systems integration.

“We know we can technically do it,” said Sanjiv Singh, research professor at Carnegie Mellon University’s Robotics Institute. “The question is whether we can do it within all of the other constraints.” And when it comes to UAVs, especially small UAVs, the primary constraints are size and weight.

Whether a sense-and-avoid system uses electro-optical cameras, laser radar (LIDAR) devices or transponders, the challenge is to make the devices small and light enough to be deployed on small UAVs. “It’s getting close,” said Ian Glenn, CEO of ING Robotic Aviation, a Canadian manufacturer. “We’re making them smaller and smaller. Absolutely the technology will get there.”

ADS-B taking off

According to Glenn, the simplest way to protect against mid-air collisions — whether the aircraft has a pilot or not — is to require the use of ADS-B transponders on all aircraft.  “These transponders can turn an uncooperative environment into a cooperative environment,” he said.

ADS-B (automatic dependent surveillance-broadcast) transponders not only broadcast aircraft location in real time, they also deliver information on altitude and velocity. What’s more, they can deliver data on other aircraft, weather and terrain to the vehicle that is carrying them. In fact, ADS-B transponders will replace radar as the primary technology for tracking air traffic,  and the FAA will require the majority of aircraft operating in U.S. airspace to be equipped with ADS-B by Jan. 1, 2020.
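The kind of state information an ADS-B broadcast carries can be illustrated with a short sketch. The field names, the `AdsbReport` structure and the separation check below are hypothetical simplifications for illustration, not drawn from any avionics API; real ADS-B messages are compact binary frames defined by industry standards.

```python
import math
from dataclasses import dataclass

@dataclass
class AdsbReport:
    """Illustrative subset of the state an ADS-B transponder broadcasts."""
    icao: str            # aircraft identifier
    lat: float           # latitude, degrees
    lon: float           # longitude, degrees
    altitude_ft: float   # barometric altitude, feet
    ground_speed_kt: float

def horizontal_separation_nm(a: AdsbReport, b: AdsbReport) -> float:
    """Rough great-circle distance in nautical miles (spherical-earth haversine)."""
    lat1, lon1, lat2, lon2 = map(math.radians, (a.lat, a.lon, b.lat, b.lon))
    h = (math.sin((lat2 - lat1) / 2) ** 2 +
         math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * math.asin(math.sqrt(h)) * 3440.065  # mean earth radius in nm

# Two aircraft a few miles apart near Ottawa (made-up positions)
uav = AdsbReport("C-UAV1", 45.32, -75.67, 1200, 45)
plane = AdsbReport("C-GABC", 45.40, -75.60, 1500, 120)
print(round(horizontal_separation_nm(uav, plane), 1))
```

Because every equipped aircraft broadcasts this state continuously, any receiver in range can maintain a live picture of nearby traffic without radar.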

Using ADS-B transponders on many UAVs was infeasible until recently, when Sagetech Corp., an avionics company based in White Salmon, Wash., developed one that weighed only 3.5 ounces.

Last November, ING Robotic Aviation equipped one of its 22-pound Serenity UAVs and a manned Harvard Mark IV, a fixed-wing, single-engine aircraft operated by Canada’s National Research Council, with Sagetech ADS-B transponders and Clarity receivers. The two aircraft were then flown for 90 minutes above an Ottawa air park.

“We were able to prove that we could fly, and they could see us a long way away,” said Glenn. “We were able to pick up [data on] traffic coming to Ottawa even as it was crossing New York’s border.”

Even better, he said, the Clarity receiver is independently powered, which means that even older aircraft can adopt the technology. “One of the arguments by regulating agencies is, ‘Well, that’s great, but we have people flying Sopwith Camels. They don’t even have power supplies, and they don’t have a radio,’” said Glenn. “Well, here’s a little box that you can stick on the dashboard, with a battery that will last longer than your bladder. And it has its own little Wi-Fi and it’ll display on an iPad.”

One system for all sizes

Until all aircraft are equipped with ADS-B transponders and receivers, of course, UAV sense-and-avoid systems must rely on other technologies to detect other aircraft and to undertake evasive maneuvers.

One of the most ambitious civilian efforts under way to develop an integrated system for UAVs is the Mid-Air Collision Avoidance System (MIDCAS), which is being developed by five European countries — Sweden, Germany, France, Italy and Spain — and 11 industrial partners.

The four-year, $65 million project is expected by 2014 to deliver an automated sense-and-avoid system that will not depend on transponders. While it is being designed to integrate ADS-B, MIDCAS also includes two visible-band electro-optical cameras and one infrared camera for aircraft to use in identifying other aircraft. In addition, the team’s developers are designing image-processing algorithms, processing units and integration with on-board avionics.

Key to the project, said Saab Aerosystems’ Johan Pellebergs, MIDCAS project manager, is developing a generic set of sensors and processing modules. “By generic, we mean that it should be able to work on any type of UAS,” Pellebergs said. “It should be adaptable. So we try to keep all of the vehicle-specific parts well contained so that they can easily be adapted to all the different types. The variety in UAS is very big, ranging from the Global Hawk, which is very big, all the way down to small ones that you can hold in your hand.”

Pellebergs said the international team has developed a prototype system and is ready to test it on a manned aircraft. “The collision avoidance part is fully automatic,” he said. “The remote pilot does not need to do anything. If the system detects something, it calculates when it needs to activate. And when the aircraft gets to that point, it triggers and executes the moves automatically.”

It is the system’s control over evasive maneuvers that requires adaptability to each model of UAV. “That’s where the vehicle specifics come in,” Pellebergs said. “You need to be able to model the performance and limitations of each of the vehicles. There are large differences between air speed and maneuverability in these vehicles.”

That’s one reason MIDCAS is working closely with manufacturers of UAVs and sensors.

Another challenge has been designing the software to process the various sensor data. According to Pellebergs, “The data fusion module takes the information from different sensors and makes one picture. Then it is sent over to the avoid part, where you calculate the maneuvers and execute them. It also sends information down to the ground control station.”
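The pipeline Pellebergs describes can be sketched in miniature: several sensors report the same intruder, a fusion step merges them into one estimate, and that fused "picture" feeds the avoid logic. Everything below — the sensor names, the confidence weighting and the range threshold — is an illustrative assumption, far simpler than a real system like MIDCAS, which also associates detections into tracks over time.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Detection:
    sensor: str         # e.g. "EO", "IR", "ADS-B"
    bearing_deg: float  # direction to the intruder
    range_m: float      # estimated distance
    confidence: float   # 0..1, sensor-reported quality

def fuse(detections: List[Detection]) -> dict:
    """Confidence-weighted merge of per-sensor estimates of one intruder."""
    w = sum(d.confidence for d in detections)
    return {
        "bearing_deg": sum(d.bearing_deg * d.confidence for d in detections) / w,
        "range_m": sum(d.range_m * d.confidence for d in detections) / w,
    }

def needs_avoidance(track: dict, threshold_m: float = 500.0) -> bool:
    """Hand the fused track to the avoid logic once range closes below a threshold."""
    return track["range_m"] < threshold_m

# One snapshot: the optical and infrared cameras disagree slightly
snapshot = [
    Detection("EO", 31.0, 480.0, 0.9),
    Detection("IR", 29.0, 520.0, 0.6),
]
track = fuse(snapshot)
print(needs_avoidance(track))  # fused range is about 496 m, so True
```

The design point the quote makes is the separation of concerns: fusion produces one consistent picture, and the avoid module consumes only that picture, never the raw sensor feeds.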

Low-altitude hazards

Of course, the hazards for aircraft — manned and robotic — aren’t limited to other aircraft. 

“There are very few things above 500 feet to run into,” Singh said. But when landing, taking off or operating under 500 feet — which is where many UAVs are designed to spend most of their flight time — aircraft face many hazards, including trees, buildings and wires. “You have to go up and come down,” noted Singh. “I think that last hundred feet is pretty important.”

Accordingly, Singh has set his sights, in a sense, lower. “I work on the aspect of UAVs flying intelligently so that they can fly in what we call ‘near-earth environments.’ They are aware of their environment, they are aware of what they can do, they are aware of environmental conditions like wind, and then they plan their actions in such a way that they can stay safe.”

And being closer to the ground introduces other challenges. “Maybe you need to fly close to things, so the GPS is blocked by trees and buildings,” he said. “Maybe you need to operate in dusty conditions or at night. The problem is complex.”

As a result, Singh is working to integrate a variety of sensors and to develop the software to make them usable in UAVs. In addition to the visual sensors (cameras and infrared imagers), Singh is working to incorporate far infrared (effective for detecting features through fog or rain), radar (which can penetrate obstacles) and LIDAR (which is effective in detecting contours of objects). 

Singh was part of a team that recently enabled a full-size, autonomous helicopter to fly at low altitude, avoid obstacles, choose a landing site in unmapped terrain and successfully land. In June 2010, the team tested the sensor and navigation system at a Boeing test facility in Mesa, Ariz. Employing a laser scanner and 3D mapping software, the unmanned helicopter was able to avoid a 60-foot crane and high-tension wires, as well as other smaller obstructions, such as four-inch-high pallets, chain-link fences, vegetation and even people.

Government in the way?

While there is a long way to go before completely autonomous UAVs can safely operate in all environments and conditions, researchers say the basic technologies are already in place that would allow for widespread deployments right now if government would move to set standards.

“The real challenge is not technological,” Glenn said. “The real challenge is regulatory acceptance. I think we’re close enough. The key is that we are able to be as good as manned aviation. So the issue is how to get federal aviation authorities around the world to get their minds around it.”

According to Glenn, if government regulatory agencies would specify the performance standards UAVs need to meet, he can design appropriate equipment. “You tell me what the requirement is, and I will build it,” he said. 

Pellebergs agrees. “No one really knows what the requirements are for sense-and-avoid for UASes in civilian airspace, so we need to get a set of standards in place,” he said. “I think that’s what’s holding up a lot of the progress in this area.”

Unfortunately, the Federal Aviation Administration declined our requests for comment.


Colorado town to consider drone hunting license, bounty

Dylan Stableford, Yahoo! News

July 17, 2013

A small town in Colorado is considering an ordinance that would create a license and bounty for hunters to shoot down drones.

“We do not want drones in town,” Phillip Steel, the Deer Trail, Colo., resident who drafted the ordinance, told Denver’s ABC7 affiliate. “They fly in town, they get shot down.”

Steel’s proposal, recently submitted to the town board, calls for a $25 drone hunting license and outlines “rules of engagement” for hunters looking to shoot down the unmanned aerial devices:

The Town of Deer Trail shall issue a reward of $100 to any shooter who presents a valid hunting license and the following identifiable parts of an unmanned aerial vehicle whose markings and configuration are consistent with those used on any similar craft known to be owned or operated by the United States federal government.

Steel said that while he’s never seen a drone flying in Deer Trail, the ordinance is a “symbolic” one.

“I do not believe in the idea of a surveillance society, and I believe we are headed that way,” he said.

According to the U.S. Census Bureau, Deer Trail’s population was 559 in 2011.

“They’ll sell like hotcakes,” Steel said of the proposed drone license. “It could be a huge moneymaker for the town.”

David Boyd, one of Deer Trail’s seven board members, supports the drone ordinance.

“Even if a tiny percentage of people get online (for a) drone license, that’s cool,” Boyd said. “That’s a lot of money to a small town like us. Could be known for it as well, which probably might be a mixed blessing, but what the heck.”

There’s even talk of the town—which claims to be home to “the world’s first rodeo”—hosting the world’s first drone hunt. “A skeet, fun-filled festival,” town clerk Kim Oldfield said.

The board will consider the drone hunting ordinance on Aug. 6.


Air Force 2027: Fewer pilots, more drones, more challenges

AF Times

Jul. 12, 2013 – 06:00AM

By Stephen Losey

The Air Force of the future is likely to be slightly smaller and more reliant on remotely piloted aircraft, face growing challenges from the rise of Asia and rapidly increasing space traffic, and struggle to maintain its technological superiority as the United States produces fewer scientists, engineers and other highly skilled graduates.

And to survive, former Chief Scientist Mark Maybury said in a June 21 report, the Air Force is going to need to adopt several “game-changing” strategies to keep up with emerging challenges and global threats between now and 2027. That could include adopting speedy acquisition strategies from commercial space companies such as SpaceX, making greater use of lasers and other directed energy weapons, and adopting advanced manufacturing techniques such as 3-D printing.

“If we presume the future is going to look like today, I think we are going to be sorely mistaken,” Mica Endsley, the Air Force’s new chief scientist, said at a July 11 breakfast hosted by the Air Force Association. “The future environment, even in the next decade, has some really significant potential threats that we need to be aware of and we need to be thinking about carefully.”

In the report, called “Global Horizons: United States Air Force Global Science and Technology Vision,” Maybury said that the Air Force’s manned air fleet is likely to shrink slightly by 2027. But the Air Force’s fleets of remotely piloted aircraft and their missions are likely to grow significantly.

The United States also needs to pay attention to the worldwide proliferation of RPAs as their cost plunges, Endsley said.

“It’s not just us, and that’s the big change that we need to take a look at,” Endsley said.

The report said that most RPAs today are designed for intelligence, surveillance and reconnaissance, but nations are investing significantly in combat drones. Some may even be capable of delivering weapons of mass destruction such as chemical, biological or nuclear weapons, and the U.S. needs to develop ways to detect and defeat them, the report said.

Advanced fighter aircraft are also likely to spread throughout the world, the report said, which could shrink the U.S. air superiority advantage. By 2025, the report said, 70 percent of foreign combat air forces will be made up of modern fourth- or fifth-generation aircraft, and adversaries are also likely to gain advanced systems, such as advanced missiles and other munitions that could challenge the Air Force.

One possible way the Air Force could counter other nations’ improving air capabilities is to develop and deploy high-powered lasers, the report said.

“Nothing moves faster than light, and advances in efficiencies, power levels, thermal management and optics made directed energy weaponry a game-changing contender,” the report said.

Adopting a more adaptive architecture for aircraft, including “plug-and-play” modular components like sensors and seekers, will allow the Air Force to quickly upgrade weapons systems, Maybury said. However, this approach also carries its own threats unless the Air Force builds in strong cybersecurity.

“Agility has to extend into our technologies,” Endsley said. “We can’t just build a system that is going to be stationary, that maybe we’ll upgrade someday in the future. We have to make sure the systems we’re building are built … in a modular way, in such a way that they can be modernized and upgraded very efficiently.”

And to make its acquisition process more nimble, Maybury said the Air Force of the future needs to learn a few lessons from SpaceX, Scaled Composites and other small, private space exploration companies.

The Air Force’s current acquisition process is incapable of producing innovative systems quickly and affordably, the report said. And the increasing complexity of integrating advanced technology into aircraft such as the F-35 will likely further slow the development process in the future. This “threaten[s] to erode the current decisive advantage” the Air Force now enjoys over its adversaries, Maybury said.

Maybury said the Air Force needs to emulate the rapid prototyping processes used by SpaceX and Scaled Composites, which he said produce aerospace vehicles 50 percent faster than under traditional acquisitions. SpaceX produced the Dragon capsule, which last year became the first commercial vehicle to dock with and deliver supplies to the International Space Station, and Scaled Composites won the Ansari X Prize for its SpaceShipOne, the first private manned spacecraft.

The Air Force needs to refocus its prototype efforts to provide early proofs of concepts and reduce technical uncertainty, the report said. And emphasizing technology demonstrations and creating open challenges could lead to more innovative technological breakthroughs, fill gaps in the Air Force’s capabilities, reduce risk aversion and energize the workforce, the report said.

“The key to doing science and technology well is making sure we build it on a platform that involves rapid innovation and effective testing,” Endsley said. “A lot of what we’ll be looking at is how can we enable that, both within our labs and in industry, to make sure that we are building these systems in effective ways, instead of investing a lot of money and then finding out way down the line that something’s not working like you thought it would.”

Maybury also said the Air Force could cut its development cycle time by 25 percent by using advanced, physics-based modeling and simulation tools. Those tools could help the Air Force assess how feasible and expensive it would be to integrate technologies into a system, identify technology that isn’t ready to be incorporated into systems, quantify risk at critical decision points, and avoid discovering defects late in the development process.

The report also said that using small, low-cost launch capability being developed by commercial industries such as SpaceX would allow the Air Force to more easily access space. This will become more important as space becomes increasingly competitive and congested, and more nations launch satellites and other spacecraft.

The nation’s eroding manufacturing base endangers the Air Force’s ability to design, develop, manufacture and deploy reliable and advanced technologies, the report said. But the rise of 3-D printing could help counter that decline. For example, 3-D printing could allow researchers to rapidly print parts needed for prototypes. And airmen in the field could 3-D print parts quickly to repair battle-damaged systems.

And the Air Force should also expand its use of a flexible hiring authority program, called the Laboratory Personnel Demonstration Project, to the entire acquisition workforce, the report said. This would allow the Air Force to hire scientists, engineers and other technical workers 70 percent faster than under the normal hiring process, which sometimes takes five months or more.

But Maybury is also concerned about the increasing competition worldwide for top scientists and engineers, and the United States’ decline in science, technology, engineering and mathematics, or STEM, graduates. This could erode the nation’s advantage in producing new technologies, he said.


Sequester, Part II, Gives Obama Control of the Budget

By ERIC PIANIN, JOSH BOAK, The Fiscal Times, July 16, 2013

Congress appears to be sliding toward a second year of sequestration—meaning the slashed budgets of this past year will soon receive an equally unpopular sequel. That might just mean more power for President Obama, as the executive branch must juggle budgetary choices that Congress refuses to make. The same political gridlock that set off the decade-long sequestration earlier this year has only worsened in recent months. House Republicans and Senate Democrats have proven incapable of agreeing to the basic contours for a 2014 budget. Nor can they work together on immigration reform. And with the Farm Bill, the fate of the food stamp program remains up in the air.

What was once unthinkable—broad cuts in defense and domestic discretionary spending—has become standard operating procedure. The administration still bemoans the sequester cuts that were introduced in a failed attempt to force a compromise on deficit reduction in the 2011 Budget Control Act. Sequestration has had a serious impact on some government employees and recipients of federal aid, but it never kicked the legs out from under the recovery. In fact, the economy is expected to improve as the spending reductions mount. And if Congress cannot provide an alternative to the sequester, it may unwittingly give the White House more control than it already has over how federal dollars are spent.

President Obama has not engaged House Republicans on replacing the sequester, while the issue has been on the backburner until Congress must confront the need to increase the government’s borrowing authority this fall. By October, Congress must also finalize a budget for Obama, or continue to lean on continuing resolutions. “I don’t see a viable majority in the two houses of Congress in favor of getting out of it in a way that the president would accept,” said William Galston, a budget expert with the Brookings Institution. “And as far as I can tell, there hasn’t been a lot of serious discussion for months.”

“I’d have to say at the moment the chances are in favor of a second round because it is the status quo and it’s been very difficult for Congress to change the status quo,” said Robert Bixby, executive director of the Concord Coalition, a budget watchdog group. “They would have to change the law, and agreement has been awfully hard to come by.”

Congressional leaders and the administration could still surprise the nation and scratch out a deal. But any compromise would likely have to preserve more than $1 trillion worth of deficit savings over 10 years. The government can still survive on ad hoc measures that raise the debt ceiling on a short-term basis, while enabling the White House to blunt the impact of sequestration.

For sure, neither side relishes having another crisis like the near default on the U.S. debt in the summer of 2011. But Obama has pretty much abandoned hopes for a “Grand Bargain” of spending cuts, tax increases and entitlement reforms to keep the government on a long-term path to deficit reduction.

At the same time, many on Capitol Hill have concluded that the nearly $90 billion worth of sequester cuts that began to take hold this year weren’t nearly as harmful to the economy or government programs as Obama and others claimed they would be.

Outside analysis by the Federal Reserve backs up this sentiment. When the spending cuts and government employee furloughs began, other parts of the economy—such as a rebound in home prices—kept growth going. Fed officials noted last month that gross domestic product should increase by more than 2 percent this year, despite the 1.5 percent hit caused by sequestration.

And while congressional budget leaders and appropriators would much prefer to make decisions on future savings, there is a growing tendency to either kick the can down the road, or essentially leave it up to department and agency heads to figure out how to absorb the mandated cuts that will once again total about $90 billion for the fiscal year starting on Oct. 1.



Steve Bell, a former GOP Senate budget adviser and now a senior director at the Bipartisan Policy Center, contends that Congress has been gradually and inadvertently “handing control of the purse over to the executive branch” by allowing sequestration to continue. Relatively anonymous bureaucrats are now making budgetary choices that once belonged to committee chairmen.

“I wonder when people realize that a series of continuing resolutions, plus the sequester together, has taken most of the influence Congress would have over the agencies and has really damaged that relationship,” Bell told The Fiscal Times.

The arrangement might not increase the level of discretionary spending, but it creates a scenario where the White House should have more discretion over where the money goes. Bell said that conservative House Republicans—especially the newest members, who have blocked efforts to reach bipartisan agreements—are in effect handing over power to the executive branch to make many of those decisions.

“It’s just the opposite of what they think,” Bell said. At the same time, Obama would be able to capitalize on both a shrinking deficit and continued economic growth. Fed officials expect the economy to strengthen in 2014 and 2015, estimating that GDP will increase by more than 3 percent next year despite the second round of sequestration coming. Macroeconomic Advisers, a private forecaster, estimated that GDP would climb by 3 percent next year and 3.4 percent in 2015 as the economy accelerates “past the epicenter of fiscal restraint,” according to a June 10 commentary.

Secondly, the size of the sequestration cuts as a share of the economy will shrink over time because much of the harshest chopping has already occurred. While the absolute size of discretionary spending will decline, the rate of this drop-off will slow after this year, according to May estimates by the Congressional Budget Office.

House Appropriations Committee Chairman Hal Rogers (R-KY) has called the sequester situation “idiotic” but said his hands are “tied” by the Budget Control Act and the House-passed budget, which insists on the lower spending level. Unable to pass a budget, the government has operated under a continuing resolution that sets discretionary spending at $1.043 trillion. In order to comply with the Budget Control Act that established the sequester, $76 billion would have to be cut from that sum for fiscal 2014.

That cut would need to be made 15 days after Congress adjourns at the end of the year. Implementing it without furloughs or layoffs could be very tough for agency heads, who have already struggled to find one-time savings this year, aides said. The Senate-passed Democratic budget would offset the sequester with additional tax hikes on wealthier Americans, but that particular solution has guaranteed that House Republicans will not come to the negotiating table. This makes sequestration—the policy lawmakers saw as a cudgel to force compromise because of its crude approach to deficit reduction—all but inevitable for the next year.



While Senate Budget Committee Chairwoman Patty Murray (D-WA) has warned of painful reductions to expenditures for national security programs and parts of the social safety net such as Head Start, the predictions have yet to reverberate on a national level with the economy.

Other than congressional intervention to prevent the furloughs of air traffic controllers and federal meat inspectors, lawmakers essentially stepped back and allowed the sequester to take full hold. As a result, many on Capitol Hill concluded they could easily live with the sequester this year—and presumably the next one as well.

In a sign that the Obama administration is bracing for a second year of mandated cuts, Defense Secretary Chuck Hagel informed Congress last week that further reductions would cause a sharp decline in the effectiveness of jet fighter wings, ground combat units and even Special Operations forces.

Hagel warned that the Pentagon must identify an additional $52 billion in budget reductions for the fiscal year that begins Oct. 1. What matters is that Congress may have no choice but to give Hagel more flexibility in assessing which cuts to make.

“I strongly oppose cuts of that magnitude,” Mr. Hagel wrote. “The size, readiness and technological superiority of our military will be reduced, placing at much greater risk the country’s ability to meet our current national security commitments.”


National commission to visit 13 bases

Posted 7/15/2013

by Col. Bob Thompson

Air Force Reserve Public Affairs


7/15/2013 – WASHINGTON (AFNS) — The “National Commission on the Structure of the Air Force” is planning to visit 13 Air Force locations, beginning with Joint Base McGuire-Dix-Lakehurst, N.J., on July 16.

Established by the FY13 National Defense Authorization Act, the commission is reviewing the Air Force’s structure to determine if and how it should be changed to meet future missions and funding challenges. The commission’s report and recommendations are due to the president by Feb. 1, 2014.

“This is one of the biggest issues for the future of the Air Force – to develop the right force mix of Regular and Reserve Component Airmen,” said Lt. Gen. James F. Jackson, chief of Air Force Reserve and commander of Air Force Reserve Command. “Getting this mix right directly affects our Air Force’s capability, capacity, efficiency and cost-effectiveness.”


The commission’s visits include:

July 16:

• Joint Base McGuire-Dix-Lakehurst, N.J.

July 29:

• Wright Patterson Air Force Base, Ohio

• Springfield Air National Guard Base, Ohio

July 30:

• Mansfield ANGB, Ohio

• Rickenbacker ANGB, Ohio

Aug. 5:

• Joint Base Langley-Eustis, Va.

Aug. 13:

• Barksdale AFB, La.

Aug. 20:

• Tinker AFB, Okla.

Sept. 5:

• Pease ANGB, N.H.

• Westover AFB, Mass.

Sept. 6:

• Burlington ANGB, Vt.

Sept. 22-23:

• Camp Smith/Hickam Field, Hawaii

Sept. 25:

• Beale AFB, Calif.


The commission is led by the Honorable Dennis M. McCarthy, a retired Marine Corps lieutenant general and the previous assistant secretary of Defense for Reserve Affairs at the Pentagon. The vice chair is the Honorable Erin Conaton, a former under secretary of Defense for Personnel and Readiness and previous under secretary of the Air Force.

Other members of the commission include: F. Whitten Peters, former secretary of the Air Force; Les Brownlee, former acting secretary of the Army; retired Air Force Gen. Raymond Johns Jr., previous commander of Air Mobility Command; retired Air Force Lt. Gen. Harry M. “Bud” Wyatt III, previous director of the Air National Guard; Dr. Janine Davidson, former deputy assistant secretary of Defense for Plans; and Dr. Margaret C. Harrell, director of the Army Health Program at the RAND Corporation.

During their visits at each location, the commission will tour facilities, hear mission briefings and meet Airmen at wing, group and squadron levels. Also, the commission will hold off-base hearings with state and local community leaders as well as members of the general public.

Senior leaders in Congress and the military are looking at ways to balance the ratio of Regular and Reserve Component Airmen to create the most effective and efficient combat capability in austere budget times.

In January, the secretary and chief of staff of the Air Force established a “Total Force Task Force” to create a process to determine the appropriate force mix. This task force is led by generals from each component: Maj. Gen. Mark Bartman, Air National Guard; Maj. Gen. Brian Meenan, Air Force Reserve; and Maj. Gen. John Posner, Regular Air Force. The task force’s data and findings are a ready resource for the National Commission on the Structure of the Air Force.

“Today’s Air Force Reserve is a combat-ready force with operational capability, strategic depth and surge capacity, to be used by the nation in a variety of ways, either abroad or at home,” said Jackson during his testimony to the National Commission on Jun. 3. “With a shrinking defense budget, increasingly consumed by manpower-associated costs, there is little doubt the cost-effective Reserve Component will continue to provide a valuable role.”

Regular Air Force, Air National Guard and Air Force Reserve Airmen work together around the world as a team in air, space and cyberspace. Today’s “Total Force” consists of about 327,600 Regular Air Force Airmen, 105,400 Air National Guardsmen, and 70,880 Air Force Reserve Airmen actively serving in the Selected Reserve as designated by the FY13 National Defense Authorization Act. The Air Force Reserve also maintains a strategic depth of more than 790,000 stand-by or non-participating Reservists and retirees that can be called up for national emergencies.

After more than two decades of continuous combat operations, the Reserve Components are relied upon now for daily operations as well as strategic surges. By leveraging the resources and talents of all the Air Components, planners are developing better solutions to capitalize on the strengths of each component.

“I believe that working together we can combine the personnel, equipment, and readiness necessary to build a Total Air Force equal to all the challenges our nation faces,” said Lt. Gen. Stanley E. Clarke III, director of the Air National Guard, during his testimony to the House Armed Services Committee Subcommittee on Defense, on March 20.


Hagel: Budget cuts mean 20 percent trim of top staff in Army, Navy, Air Force, Marines

Washington Post

By Associated Press, Published: July 16

JACKSONVILLE, Fla. — Defense Secretary Chuck Hagel said Tuesday he has ordered 20 percent “across the top” budget cuts for his Pentagon staff and that of his top brass.


The reductions, which he did not spell out in detail, are for the 2015-19 period. They will apply to his office, that of the chairman of the Joint Chiefs of Staff, and the Pentagon headquarters offices of the Army, Navy, Air Force and Marine Corps.

It is one element of a broader effort by the Pentagon to adjust to forced spending reductions that already have resulted in the furloughing of civilian workers. Hagel said he believed Pentagon headquarters staff must share in the sacrifices.

“That isn’t going to fix the problem,” he told about 100 Defense Department civilian employees in a question-and-answer session at Jacksonville Naval Air Station on the second day of a tour of military bases. “But, yes, everybody’s got to do their part.”

Hagel spokesman George Little later said the top brass cuts could save between $1.5 billion and $2 billion over the five years and will target personnel, including civilians and contractors. He said the cuts will happen even if Congress eases budget caps that have created sharp limits on defense spending.

Military spending was slashed by $37 billion this year, forcing job furloughs that began last week for an estimated 650,000 Defense Department civilian employees. The furloughs do not apply to military members, but they, too, are feeling the effects of a budget squeeze that is reducing some training.

The Pentagon faces the prospect of an additional $52 billion budget cut in 2014 unless Congress and the White House come up with a deficit-cutting plan. Hagel told Congress last week that such a large additional cut would have “severe and unacceptable” effects.


Air Force ‘carving out’ more AF members for cyber


Jul. 16, 2013 – 05:19PM |

By Oriana Pawlyk

Demand for airmen with cyber skills will grow to 1,500 in the coming years, up from earlier projections of 1,000, the head of Air Force Space Command said Tuesday.

“I’m being a little fuzzy on the numbers because the numbers are really being finalized, but … when I say 1,500, we’re not sure where exactly it’s going to settle, but it’s gone up from where we initially thought it was going to be,” said Gen. William Shelton, at a Capitol Hill breakfast.

Space Command announced earlier this year that it would stand up additional cyber mission teams between 2014 and 2016 in support of U.S. Cyber Command.

“I think the future is very bright in space and cyber … there are challenges in budgets, there are challenges in threats, but in every strategic review that I’ve seen, space and cyber are mentioned prominently as things we have to fund and things we have to protect,” Shelton said.

He explained that regardless of budget setbacks, “[carving] those people out somewhere from within the Air Force structure, and [funding] those civilian billets” remains a high priority within the department.

Another priority Shelton cited is the Air Force Network, or AFNet, a consolidation effort whose original December 2012 deadline has been relaxed. The project collapses hundreds of stand-alone Air Force, Air Force Reserve and Air National Guard base IT networks into a single central enterprise network.

“That’s our number one cyber priority, is getting collapsed down to that single network,” Shelton said.

“The reason we’re doing that is, there are now just 16 touch points to the ‘external world’ — the Internet — within the Air Force. That’s [going to be] much easier to defend, much more consolidated from a command and control perspective, it also allows us more flexibility … much more ‘defendability’ of our networks, and it’s gonna be great once we get it completed this next year.”

Shelton said the Air Force is still “going through the dialogue” with senior leadership in the Air Force on current and future cyber projects.

“There’s no question that we’re going to support what USCYBERCOM is looking for in terms of these national missions,” he said.

“In terms of what the Air Force does and how we manage career fields, how we organize ourselves, we’re not quite there yet.”

STEM Labor Shortage: Myth or Reality?

Professors, trade organizations and technology companies weigh in on this question as the U.S. continues to push for more STEM graduates.

BY TANYA ROSCORLA / JULY 16, 2013

Indiana University Bloomington’s Bobby B. Schnabel doesn’t like the spring as much as the rest of the year. Though that may initially seem a bit strange, he has good reason.

As the informatics and computing dean, Schnabel flies to Silicon Valley often to build relationships with tech leaders. And toward the end of the school year, they start asking him his least favorite question: Do you have any students we can hire?

“Part of my job as a dean of a big school is to know a bunch of the IT CEOs,” Schnabel said, “and sometimes they’ll contact me directly in May saying, ‘We’re looking for people.’ And we don’t have people in May — they’re all hired.”

By the time the university holds its main career fair in early September, the best students are already taken. Of this year’s graduating class, 245 undergraduate and master’s students answered a university survey, which found that just over half of them had lined up a job before graduation, while less than a quarter decided to continue their education at a higher level.

But while Schnabel identifies a labor shortage in technology rather than STEM as a whole, not everyone believes there is an undersupply of college graduates and other workers in this field. In fact, two schools of thought compete on this issue, and both cite data to back up their claims. That raises the question: Is the STEM labor shortage a reality, or simply a myth?



At the University of California at Davis, Norman S. Matloff, a computer science professor, says that the STEM shortage is really a web of deceit designed to trick the whole country.

“There’s a giant deception, a huge public relations concerted effort being engaged in by the various parties with a vested interest to implant in the American consciousness this idea that we have a STEM labor shortage,” Matloff said.

Many positions in these fields don’t even require a college degree, much less a doctorate, Matloff argues. In fact, he says, fewer than five percent of jobs in the technology field call for high levels of knowledge.

And the technology field has seen success stories at a high level from people who never earned a college degree.

Take Bill Gates of Microsoft, Steve Jobs of Apple and Larry Ellison of Oracle. All of them started college, but dropped out. Now Gates and Ellison are two of the top five richest people in the world.

That said, bachelor’s degree production in computer science saw double-digit growth between 2009 and 2012. And last year, colleges awarded nearly 2,000 doctorate degrees, the highest number ever reported in the Computing Research Association’s Taulbee Survey.

Yet the demand isn’t there, some argue.

“There is still benefit to inspiring kids to be passionate and excited about STEM disciplines,” said Bob Sun, CEO of Suntex International Inc., which helps students build a solid foundation in math. “But the idea that we need more STEM graduates to meet this unfulfilled demand — I don’t think it’s quite exactly what it’s cracked up to be.”



A professional society called the Programmers Guild argues that the U.S. is producing plenty of computer science graduates and has enough older workers to fill positions — they’re just not getting hired. It says companies won’t hire workers over the age of 35, and that they hire foreign laborers through temporary work visas to bring wages down.

In specialty occupations, these H-1B visas allow employers to hire highly skilled foreign workers when they can’t find U.S. residents or citizens. The employer files a visa application for these workers, which allows them to work for that employer for three years, or up to six years with an extension.

When these visa holders enter the country, they could stay permanently depending on whether their employer files a petition on their behalf.

“If you’re a foreign worker being sponsored for a green card, you are trapped, you cannot go to another employer, because you’d have to start all of that all over again, and it’s just unthinkable,” Matloff said. “So the employers like that. This immobility is huge.”

U.S. Citizenship and Immigration Services issues 65,000 H-1B visas each year to fill positions in specialty occupations such as computer programming. Plus, it provides 20,000 exemptions for students who earned advanced degrees in the U.S.

“Over the past 10 years, we’ve been bringing more H-1Bs than the job market has been increasing by,” said Kim Berry, president of the Programmers Guild. “And so I think it’s no question that Americans are being displaced by this.”

The companies that hire these workers beg to differ. They’re calling on Congress to increase the cap so they can bring in more foreign workers. But in addition to increasing the cap, companies such as Microsoft also want to invest in education to build the U.S. pipeline of computer scientists, said Jack Chen, senior attorney of compliance and advocacy for Microsoft Corp.

At the end of February, the company had more than 6,400 open jobs in the U.S., half of which were for research and development positions.

“There really is a hunger and a thirst for talent,” Chen said, “that has not been satiated by the workers we have available in the U.S. market.”

Both schools of thought hold such opposing views that they may not agree anytime soon. And that leaves us to decide for ourselves where we stand on the STEM labor issue.


Kremlin’s Anti-Hacking Tech: Typewriters

150-Year-Old Technology Isn’t a Good Idea to Protect Secrets

By Eric Chabrow, July 16, 2013

The apparatchiks at the Kremlin think they’re clever sorts with plans to replace computers with typewriters to prevent the American e-spies at the National Security Agency from hacking into Russian intelligence systems.

The successor to the KGB – Russia’s Federal Protective Service – has placed an order for 20 typewriters at $750 a pop (that’s 24,480 rubles each, or 489,603 rubles for the bunch; it sounds much more expensive in Russian currency), according to a number of published reports that cite the Izvestia news service.

Of course, the safest way to secure a computer is to keep it off the Internet. (But Stuxnet proved that even high-tech devices not connected to the Internet can be hacked, using infected removable drives to spread the worm to cripple Iranian nuclear centrifuges.) Typewriters go one better.

But the idea of replacing computers with typewriters to mitigate the risk of the United States – or anyone else – stealing secrets is misplaced.

From a practical side, as Sunday Telegraph Chief Foreign Correspondent Colin Freeman points out, correcting mistakes on important documents without the “delete” button would prove difficult:

“FOr the chnaces are that ecven if u are reasonabley goood typrer like me, most of what you iwll write will end up litter d with misteks. Sometimes ot the point of bing complely eillegiebe (illegible).”

Even when tongue is removed from cheek, other reasons exist that show replacing a computer with a typewriter is a terrible idea.

Governments have been stealing secrets from one another for years – heck, for centuries – before the invention of the computer and the Internet. Removing documents from a computer and putting them in a vault or locked filing cabinet doesn’t guarantee protection, especially given the prevalence of the insider threat. And, safeguarding documents in transit can be as daunting in the real world as it is in the virtual one.


7,000 Typewritten Pages ‘Hacked’

Though not nation-to-nation intelligence theft, one of the biggest disclosures of classified information in American history was the leaking of the Pentagon Papers, a top-secret history that showed how the Lyndon Johnson administration systematically lied about U.S. involvement in Vietnam from 1945 to 1967.


Unlike former NSA contractor Edward Snowden and Army Pvt. Bradley Manning of WikiLeaks fame, who copied hundreds of thousands of documents onto electronic drives, Pentagon Papers’ whistleblower Daniel Ellsberg and his friend Anthony Russo photocopied the 3,000 pages of historical analysis and 4,000 pages of original government documents – all typewritten at one point – that they leaked to The New York Times and other newspapers.

The risk-mitigation payoff of abandoning the computer for the typewriter might not be worth the nearly half-million-ruble investment. Computers, networks and the Internet have improved productivity and collaboration, and it’s hard to conceive in today’s environment that the work needed to create such crucial documents could be done effectively with 150-year-old technology. Simply put, contemporary technology is required to protect contemporary content.

Adn thats teh fax; JAck!


Data demand soars as consumers drop landline phones

Dayton Daily News

July 18, 2013

By Dave Larsen

Staff Writer

Demand for mobile and broadband data in Ohio is skyrocketing as more consumers switch from traditional landline phones to wireless communications.

Ohio’s local phone companies have lost 64 percent of their lines since peaking in 2000, and continue to lose lines at a rate of 6 to 10 percent annually, according to a new Ohio Telecom Association report. Those companies range from large service providers such as AT&T Ohio and Cincinnati Bell, with 50,000 or more phone lines, to small locally owned and operated companies with 300 to 25,000 lines.

Experts said the migration to smartphones is forcing phone line companies to diversify into new technologies such as wireless, video and broadband. It also is requiring companies that include large wireless providers to invest billions of dollars to update their networks to meet rising demand for data.

The study found that Ohio’s 11.5 million residents are using an estimated 11.8 million wireless phones, with an increasing number using different phones for home and work.

“As much as the landline market is decreasing, the broadband market is exponentially increasing,” said Stu Johnson, executive director of Connect Ohio, a nonprofit working to expand access and use of broadband Internet statewide. The challenge for telecommunications companies is to manage the transition financially while coordinating the delivery of reliable service to consumers, he said.

The average household consumes 52 gigabytes of data per month — the equivalent of more than 5 million emails or surfing the Internet for about 100 hours monthly — up from 28 GB in 2012. Part of that increase is driven by smartphones, which now account for half of all cell phones in Ohio.
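As a sanity check, the growth and equivalence figures quoted above work out as follows (an illustrative Python snippet; the numbers come straight from the article):

```python
# Figures from the article: average household data use per month.
prev_gb, now_gb = 28, 52          # 2012 vs. today, in gigabytes
growth = now_gb / prev_gb         # ~1.86x, i.e. nearly double

# The "more than 5 million emails" equivalence implies a per-email size of
# roughly 52 million KB / 5 million emails, about 10 KB, a plausible average.
kb_per_email = (now_gb * 1_000_000) / 5_000_000

print(f"growth: {growth:.2f}x, ~{kb_per_email:.1f} KB per email")
```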

“When you almost double the amount of data over your network in a year, that is a significant issue to engineer,” said Charles R. Moses, Ohio Telecom Association president. The trade group represents the state’s $20 billion telecommunications industry, which includes 42 local phone companies, three wireless carriers and more than 100 associated industry product and service providers.

Officials said the association’s member companies employ more than 20,000 Ohio residents and invest an average of $1 billion annually in the state’s telecommunications infrastructure.

Copper-wire landlines don’t offer multiple uses like broadband — an optic fiber, coaxial cable or wireless medium that has wide bandwidth characteristics and can deliver voice, video and Internet services. However, copper landlines are reliable and can serve customers in rural areas that are costly or difficult to reach with broadband technologies, Johnson said.

Ohio’s wireless-only residents are 81 percent urban or suburban; 84 percent are under age 44; 58 percent have a college education; and 63 percent earn more than $25,000 annually, according to Connect Ohio.

“The least likely to convert to a wireless-only solution would be an older, rural, less educated, lower income individual,” Johnson said. “Those are probably also the most expensive copper customers.”

Federal regulations require local phone companies to supply and maintain a phone line to every residence, business or organization that requests one.

Copper landlines also deliver services that include home health care monitoring, ATM networks and elevator emergency telephones that won’t easily transition to wireless, Johnson said.

The Department of Defense, in a July 8 letter to the Federal Communications Commission, warned that a rush to transition from copper landlines to wireless and Internet protocol (IP) broadband communications could disrupt the functions of important military and federal agencies, including Federal Aviation Administration air traffic communications.

These agencies continue to rely on wireline networks and services that are “critical to public safety and security — and will do so for the foreseeable future,” wrote Terrance A. Spann, general attorney for the U.S. Army Legal Services Agency.

Wireless is now a $6.8 billion industry in Ohio, according to the Ohio Telecom Association. An estimated 41.3 percent of homes in the state have eliminated local phone service and rely exclusively on wireless communications. Data plans account for 50 percent of wireless revenues.

Last week, AT&T expanded its 4G LTE (fourth generation long-term evolution) wireless network to the Dayton region to provide area customers with greater mobile Internet speeds. Verizon Wireless launched a 4G LTE network in Dayton in June 2011.

Verizon Wireless has seen “geometric growth” in wireless data demand, said Don Caretta, executive director for Verizon’s network in Ohio, West Virginia and Pennsylvania. The company has invested more than $2.5 billion in its Ohio wireless network from 2001 through 2012 to increase capacity to meet demand, he said.

New wireless applications from online movie services to so-called “smart” electric and gas meters will continue to grow data demand, Caretta said.

Similarly, AT&T has invested $1.5 billion from 2010 to 2012 into its Ohio wireline and wireless networks, said Mark Romito, the company’s director of external affairs for southwest Ohio.

In November, AT&T announced a $14 billion project to upgrade much of its wireline network to high-speed IP broadband and convert some of its harder-to-reach copper customers to 4G LTE wireless services by the end of 2015. “We are trying to address customers’ demand for data and the transition to an IP environment,” Romito said.

Ohio wireless by the numbers

90 – Percentage of Ohio residents with a wireless phone

80 – Average monthly wireless bill, in dollars

50 – Percentage of Ohio residents who own a smartphone that connects wirelessly to the Internet

41.3 – Percentage of Ohio homes that are wireless-only

40 – Additional monthly cost per phone for Internet access, in dollars

Source: Ohio Telecom Association


Hacker Magnet or Sophisticated Tool? Obamacare’s Database Debated


Government Executive

By Charles S. Clark

July 17, 2013

The centralized data hub that will link agency records on people who sign up for Obamacare is either solidly on schedule and impenetrable to hackers, or it is floundering and in danger of swelling into “the biggest data system of personal information in the history of the U.S.”

Both views were displayed Wednesday at a joint subcommittee hearing of the House Oversight and Government Reform Committee that roped in issues from cybersecurity to government mismanagement to the fate of the Affordable Care Act to the scandal over political targeting of taxpayer groups by parts of the Internal Revenue Service.

Taken together, representatives from the Health and Human Services Department, the IRS and auditing agencies appeared confident that the Obama administration can meet the health care law’s Oct. 1 deadline for opening state insurance exchanges with sufficient safeguards on personal information to protect Americans’ privacy.

The Federal Data Services Hub, a $394 million contractor operation run by the Centers for Medicare and Medicaid Services in Columbia, Md., is a tool that will allow citizens applying for the new health insurance plans to enter their income and personal identification online and get a determination of eligibility for tax credits, in many cases within seconds. The hub is designed to link databases at HHS and IRS with the Homeland Security and Veterans Affairs departments, the Social Security Administration, the Office of Personnel Management and the Peace Corps.

Hearing chairman Rep. James Lankford, R-Okla., warned that the “potential for fraud and violations of privacy are multiplied by this Rube Goldberg construction.” He cited a June Government Accountability Office report showing that only 20 percent of the privacy controls and other preparations are complete. “The government will waste billions of dollars in subsidizing taxpayers who are not eligible,” Lankford said, and proceeded to blast the IRS as being “highly politicized under this administration,” citing 100 visits to the White House by former Commissioner Doug Shulman.

But the “GAO ultimately concluded that the implementation was workable and on track,” countered Rep. Jackie Speier, D-Calif. “No major program has launched without a few hiccups,” she said, adding that CMS has contingency safeguards and “long experience with complicated health systems.” Speier praised the “dedicated federal and state government employees who are implementing the law of the land” and said she wanted to “debunk the notion that to expand health care we have to sacrifice privacy.”

Speier did, however, express worry that the “federal data hub has a bull’s eye on it, and the potential for being hacked is great.”

CMS Administrator Marilyn Tavenner said her agency is “hard at work designing and testing a secure system, and I can assure you that by Oct. 1, the health insurance markets will be open for business. I can also assure all Americans that the information they supply will be protected to the highest standards.”

Tavenner sought to dispel “confusion,” declaring that no one implementing Obamacare will ask exchange applicants for their personal health information, and that no such information will be stored on the hub. “The hub is not a database that stores, it is a routing tool to put government databases on the same system” while eliminating the need for multiple agencies to design their own, she said. CMS has designed similar systems with privacy protections in implementing the Medicare Advantage program and state Medicaid programs.

Acting IRS Commissioner Danny Werfel said the tax agency’s design and testing of systems to share information on income eligibility for premium assistance tax credits “is on target to be ready by Oct. 1.” He cited interagency agreements on computer matching and training, and the IRS’ decades of experience enforcing privacy guarantees under Section 6103 of the tax code. “We have a robust set of requirements that have been battle-tested over the years,” he added, promising strong monitoring and oversight.

Henry Chao, CMS’ deputy chief information officer, cited progress since the GAO report, which was based on April data. He assured a skeptical Lankford that his team has been testing with states in waves since February and that the amount of time an applicant’s personal information would be stored or “cached” in the hub would be measured in minutes. The information, he said, includes names of people in a household, addresses, email addresses, phone numbers, Social Security numbers, race and ethnicity, veteran status and, where applicable, pregnancy status, but no information on disabilities. Protections against hacking are being tested by professional experts, he added.
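The “routing tool, not database” design Chao describes, where personal data lives in the hub only for minutes, resembles a short-TTL cache. A minimal sketch of that pattern follows (purely illustrative Python; the class and field names are invented and this is not the actual CMS hub design):

```python
import time

class ShortLivedCache:
    """Hold values only for a short TTL, then discard them on access."""

    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self._store = {}

    def put(self, key, value):
        self._store[key] = (value, time.monotonic() + self.ttl)

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        value, expires = entry
        if time.monotonic() > expires:
            del self._store[key]      # expired: personal data is dropped
            return None
        return value

cache = ShortLivedCache(ttl_seconds=300)          # "measured in minutes"
cache.put("applicant-123", {"eligible": True})    # hypothetical record
assert cache.get("applicant-123") == {"eligible": True}
```

The point of the pattern is that the hub answers eligibility queries without becoming a permanent store of the records it routes.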

John Dicken, director of health care at GAO, cautioned that the Oct. 1 deadline wasn’t assured. “Much progress had been made, but much remains to be done” in testing the hub, he said.

Left unsatisfied was Rep. Pat Meehan, R-Pa., who warned that what might be the “biggest data system of personal information in the history of the U.S.” does not appear to have been vetted by the top specialists at the FBI and DHS or at private banks. “Are you ready? Who has access? Are they competent?” he asked the CMS witnesses, raising the specter of cyber theft of intellectual property from U.S. “innovators.” “The personal information of 20 million Americans is just as important” as trade secrets, he said, and “every sector says they are only as strong as their weakest link.” This hub “is an overwhelming task that at best carries an unacceptable price tag.”

Rep. Darrell Issa, R-Calif., the committee chairman, criticized CMS for hiring Serco, a British-owned company, to help set up the exchanges, noting that the company recently was faulted for allowing exposure of thousands of internal Thrift Savings Plan records. “Where are the pilots for a company with no internal controls?” he asked.

Tavenner said Serco is a “highly skilled company with a proven track record, and won the contract in full and open competition.”

In a related drama, Rep. Jim Jordan, R-Ohio, castigated Werfel, who has been on the job for a month and a half, for substituting himself as a witness when the committee had invited Sarah Hall Ingram, who plays a coordinating role in implementing the health care law but who also helped preside over the tax-exempt unit in Cincinnati responsible for the controversial singling out of mainly tea-party groups.

“Who is the project manager for the ACA?” asked Jordan. Ingram is “in D.C. just a few blocks from here,” he noted, brandishing a document showing that Ingram recently briefed an IRS oversight board on the technical implementation of the federal data hub.

Werfel disagreed with the characterizations. “At IRS we balance out a lot of factors, two of which are accountability and technical expertise,” he said. “We received an invitation to the hearing, so I suggested a combination of me and the chief technology officer. There are multiple people in the IRS with the expertise.”

Rep. Michelle Lujan Grisham, D-N.M., asked Werfel if he was concerned about the proposed 24 percent budget cut for IRS being considered Wednesday by the Appropriations Committee. “It’s extremely challenging generally, and all mission-critical activities will be severely impacted, from collecting revenue, to helping taxpayers navigate the code, to enforcement, to going after bad actors, to legislative mandates such as offshore tax evasion,” he said. Such cuts would be “extremely relevant across IRS and extremely relevant across the Affordable Care Act. The tax code doesn’t go away” with a budget cut, he said, noting cutbacks in customer service that are already underway. “In the end, it affects the taxpayer.”

DOD systems block feds from reading about NSA leaks

By Frank Konkel

Jul 18, 2013

Leaked National Security Agency documents on classified programs have garnered a lot of attention lately, but feds and contractors are not reading about them on any of the Department of Defense’s unclassified networks.

As it did in 2010 during the Wikileaks saga, DOD is using automated software to block web pages that could contain classified information, like the PowerPoint slides first published by The Guardian and Washington Post that depict the NSA’s PRISM program.

That DOD can filter and block content for its employees, service members and contractors has been public knowledge since at least 2010 when Wikileaks began revealing classified cables to the world. But media reports in late June initially claimed that some agencies within DOD were blocking access to specific websites of certain news outlets. Those were followed by more reports – including a story from The Guardian, which first broke the NSA story – that claimed only content from the Guardian website was being blocked.

DOD spokesman Damien Pickart, however, told FCW that such reports are inaccurate, and emphasized that all of DOD’s unclassified networks – millions of computers – are under the same preventative measures to mitigate unauthorized disclosures of classified information.

“The Department of Defense is not blocking any websites,” Pickart said. “We make every effort to balance the need to preserve information access with operational security; however there are strict policies and directives in place regarding protecting and handling classified information. Until declassified by appropriate officials, classified information – including material released through an unauthorized disclosure — must be treated accordingly by DOD personnel.”

DOD networks make use of software that scans sites for potential malware threats or content concerns through keyword and other filters. Any website flagged by the system that could contain classified information is blocked to users “so long as it remains classified,” Pickart said.
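A keyword-based page filter of the kind described here can be sketched in a few lines (illustrative Python only; the keyword list is hypothetical and this is not DOD’s actual software):

```python
# Hypothetical keyword filter: flag any page whose text matches a blocklist.
BLOCK_KEYWORDS = {"top secret", "noforn", "prism"}  # invented example terms

def should_block(page_text: str, keywords=BLOCK_KEYWORDS) -> bool:
    """Return True if the page text contains any blocked keyword."""
    text = page_text.lower()
    return any(kw in text for kw in keywords)

print(should_block("Leaked TOP SECRET briefing slides"))   # True
print(should_block("Unclassified press release"))          # False
```

Real filters of this sort also inspect URLs and metadata, which is why, as the article notes, they are not entirely failsafe.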

The measure is a department-wide directive. While the automated filters are not entirely failsafe, they mostly do what they are designed to do: Keep feds from viewing classified information before it is officially declassified, even if widely reported in the media. It is not clear, though, whether employees found intentionally accessing classified information on DOD’s unclassified networks would face consequences, or what those consequences would be.

Pickart said it is costly when classified information is accessed on unclassified DOD networks because policy dictates unclassified computers must be scrubbed of unauthorized or classified material. A single viewing of the PRISM images, for instance, could require an IT team response — and might even merit a computer’s entire hard drive being wiped.


New domain names bound for collisions: ‘Things are going to break’

Posted by William Jackson on Jul 12, 2013 at 12:38 PM

The Internet is on the brink of the largest expansion of generic Top Level Domains in its history, with as many as 1,000 new strings expected to be added over the next year, more than quadrupling the current gTLD space.

Some observers, including the operator of two of the Internet’s root zone servers, worry that this expansion of public domains could result in naming collisions with private internal network domains, disrupting those networks.

“We know things are going to break,” said Danny McPherson, chief security officer of Verisign, the company that runs the A and J root servers. Networks in the .gov domain could be affected, as well as those supporting emergency services such as public safety answering points for the nation’s 911 system. “It makes us uneasy,” McPherson said.

At risk is any enterprise with a network naming scheme using domain names for non-public resources that are the same as new domain name strings now being considered for approval on the Internet. There are 1,833 such names now being considered by the Internet Corporation for Assigned Names and Numbers, and the approved new gTLDs could begin being delegated in the root system later this year.

The resulting collisions could cause some networks to become about as useless as the Washington Beltway on Friday afternoon.

The solution is to change those internal domain names to avoid naming collisions. But this can be a complex job for a large enterprise, and McPherson worries that many administrators are not aware of the issue. He believes the 12 root zone operators have a responsibility to monitor the global systems to identify potential collision situations and warn network operators in advance. But there is no zone-wide system to provide that visibility.

Top Level Domains are the suffixes on URLs that appear to the right of the final dot in the address, such as .gov and .com. There now are 317 of these, including country names such as .us and .uk. Name servers in the Domain Name System use authoritative lists maintained in the 13 root servers to associate URLs with an IP address to direct queries. The potential problem with the domain expansion is that requests for a network’s internal domains are routinely checked against the global DNS database as well as the local enterprise name database. If the domain name is not in the global database, it looks for it in the local database, and the query is directed to the proper server within the network.

But if that internal name is added to the Internet’s collection of domains, the internal request will be sent out to the Internet and the user will not be able to access resources on his own network.
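The lookup order described above can be sketched in a few lines of Python. This is an illustrative model, not real resolver code; the domain lists and addresses are invented for the example.

```python
# Sketch of the resolution behavior described above: queries are checked
# against the global DNS first, then fall back to the enterprise's local
# name table. Domain lists and addresses here are illustrative only.

GLOBAL_TLDS = {"com", "gov", "us", "uk"}          # delegated in the root today
LOCAL_NAMES = {"mail.corp": "10.0.0.5", "wiki.home": "10.0.0.9"}

def resolve(name, global_tlds):
    tld = name.rsplit(".", 1)[-1]
    if tld in global_tlds:
        return ("internet", name)                  # query leaves the network
    if name in LOCAL_NAMES:
        return ("internal", LOCAL_NAMES[name])     # served by the local zone
    return ("nxdomain", None)

# Before the expansion, .corp is not a global TLD, so the query stays internal:
assert resolve("mail.corp", GLOBAL_TLDS) == ("internal", "10.0.0.5")

# Once .corp is delegated in the root, the same query escapes to the Internet
# and the internal resource becomes unreachable -- the collision in question:
assert resolve("mail.corp", GLOBAL_TLDS | {"corp"}) == ("internet", "mail.corp")
```

The fix the article describes amounts to renaming entries so they never shadow a delegated top-level domain.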

How likely is this to happen? Take .home for instance. This is a default internal domain name used on millions of pieces of home networking equipment. McPherson said .home is one of the top five queries received by Verisign’s root servers. It also is one of the most coveted new gTLDs being considered, with 11 applicants. Other commonly used internal domain names being considered for the Internet include .inc, .corp, .cloud and .mail.

McPherson also is concerned that less commonly used names such as .med that might be used by hospitals and clinics for connecting with health care equipment could suddenly become unavailable internally if .med goes onto the Internet.

Ideally, if you are managing a network, you would be warned by the root zone operators when they notice local domain queries from your network that would be likely to result in collisions. With no system in place to monitor for this, however, the responsibility falls on network administrators to know their naming schemes, pay attention to ICANN’s new gTLD program, and make sure they are not using new Internet domains internally.

Ohio pursuing drone test center with Indiana, as FAA tackles unmanned flight

The Plain Dealer

By Tom Breckenridge

on July 13, 2013 at 8:00 PM, updated July 13, 2013 at 8:01 PM

DAYTON, Ohio — Ohio and Indiana have teamed up to pursue a federal site for research into unmanned aircraft, popularly known as drones.

The Ohio Department of Transportation, which has its own, squirrel-sized drone, is leading Ohio’s share of the effort.

The two-state team is competing with applicants from several dozen states. The Federal Aviation Administration will oversee six test sites, where experts will help the FAA figure out how unmanned aircraft — ranging in size from jetliners to model airplanes — will one day mix with the nation’s air traffic.

Ohio and Indiana have pitched a research site in Springfield, near Dayton and the Wright-Patterson Air Force Base. Indiana has a bounty of restricted air space to test unmanned vehicles, officials said.

Ohio officials recently approved $70,000 for the proposed test site. The money, from an Ohio Department of Transportation aviation fund, allows the state to lease office space the next two years at the nonprofit Advanced Virtual Engine Test Cell, Inc., known as AVETEC.

The FAA expects to name the six sites later this year. The agency wants a mix of geographies, populations, weather and air traffic.

Winning states could gain a foothold in unmanned aircraft, a fast-growing, multibillion dollar industry that will generate thousands of high-tech jobs.

“We think it would have a great economic benefit,” said Rob Nichols, spokesman for Gov. John Kasich. “It builds on existing aerospace strengths in the Dayton region. We think leveraging this (site) could be a real boon to the area and the state.”

A recent economic-impact report from advocates of unmanned aircraft estimates the industry will generate 100,000 jobs nationwide by 2025.

Ohio will see 1,400 jobs in the industry and another 1,300 in related employment by 2025, the report said. Those jobs will have a $265 million impact on the state’s economy, according to the Association for Unmanned Vehicle Systems International, based in Arlington, Va.

Ohio was not among the top 10 states that could see the most benefit from the fledgling industry.

Right now, unmanned aircraft are mostly known for their wartime uses. The U.S. military frequently directs bomb-laden drones to al-Qaida targets in Afghanistan.

Drones are flying in U.S. airspace but under tight control.

Uses include disaster relief, fire fighting, search-and-rescue, law enforcement and border patrols, the FAA says.

ODOT uses a 2-foot-wide drone that carries a specialized camera for surveying land, said spokesman Steve Faulkner. ODOT has used the $15,000 device about 25 times for smaller projects, like bridge repairs and road resurfacing.

It replaces airplanes, Faulkner said.

“We can use this for much less cost, as opposed to fueling up an airplane,” Faulkner said.

ODOT selected the Springfield site from 15 around the Dayton area.

The AVETEC center affords about 2,000 square feet and room for 15 workers. The site meets federal and state specifications, including broadband connections with others in the aerospace industry; computers for modeling and simulations; and the ability to meet U.S. Defense Department requirements for top security.

Nichols, the Kasich spokesman, said AVETEC is already in use as part of a national challenge grant for unmanned vehicles, offered by the National Aeronautics and Space Administration.

A Dayton-area nonprofit, Development Projects, Inc., is overseeing competition for the NASA grant, which features a $500,000 prize for the development of technology that enables drones to sense and avoid other aircraft.

NASA’s Glenn Research Center in Brook Park is part of the aerospace portfolio that officials hope will convince the FAA to set up a test center in Ohio, said Mike Heil, president and chief executive officer of the Ohio Aerospace Institute, a trade group based near NASA Glenn.

NASA Glenn’s expertise in space communications is a boon to unmanned aircraft, Heil said. And Northeast Ohio has “lots of companies” that will benefit from the growing industry, Heil said.

With the development of drones comes concern about privacy and government surveillance. Along with crafting new regulations for unmanned flight, the FAA is developing a privacy policy that “emphasizes transparency, public engagement and compliance with existing law,” according to an FAA news release.

Nichols said the state is working with the law school at the University of Dayton to craft “very strict and cutting-edge privacy rules and requirements” for drone use in the state.


What They Told Us: Reviewing Last Week’s Key Polls

Rasmussen Reports

Saturday, July 20, 2013

Reactions to the George Zimmerman verdict highlight how wide the racial divide remains in America.

At week’s end, 44% of Americans agreed with the Florida jury’s finding Zimmerman not guilty in the shooting death of black teenager Trayvon Martin, down from 48% earlier in the week. Thirty-five percent (35%) disagreed, but that included 80% of black Americans.

The verdict was reached by an all-white jury of six women. Sixty-five percent (65%) of blacks do not think it is possible for an all-white jury to fairly decide a case involving the shooting death of a black man. Seventy percent (70%) of whites think it is possible.

Among those who agree with the jury verdict, most believe Zimmerman was innocent and acting in self-defense. However, nearly a third believe neither side was able to prove its case and that the jury verdict merely reflected the fact that Americans are innocent until proven guilty.

Overall, that means 35% think Zimmerman should have been found guilty, 29% believe he was innocent, 21% are not sure, and 13% agree with the jury verdict because neither side proved its case beyond a reasonable doubt.

Just 32% of Americans now have a favorable opinion of Zimmerman, while 48% view him unfavorably.

Still, only 24% believe Zimmerman’s actions that led to Martin’s death were motivated primarily by racism. Just 21% believe the Hispanic neighborhood watch volunteer should now be charged with a hate crime by the U.S. Department of Justice.

But on virtually every question related to this case, there is a wide racial difference of opinion.

Currently, 32% of voters believe American society is generally unfair and discriminatory. That’s near the highest level ever recorded.

Just 30% say the country is heading in the right direction, and voters remain pessimistic about the nation’s future. Thirty-six percent (36%) think America’s best days are still to come, but 49% think those days have come and gone. This ties the highest level of pessimism in nearly a year.

Confidence in the nation’s safety against terrorism is at its lowest level in several years. Only 39% of voters believe the United States today is safer than it was before the terrorist attacks on September 11, 2001. Confidence jumped to 51% after Osama bin Laden’s death in May 2011 and hadn’t fallen below 40% until now.

Fifty-nine percent (59%) of Americans continue to feel that Boston Marathon bombing suspect Dzhokhar Tsarnaev should receive the death penalty if convicted. Fifty-three percent (53%) say it’s at least somewhat likely that he will receive the death penalty.

President Obama’s job approval ratings remain at levels seen throughout most of his first term, down from the bounce they experienced just after his reelection.  Positive assessments of his leadership have fallen to their lowest level in over a year of regular tracking. Forty-three percent (43%) of voters now consider the president a good or excellent leader.  Thirty-eight percent (38%) give him poor marks for leadership.

Just 45% hold a favorable opinion of the president’s national health care law. Voters continue to believe costs will go up and quality will suffer under the new law. Eighty percent (80%) think the law is likely to cost more than official estimates.

By a two-to-one margin, voters agree with the House’s decision this week to delay for one year the law’s requirement that every American buy or obtain health insurance.

Voters strongly disagree with the Obama administration’s decision to make it easier for low-income Americans to qualify for health insurance subsidies under the new health care law. Eighty-six percent (86%) think these individuals should be forced to prove they are eligible by documenting their income and their lack of access to insurance.

Forty-four percent (44%) of voters now rate the president’s handling of health care issues as good or excellent. That’s up from June’s low of 38% but more in line with regular surveying earlier this year. Forty-three percent (43%) rate the president poorly on health care.

Twenty-one percent (21%) believe the president’s new regulations on the coal industry will help the economy, but twice as many (41%) think those regulations will hurt the economy instead. Interestingly, voters now view the U.S. coal industry more favorably than the Environmental Protection Agency and are closely divided when asked if the Obama administration’s ultimate goal is to kill that industry.

Most voters believe that all new EPA regulations should require congressional approval before going into effect. At the same time, 60% think it is more important to preserve our constitutional system of checks and balances than it is for government to operate efficiently.

“Many in politics act as if the end result is creating a government that works,” Scott Rasmussen notes. “However, the real goal should be to create a society that works. A system of careful checks and balances may frustrate political activists from both parties, but it protects the American people from over-zealous politicians and the demagoguery of passing political fads.”

Senate Democrats threatened Republicans this week with changing long-standing Senate rules and effectively eliminating the minority’s filibuster power to delay certain nominations and votes, but the two sides reached a last-minute compromise. Forty-four percent (44%) favor the proposed elimination of the filibuster; 38% oppose it. The filibuster is not a constitutional protection.

For the third week in a row, Republicans hold a one-point lead over Democrats on the Generic Congressional Ballot. 

Consumer and investor confidence remains down from a month ago but is still near its highest levels in several years. 

In other surveys this week:

– Forty-three percent (43%) of Americans now think the spike in food stamp recipients is chiefly because government rules have made it easier to get food stamps. Fifty percent (50%) believe it is too easy to get food stamps in this country, a 10-point increase from December 2010.

– Most voters think it’s possible for the United States to achieve energy independence through shale oil development and government-funded programs to promote alternative energy sources. 

– Forty-four percent (44%) of voters favor a ban on abortion after 20 weeks. Forty-one percent (41%) oppose such a ban. Forty-eight percent (48%) favor a law that would require doctors to have admitting privileges at nearby hospitals before they can perform abortions.

– Fifty-nine percent (59%) of Americans believe government subsidies should be used to keep student loan interest rates low. But 81% think lowering tuition costs would do more to help college students than giving them easier access to student loans. Fifty-three percent (53%), in fact, believe the availability of student loans has actually helped increase the cost of college tuition.

– Three out of four Americans still prefer a traditional book over an electronic book-reading device and continue to read books that way.

– Fifty-six percent (56%) say they rarely or never eat a meal from a fast food restaurant during a typical week. 


July 13 2013




Snowden affair clouds U.S. attempts to press China to curb cyber theft


Mon, Jul 8 2013

By Paul Eckert


WASHINGTON (Reuters) – Revelations by former U.S. spy agency contractor Edward Snowden will make it harder for the United States to confront China at talks this week over the alleged cyber theft of trade secrets worth hundreds of billions of dollars each year.

Snowden’s disclosures of American electronic surveillance around the world give China an argument to counter U.S. complaints that it steals private intellectual property (IP) from U.S. companies and research centers.

Cyber security is at the center of high-level meetings between the two countries in Washington that will show whether a positive tone struck by President Barack Obama and new Chinese President Xi Jinping at a summit last month can translate into cooperation on difficult issues.

Top U.S. officials from Obama down have long tried to convince China to recognize a clear line between the kind of cyber espionage by spy agencies revealed by Snowden and the stealing of technology.

“This Snowden thing has muddied the waters in a terrible way,” said James McGregor, author of a book on China’s authoritarian capitalism and industrial policy.

“China would rather have the waters muddy, because they can say ‘You do it. We do it. What’s the big deal?’ and the cyber theft against companies will go on and on,” he said by telephone from China, where he is senior counselor for APCO Worldwide, a U.S. business consultancy.

Treasury Secretary Jack Lew said last week that U.S. officials will press China at the talks on cyber theft, a problem he described as “just different from other kinds of issues in the cyber area.”

Many countries spy on each other, but U.S. officials say China is unique in the amount of state-sponsored IP theft it carries out as it tries to catch up with the United States in economic power and technological prowess.

Last week the U.S. Department of Justice charged Chinese wind turbine maker Sinovel Wind Group Co and two of its employees with stealing software source coding from U.S.-based AMSC in an alleged theft worth $800 million.

The U.S. Chamber of Commerce hopes “to see a clear indication that China recognizes thefts of trade secrets, whether by cyber or other means, is stealing property and will bring the full force of its laws to curb this,” said Jeremie Waterman, the group’s senior director for Greater China.

Beijing regularly parries complaints about Chinese hacking into the computers of U.S. businesses by saying that China is itself a major victim of cyber espionage. Chinese officials have dismissed as unconvincing recent U.S. official and private-sector reports attributing large-scale hacking of American networks to China.

China’s official Xinhua news agency last month said the Snowden case showed the United States was “the biggest villain in our age” and a hypocrite for complaining about Chinese cyber attacks.


China’s stance appears to be bolstered by Snowden’s revelations of widespread surveillance by the National Security Agency and his assertion that the agency hacked into critical network infrastructure at universities in China and in Hong Kong.

Snowden first fled to Hong Kong before his leaks to newspapers became public last month, and has subsequently gone to Moscow. He is believed to be holed up in the transit area of the city’s Sheremetyevo International Airport and has been trying to find a country that would give him sanctuary.



Now in their fifth year, the annual U.S.-Chinese talks, known as the Strategic and Economic Dialogue, will cover topics from U.S. concerns about North Korea’s nuclear weapons and expanding U.S.-China military ties to climate change and access to Chinese financial markets.

China’s exchange-rate policy is on the agenda, although it has receded as an issue with the gradual strengthening of the yuan and a reduction of huge current account imbalances.

This year Secretary of State John Kerry and Lew host Chinese State Councilor Yang Jiechi and Vice Premier Wang Yang for the first S&ED session since China’s once-a-decade leadership change in March when Xi took over.

The meetings follow Obama’s summit last month with Xi in California, where the two men developed what aides called a productive relationship. Nevertheless, Obama demanded Chinese action to halt what he called “out of bounds” cyber spying.

Civilian and military officials from the two countries discussed international law and practices in cyberspace at low-level talks on Monday. Cyber security is due to come up at other meetings throughout the week that will also likely address U.S. accusations that Beijing gained access electronically to Pentagon weapons designs.

IP theft costs U.S. businesses $320 billion a year, equal to the annual worth of U.S. exports to Asia, authors of a recent report say.

A bipartisan group of high-ranking former U.S. officials known as the Commission on the Theft of American Intellectual Property said in a May report that China accounts for between 50 percent and 80 percent of IP theft suffered by U.S. firms.

Cyber theft of industrial designs, business strategies and trade secrets is only a portion of IP pilfering.

IP theft more commonly involves “planted employees, bribed employees, employees who were appealed to on the basis of nationalism and all the traditional means of espionage, often accompanied by cyber,” said Richard Ellings, president of the National Bureau of Asian Research think tank, who co-wrote the report.

The U.S. District Court in Manhattan charged three New York University researchers in May with conspiring to take bribes from Chinese medical and research outfits for details about NYU research into magnetic resonance imaging technology.

Arrests by U.S. Immigration and Customs Enforcement and the Homeland Security Department for IP infringements rose 159 percent and indictments increased 264 percent from 2009-13, according to a report released in June by the U.S. Intellectual Property Enforcement Coordinator.

The Commission on the Theft of American Intellectual Property called for tough penalties including banking sanctions, bans on imports and blacklisting in U.S. financial markets.



Special Report: Cyber Priorities

Snowden Incident Returns Spotlight to Employee Danger


Defense News

Jul. 9, 2013 – 06:00AM



WASHINGTON — Edward Snowden, the leaker currently stuck in Russia who disclosed a wide range of secrets about US government surveillance and spying, has changed the conversation about cybersecurity. Not because of the documents he released, but as a reminder of the vulnerability organizations have to the threat of insiders with access to large swathes of information and system components.

It’s a lesson that was the talk of the cyber community following the WikiLeaks disclosures attributed to Bradley Manning, but one that faded as experts began to focus on the growing threat of foreign governments, particularly China. It is back in vogue because of the volume and sensitivity of the information Snowden has made public.

Some of the fallout from the Manning case, such as the banning of thumb drives and other external media from sensitive systems, has been walked back in some instances in the name of practicality. One of the problems, as with any security issue, is that you can’t make a network truly safe from an insider.

“It’s akin almost to insider attacks in Afghanistan,” Army Gen. Martin Dempsey, chairman of the US Joint Chiefs of Staff, said during a late June speech. “Well, the answer is that you can’t prevent it. You can mitigate the risk, and what I’d like you to take away from this conversation about the incident with Snowden is you can’t stop someone from breaking the law 100 percent of the time. You just can’t stop that from happening.”

Dempsey did, however, suggest steps to reduce the threat of insiders to Defense Department networks, including cutting the number of people in positions like Snowden’s.

“I think systems administrators is the right place to begin to clean this up because they have such ubiquitous access, and that’s how he ended up doing what he did,” he said. “We really need to take advantage of thin client and cloud technology, to dramatically reduce the number of systems administrators that we have managing programs, which will make it both more effective and safer.”

That approach carries its own risk, because access will be concentrated in the hands of fewer individuals, said Jeff Moulton, director of information operations at Georgia Tech Research Institute.

“What they’ve done now is rather than mitigating the threat, they’ve increased the likelihood of a catastrophic impact from a threat,” he said. “It’s not going to help. It introduces other problems, like the broader access of the cloud.”

One idea suggested by several cyber experts, including Moulton, is to adopt nuclear launch security as a guide. When it comes to the use of nuclear weapons, two separate individuals have to provide authentication before a weapon can be used. Not only does this prevent accidents, but it guarantees that a second person will be monitoring the activity of the first.

In the cyber realm, this could be achieved by requiring two people to provide their security credentials before either could access certain kinds of documents or segments of the network control system.
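A minimal sketch of that two-person rule follows, assuming a simple credential table; the user names and tokens are hypothetical, and a real system would use proper authentication rather than shared secrets.

```python
# Illustrative two-person rule: a sensitive operation unlocks only when two
# DISTINCT authorized users each present a valid credential. The table of
# users and tokens below is invented for the example.

AUTHORIZED = {"alice": "tok-a", "bob": "tok-b", "carol": "tok-c"}

def dual_access(request_a, request_b):
    """Grant access only for two valid credential pairs from different people."""
    (user_a, tok_a), (user_b, tok_b) = request_a, request_b
    if user_a == user_b:
        return False                      # one person cannot approve alone
    return AUTHORIZED.get(user_a) == tok_a and AUTHORIZED.get(user_b) == tok_b

assert dual_access(("alice", "tok-a"), ("bob", "tok-b")) is True
assert dual_access(("alice", "tok-a"), ("alice", "tok-a")) is False  # same person
assert dual_access(("alice", "tok-a"), ("bob", "wrong")) is False    # bad credential
```

As with the nuclear analogy, the point is less the credential check itself than the guarantee that a second person witnesses the first person’s access.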

“Is it time consuming? Perhaps,” Moulton said. “But what’s more time consuming, doing this or armchair quarterbacking?”

Still, there will always be a residual threat from insiders, which is why deterrence is key, said Ian Wallace, a visiting fellow with the Brookings Institution and a former official with the British Ministry of Defence.

“The insider threat will always exist, and it will be next to impossible to stop it completely,” Wallace said. “But there are also plenty of ways in which that can be deterred. Not the least of those is the traditional deterrent of getting caught and prosecuted, something which is even more likely with the emergence of companies doing big data analysis of behavior on their own systems.”

Wallace cautioned that all of this attention on the insider threat may be misguided. Statistically, insider attacks are exceedingly rare, even if the data that is lost or the risk to systems from a determined insider is significant.

“All of the evidence that I have heard from the best cybersecurity firms suggests that the main threat is still the remote threat, for three compelling reasons: the risk of being caught is much less, it is much more scalable, and at present it is still, sadly, relatively easy for a sophisticated and determined intruder to get into all but the best protected systems,” Wallace said.

In the hunt for solutions to the insider threat, one of the big questions is how to detect intent from an employee ahead of a problem. In much the same way that concerns have surfaced about what radicalized the Boston bombing suspects and whether it could have been detected earlier, experts are studying how to discover the intentions of insider threats sooner.

That can take the form of such mundane facts as the speed at which an employee types. Changes in the rate of typing can indicate mood, a tip that further inquiry might be needed.
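One simple way to operationalize that idea is to compare a user’s current typing speed against their own historical baseline and flag large deviations. The sketch below is a hypothetical illustration, not any agency’s actual method; the threshold and sample data are invented.

```python
# Hypothetical anomaly check on typing speed: flag a session when the current
# rate deviates sharply (by z-score) from the user's historical baseline.
import statistics

def is_anomalous(history_wpm, current_wpm, z_cutoff=2.0):
    mean = statistics.mean(history_wpm)
    stdev = statistics.stdev(history_wpm)
    if stdev == 0:
        return current_wpm != mean
    return abs(current_wpm - mean) / stdev > z_cutoff

baseline = [62, 65, 60, 63, 64, 61, 66]   # invented words-per-minute history
assert is_anomalous(baseline, 63) is False   # within the normal range
assert is_anomalous(baseline, 30) is True    # sharp slowdown, worth a look
```

A flag from a check like this would only be the trigger for further inquiry, which is exactly where the legal and ethical questions below come in.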

But to gain that type of data, a certain degree of invasiveness is required, and some superficial profiling of behavior is employed.

That creates all kinds of legal and ethical questions but may be a necessity for large organizations with many people to monitor, Moulton said.

“You can’t monitor everybody all the time,” he said. “Look at what the casinos do. They profile, but that’s a really difficult word. Are we prepared to profile?”

Dempsey emphasized that some actions would be taken to improve the system, but he described a certain degree of risk acceptance.

“You can certainly increase the scrutiny in terms of their background investigations, you can reduce the number of them you get, there are different degrees of oversight in place,” he said. “But at some point, if somebody is going to break the law and commit an act of treason (I don’t know what he’ll eventually be charged with) or espionage, they’re going to be able to do that.”



DOD building its own secure 4G wireless network

By Kathleen Hickey

Jul 03, 2013


The Defense Department expects to have its own secure 4G wireless network up and running by the middle of next year, hosting a variety of iPhones, iPads and Android devices.

The network is part of DOD’s four-year, $23 billion investment in cybersecurity, which also calls for hiring an additional 4,000 people for its cyber workforce, establishing common standards and improving coordination in investing and managing cyber resources, Gen. Martin Dempsey, chairman of the U.S. Joint Chiefs of Staff, said in a recent speech given at the Brookings Institution.

Dempsey said he had a secure mobile phone that “would make both Batman and James Bond jealous.”

Dempsey also spoke about creating a federal app store using off-the-shelf technology to “allow any DOD user to write and share phone and tablet apps.” On June 28, the Defense Information Systems Agency announced it awarded Digital Management, Inc. a $16 million contract to build the DOD’s first enterprisewide mobile application store and mobile device management system.

The secure 4G network is part of the DOD’s Joint Information Environment initiative to consolidate its 15,000 networks into a cloud environment.

“The new Joint Information Environment will deepen collaboration across the services and mission areas. It will also be significantly more secure, helping ensure the integrity of our battle systems in the face of disruption,” said Dempsey.

A few news outlets, such as TechInvestorNews, speculated whether the network was a ploy by DOD to exclude itself from the National Security Agency’s surveillance program, since its calls would not go through Verizon or other commercial carriers from which NSA collects metadata.

But the network could also just be a sign of DOD recognizing the growing importance of mobile computing. The military has long had its own non-classified and classified IP networks — NIPRnet and SIPRnet. As it uses more smart phones and tablets, that approach to security is extending to mobile.

Since Dempsey was appointed chairman in 2011, critical infrastructure attacks have increased 17-fold, he said at Brookings, although he did not specify the exact number of attacks, nor how many occurred prior to his taking office.

“Cyber has escalated from an issue of moderate concern to one of the most serious threats to our national security,” he said. And in addition to military systems, securing civilian infrastructure and businesses, such as those in the banking, chemical, electrical, water and transport sectors, is vitally important.

“Although we have made significant progress embracing cyber within the military, our nation’s effort to protect civilian critical infrastructure is lagging,” Dempsey said. “Too few companies have invested adequately in cybersecurity.”

“One of the most important ways we can strengthen cybersecurity across the private sector is by sharing threat information. Right now, threat information primarily runs in one direction — from the government to operators of critical infrastructure. Very little information flows back to the government,” he said. “This must change. We can’t stop an attack we can’t see.”


Commentary: Can Driverless Cars Save the Postal Service?

By Samra Kasim and Matt Caccavale

July 5, 2013

Ding! That sound could soon be the USPS app alerting you to an imminent delivery, after which a driverless Postal Service vehicle arrives at your door and a robotic arm delivers your package.

While this may sound like science fiction, driverless vehicles will be coming to streets near you sooner than you may think. Sixteen states already have introduced driverless vehicle legislation and California, Nevada, Florida, and the District of Columbia have enacted laws allowing driverless vehicles on their roads. Sergey Brin, co-founder of Google and a driverless vehicle advocate, forecasts fully autonomous vehicles will be available for sale in five years.

Driverless vehicles have the potential to transform many enterprises focused on transporting goods. The Postal Service’s fleet of 215,000 vehicles traveled over 1.3 billion miles in 2012, roughly equivalent to circumnavigating the globe 172 times every business day. Driverless vehicles could reduce operating costs through increased safety, fuel efficiency, and new business models. After posting a quarterly loss of $1.9 billion in May, it’s time for USPS to explore reinvention.

Think about what a day in the life of a USPS driverless vehicle might look like:

12:18 a.m. The latest software package with updated mapping information and the day’s optimized delivery route is downloaded directly from fleet headquarters.

12:30 a.m. The vehicle begins delivery on its suburban route — the pre-determined optimal time for mail delivery on that particular day.

5:00 a.m. A local bakery’s two-hour reservation through USPS’s CloudCar program begins and the vehicle delivers bread to grocers around town. Since the bakery owner no longer has to maintain his own fleet of delivery trucks, he can hire two more bakers and double production.  

7:22 a.m. The vehicle stops at a full service gas station, refuels and reports a maintenance diagnostic assessment to fleet headquarters, allowing USPS to forecast maintenance requirements and plan accordingly.

11:13 a.m. After completing initial deliveries, the car is identified as available. Just then, a business executive pulls up the USPS mobile app on her phone, checks in at her current location and orders a rush delivery of a time-sensitive document.

3:15 p.m. While en route, the car’s sensors detect a large pothole, triggering an automatic report to the local transportation department with geotagged images of the hazard.

4:18 p.m. A businessman suddenly remembers that today is his anniversary. He places an order at a local florist, who has an urgent delivery contract with USPS’s new dynamic pricing system. The vehicle stops at the florist and is then routed to the spouse’s residence.

7:14 p.m. After completing its custom delivery orders and returning to the USPS regional warehouse, the vehicle sends its daily diagnostic report to fleet headquarters, and begins the next round of deliveries.

While this is only a thought experiment, the potential for new operating models and cost savings is very real.

Removing the driver from a vehicle enables it to be used around-the-clock. Routes could be designed around optimal traffic patterns and delivery needs. Driverless vehicles also could be used as a shared service with other businesses and government agencies leasing time when the vehicles are available, similar to the Uber Taxi model. With its significant vehicle fleet and 42,000 ZIP code reach, the Postal Service is well positioned to pilot new service models. It could, for instance, coordinate with auto manufacturers and the State of California to test the readiness of its highways for driverless cars.

Driverless vehicles also have the potential to reduce vehicle operating costs. In 2012, Google reported that after 300,000 miles of driving, its driverless cars had not been involved in any accidents. Computer control of cars mitigates human error, such as fatigue or distraction, leading to greater safety. Vehicle accidents and wear-and-tear create significant operating costs for large enterprises like USPS. In FY 2011 alone, USPS had over 20,000 motor vehicle accidents. According to OSHA, the average vehicle crash costs an employer $16,500. The average cost skyrockets to $74,000 when an employee has an on-the-job crash resulting in injury. With fewer vehicle-related accidents, USPS could see substantial cost savings.
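
The figures above make the scale of potential savings easy to check. A back-of-the-envelope calculation, using only the numbers cited in the text (treating every crash as injury-free is our simplifying assumption), puts a floor under USPS's annual accident bill:

```python
# Rough lower bound on USPS accident costs, using the OSHA averages
# cited above. Assuming all 20,000 crashes are injury-free is a
# deliberate simplification; injury crashes cost far more.
accidents_fy2011 = 20_000
cost_per_crash = 16_500         # OSHA average cost to an employer
cost_per_injury_crash = 74_000  # OSHA average when an injury results

baseline = accidents_fy2011 * cost_per_crash
print(f"${baseline / 1e6:.0f}M")  # → $330M even if no crash caused an injury
```

Every injury crash replaces a $16,500 line item with a $74,000 one, so the true figure is higher; either way, even a modest reduction in accident rates would recover tens of millions of dollars a year.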

As gas prices continue to climb, fuel is another major cost for large fleet operators. The Postal Service spent nearly $500 million on fuel in 2011 and another $614 million on maintenance. With an average vehicle age of 16 years, fuel and maintenance costs will continue to climb. A Columbia University study found that “cars simply managing their own speed would increase efficiency by an appreciable 43 percent.” Further, the study estimated that once enough driverless vehicles are on the road to platoon with one another, energy savings may jump to 273 percent.

Federal agencies have long promoted innovative technologies, from GPS to the Internet. As the largest purchaser of goods and services and operator of the largest vehicle fleet in the world, the federal government and USPS have the potential to usher in the driverless car revolution.


Sources: DoD Considers 3 Options for JIEDDO

Defense News

Jul. 6, 2013 – 06:00AM |


WASHINGTON — Senior US defense officials are preparing to determine the future of a powerful, high-profile Pentagon organization that has spent nearly a decade developing equipment, tactics and training to defeat roadside bombs.

Last month, House lawmakers included a provision in their version of the 2014 defense authorization bill that requires the Defense Department to provide a report on the future of the Joint Improvised Explosive Device Defeat Organization (JIEDDO).

At a time when the Pentagon is facing hundreds of billions of dollars in spending cuts over the next decade, senior military leadership is said to be considering three options for restructuring JIEDDO: eliminate the organization; break up its duties among the military services through a process called disaggregation; or restructure JIEDDO into a smaller office within the Office of the Secretary of Defense (OSD).

In March 2011, then-Defense Secretary Robert Gates called for the elimination of the JIEDDO director billet, a position held by four different three-star generals since 2008. The elimination would be “based upon deployment of forces and IED threat,” Gates wrote in a memo at the time.

But supporters of JIEDDO said the counter-IED mission must be preserved through the Quadrennial Defense Review, which lays out future US military strategy and is due to Congress early next year. These supporters point to recent intelligence assessments that say terrorist networks will continue to use IEDs against the United States and its allies.

“We have to realize that the IED is part of our operational environment now,” said retired Army Command Sgt. Maj. Todd Burnett, a former senior enlisted adviser to JIEDDO.

A May Center for Naval Analyses assessment of the “post-Afghanistan IED threat” found the IED will likely persist in the coming years.

With that in mind, JIEDDO supporters argue that the third option — creating a smaller office within OSD — would be best.

“DoD needs a small, scalable, agile, OSD-level organization with special authorities, ramp-up ability and flexible funding to implement and synchronize … enduring counter-IED capabilities,” a defense official said.

Since its birth in 2006, JIEDDO has spent about $20 billion, according to budget documents. Spending peaked near $4 billion in 2008, around the time of the surge in Iraq. Since then, spending has declined to about $2 billion. A scaled-down counter-IED organization would likely cost about one-fourth of that, a defense official said.

Officials close to JIEDDO said the office has already cut costs, and they point to the cancellation this year of a number of underperforming programs.

These cancellations have allowed the office to reinvest more than $289 million in training and to purchase reconnaissance robots and bomb-detection equipment. The JIEDDO office is expected to cut 22 percent of its staff by September, a reduction expected to save $163 million.

The majority of the money spent by JIEDDO has gone toward what it calls defeating the device, or purchasing systems and equipment to detect or protect soldiers from IEDs. This includes purchases of robots, electronic jammers, vehicles and even aerostats.

The equipment includes both US and foreign-made systems, such as more than 800 British-built Self-Protection Adaptive Roller Kits, giant rollers that can be mounted on vehicles to detect roadside bombs.

The rest of the funding has gone toward intelligence used to go after IED networks and training equipment.


The Options on the Table

In January, the Joint Requirements Oversight Council, a panel that vets military requirements, said the Pentagon must maintain counter-IED capabilities, including the ability to identify threat networks that employ or facilitate IEDs, detect bombs and components, prevent or neutralize bombs, mitigate explosive device effects, distribute bomb-related data across the community of interest and train personnel in counter-IED capabilities.

Since then, three options have emerged as likely courses of action, sources say.

The first — eliminating JIEDDO and its mission — is not likely, a defense official said. The two more likely courses of action are scaling down the existing organization or delegating the training and equipping mission to the services through disaggregation.

If the disaggregation option is chosen, many of JIEDDO’s components could be split among the services, with acquisition authority most likely going to the Army, the official said.

JIEDDO reports to OSD and has special acquisition authority, allowing decisions and purchases to move more quickly.

Through disaggregation, each of the services would likely be responsible for its own training, which supporters of JIEDDO said means different methods and equipment might be used.

Also unclear is how the intelligence apparatus within the organization would be restructured.

The other option is consolidating JIEDDO into a smaller OSD-level organization. An organization under this framework would be best equipped to rapidly procure counter-IED equipment, officials said. The special acquisition authority used by JIEDDO could be applied to this organization, allowing it to field equipment more quickly.

JIEDDO’s goal is to field what it calls capabilities in four to 24 months. After that time frame, initiatives typically become official programs of record or are terminated.

A review of 132 deployed initiatives showed that 93 — with a total price tag of $5.9 billion — proved “operationally effective.” An additional 18, costing $900 million, were “operationally effective with some limitations in capability.” The remaining 21 — totaling $400 million — were “not operationally proven” or lacked evaluation information.

A key aspect of JIEDDO likely to be retained in a consolidated organization is the Counter-IED Operations/Intelligence Center (COIC). The center provides operational intelligence and analysis on threat networks to commanders in the field by fusing more than six dozen data sources.

The COIC also regularly interacts with more than two dozen US government intelligence agencies and international partners, including Canada, the UK, Australia and NATO.


An International Problem

IEDs are seen as a threat globally, not just in places like Iraq and Afghanistan. Since January 2011, more than 17,000 IED “events” have occurred in 123 countries, according to David Small, a JIEDDO spokesman. Outside Afghanistan, there are an average of 700 IED events each month.

Between December 2012 and May, Iraq experienced 3,352 incidents, the most of any country other than Afghanistan. Colombia experienced 1,005 during that period, with Pakistan third at 883. Syria, which has been in the midst of a civil war, has experienced 382 IED incidents.

In May, JIEDDO signed an agreement with Pakistan to minimize the IED threat. The arrangement allows sharing of information, including tactics, findings from IED incidents, lessons learned, information about IED financiers and information about the flow of IED materials.

Joe Gould contributed to this report.



Activity-Based Intelligence Uses Metadata to Map Adversary Networks

Defense News

Jul. 8, 2013 – 02:59PM |

By Gabriel Miller     


Few outside the intelligence community had heard of activity-based intelligence until December, when the National Geospatial Intelligence Agency awarded BAE Systems $60 million to develop products based on this newish methodology. But ABI, which focuses not on specific targets but on events, movements and transactions in a given area, is rapidly emerging as a powerful tool for understanding adversary networks and solving quandaries presented by asymmetrical warfare and big data.

Indeed, ABI is the type of intelligence tool that could be applied to the vast wash of metadata and internet transactions gathered by the NSA programs that were disclosed in June by a whistle-blower.

In May, the U.S. Geospatial Intelligence Foundation’s Activity-Based Intelligence Working Group hosted a top-secret forum on ABI that drew representatives from the “big five” U.S. intelligence agencies.

At the SPIE 2013 Defense, Security + Sensing Symposium on May 1, NGA Director Letitia Long said the agency is using ABI to “identify patterns, trends, networks and relationships hidden within large data collections from multiple sources: full-motion video, multispectral imagery, infrared, radar, foundation data, as well as SIGINT, HUMINT and MASINT information.”

The technique appears to have emerged when special operators in Iraq and Afghanistan reached back to NGA analysts for help plugging gaps in tactical intelligence with information from national-level agencies. These analysts began compiling information from other intelligence disciplines — everything from signals intelligence and human intelligence to open sources and political reporting — and geotagging it all. The resulting database could be queried with new information and used to connect locations and establish a network.
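
The workflow described above — geotag everything, then query the database with each new report — can be sketched in a few lines. This is an illustrative toy, not NGA's actual system; the record fields and the one-kilometer radius are our assumptions:

```python
from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt

@dataclass
class Record:
    entity: str   # person, vehicle, facility...
    lat: float
    lon: float
    source: str   # SIGINT, HUMINT, open source, political reporting...

def distance_km(a: Record, b: Record) -> float:
    # Haversine great-circle distance between two geotagged records.
    dlat = radians(b.lat - a.lat)
    dlon = radians(b.lon - a.lon)
    h = sin(dlat / 2) ** 2 + cos(radians(a.lat)) * cos(radians(b.lat)) * sin(dlon / 2) ** 2
    return 2 * 6371 * asin(sqrt(h))

def nearby(db: list[Record], query: Record, radius_km: float = 1.0) -> list[Record]:
    # Querying the geotagged database with a new report surfaces every
    # record collected near the same spot, whatever discipline it came from.
    return [r for r in db if distance_km(r, query) <= radius_km]
```

Each hit ties the new report's location to previously collected entities; accumulating those links across many queries is what lets analysts connect locations and establish a network.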

This experience led to a series of seminal white papers published in 2010 and 2011 by the Office of the Undersecretary of Defense for Intelligence. The papers call ABI “a discipline of intelligence where the analysis and subsequent collection is focused on the activity and transactions associated with an entity, population, or area of interest.”

This focus on interactions is the fundamental difference between ABI and previous efforts to integrate different types of intelligence, which were often confined to a single agency and aimed at a specific target.

“When we are target-based, we focus on collecting the target and, too often, we are biased toward what we know and not looking for the unknown,” NGA’s Dave Gauthier said last year at GEOINT 2012. Gauthier, who handles strategic capabilities in the agency’s Office of Special Programs, called ABI “a rich new data source for observing the world and the connectedness between objects and entities in the world.”

ABI attempts to meet two challenges facing traditional intelligence-gathering. First, there are no clear signatures for, and no doctrine governing, the activities of the nonstate actors and insurgents who have emerged as the most important threats to U.S. national security. Second, the volume of big data has become “staggering,” in Gauthier’s words. Take, for example, the recent bombing in Boston: There was a massive amount of surveillance imagery available, but analysts initially had no idea whom they were looking for, and moreover, the suspects turned out to look little different from thousands of other spectators on hand.


“ABI came out of the realization that the scheduled, targeted, one-thing-at-a-time, stove-piped analysis and collection paradigm was not relevant to non-nation-state and emergent threats,” said Patrick Biltgen, a senior engineer in the intelligence and security sector at BAE Systems. “We are breaking this one-thing-after-another paradigm because information is flowing … all the time and we don’t know what to do with it because if you’ve stopped to try and collect it, you’ve missed everything else that’s coming.”



Though the USD(I) white papers call ABI a new discipline, many prefer to think of it more as a methodology with several components.

The first is the constant collection of data on activities in a given area, which is then stored in a database for later metadata searches. The NGA’s Long recently said the agency is working to create a “model that allows us to ‘georeference’ all of the data we collect persistently — over a long period of time,” one that allows “analysts to identify and evaluate data down to the smallest available object or entity.”

The second is the concept of “sequence neutrality,” also called “integration before analysis.”

“We collect stuff without knowing whether it’s going to be relevant or not. We may find the answer before we know the question,” said Gregory Treverton, who directs the Rand Center for Global Risk and Security. “It’s also not so driven by collection; the collection is just going to be there.”

The third is data neutrality — the idea that open-source information may be just as valuable as HUMINT or classified intelligence.

“Humans, unlike other entities, are inherently self-documenting. Simply being born or going to school, being employed, or traveling creates a vast amount of potentially useful data about an individual,” the white papers say. This tendency has exploded on the Internet, “where individuals and groups willingly provide volumes of data about themselves in real time — Twitter and social network forums like Facebook and LinkedIn are only a few examples of the massive amounts of unclassified data that is routinely indexed and discoverable.”

Finally, there is knowledge management, which covers everything from the technical architecture that makes integrated intelligence and information-sharing possible to the metadata tagging that allows analysts to discover data that may be important, but not linked spatially or temporally.



ABI products take the form of customizable Web-based interfaces that allow analysts to locate associations among data sets using metadata.

“You could call them Web services, apps, widgets, but they help analysts sift through large volumes of data,” said BAE Systems’ Biltgen.

These do not compete with giant systems like the armed services’ Distributed Common Ground Systems, end-to-end databases that connect thousands of users with intelligence information. Rather, they are generally designed to plug into DCGS, then help smaller working groups deal with specific problems.

“Really, what we’re doing is working with the metadata — the dots and the indexes and extracted ‘ABI things’ — to get those on the screen, whereas the large systems really manage streams of imagery for exploration,” Biltgen said. “We go, ‘Let’s take clip marks and the tags that come from exploited video streams and look at all of them at the same time without ever having to touch a frame of video.’”

He said the goal is to “precondition the data and make it easier for the analyst to correlate them, apply their cultural awareness and knowledge to them, and really put the thought muscle on the data after it’s been well conditioned.”

So what does ABI actually produce? One common format is activity layer plots. An analyst might, for example, place all available intelligence about an explosion of an improvised explosive device atop information about a kidnapping in the same area, then lay in data about the local bus line, the fruit market at the corner, or the local timber-smuggling operation. Once displayed, the information may overlap or intersect in interesting ways.
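
A crude way to picture an activity layer plot: bucket each layer's geotagged events into grid cells and flag the cells where layers coincide. The layer names, coordinates and 0.01-degree cell size below are invented for illustration:

```python
from collections import defaultdict

def layer_plot(layers: dict[str, list[tuple[float, float]]], cell: float = 0.01):
    # layers maps a layer name ("IED", "kidnapping", "bus line", ...) to
    # that layer's (lat, lon) events. Events are snapped to a coarse grid,
    # and only cells touched by more than one layer are reported -- the
    # interesting overlaps an analyst would inspect first.
    grid = defaultdict(set)
    for name, events in layers.items():
        for lat, lon in events:
            grid[(round(lat / cell), round(lon / cell))].add(name)
    return {c: sorted(names) for c, names in grid.items() if len(names) > 1}
```

A real plot keeps far more context (time, source, confidence), but the principle is the same: co-location across layers, not any single layer, is what generates leads.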

To date, ABI has primarily been used in the kinds of operations that have defined Iraq and Afghanistan: manhunting and uncovering insurgent networks. But because ABI is more a methodology than a discipline, and because the products that enable ABI are customizable, the intelligence community sees ABI applied to a broad range of problems.

“The immediate question is, can we expand it beyond counterterrorism and manhunting and the fight against terror?” Treverton said.

He suggested applications such as maritime domain awareness, in which signatures exist for Chinese frigates but not junks.

ABI can theoretically be brought to bear on any problem that might be aided by a “pattern of life” analysis, a prominent phrase in the white papers. In finance, for example, ABI might identify patterns left by a particular kind of criminal.

“You could use this in the insurance industry to try and understand the patterns of life of individuals that steal things from you and make false claims. We do some of that work today,” Biltgen said.

While ABI can help anticipate patterns, advocates don’t claim it can predict future behavior.

“I wouldn’t call it predictive,” Treverton said. “I wouldn’t call anything predictive. That’s asking way too much.”

Still, it may help officials anticipate threats by building a deep understanding of the networks that give rise to specific incidents.



Two things could hinder ABI — one technical, one cultural.

It sounds relatively uncomplicated to develop a visual network, say, by tracing all of the tire tracks captured by wide-area motion video in a given area over a period of time. Origins and destinations become nodes, and hundreds or even thousands of tire tracks describe a network from which analysts can extract meaning. But the devil is in the details. For example, it is difficult to define a “vehicle stop” in an algorithm, much less assign meaning to it. Does a “stop” last five seconds or one minute?
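
The ambiguity is easy to see in code. Even a minimal stop detector needs at least two arbitrary thresholds — how slow counts as stopped, and for how long — and every choice changes which “stops” exist. The values below are placeholders, not anyone's doctrine:

```python
def detect_stops(track, min_duration_s=30.0, max_speed_mps=0.5):
    # track: chronological list of (time_seconds, speed_m_per_s) samples.
    # Both thresholds are arbitrary -- exactly the definitional problem
    # described above.
    stops, start = [], None
    for t, speed in track:
        if speed <= max_speed_mps:
            if start is None:
                start = t          # vehicle just came to rest
        else:
            if start is not None and t - start >= min_duration_s:
                stops.append((start, t))
            start = None
    if start is not None and track and track[-1][0] - start >= min_duration_s:
        stops.append((start, track[-1][0]))
    return stops
```

Raise `min_duration_s` from 30 to 60 seconds and a 40-second pause at a checkpoint stops being a “stop” at all; assigning meaning to the stops that survive is harder still.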

“It sounds easy, until you touch the data. You realize that every proposition in that value chain has hidden complexity,” said Gary Condon, an intelligence expert at MIT’s Lincoln Lab, at GEOINT 2012.

The second set of issues is cultural. Even in the post-9/11 era, legal boundaries and security clearances can prevent the kind of data-sharing that makes ABI work. The quantity of publicly available information swells by the day, but the intelligence community still often prizes classified over open-source information. And just as complex: Some of that open-source intelligence raises privacy concerns when U.S. persons are involved.

That’s been at the heart of the outcry over the NSA’s Prism program and phone-record collection.

Still, top-level intelligence officials see ABI as a valuable new tool. Several senior officials from the Office of the Director of National Intelligence remarked on its growing importance at the U.S. Geospatial Intelligence Foundation forum in early May.

“The defense and intelligence worlds have undergone, and are still undergoing, a radical transformation since the events of 9/11. The Department of Defense and the Director of National Intelligence have made information sharing and efficiency priorities,” an ODNI spokesman said. “This will increase collaboration and coordination, which will have a multiplying effect on approaches such as ABI.”



Analysis: Policies and Opportunities That Will Shape Cybersecurity Spending

Special to Homeland Security Today

By: Stephanie Sullivan, immixGroup Inc.

07/08/2013 (11:16am)

Editor’s Note: Homeland Security Today has partnered with immixGroup Inc. to bring you exclusive market insight and analysis.

In this installment, Stephanie Sullivan, Market Intelligence Consultant, offers a look at the major White House and Congressional efforts impacting cybersecurity programs throughout the federal government, as well as some of the main contracting opportunities on the cyber horizon.


As cyber threats continue to dominate the headlines, it is important for the innovators in the government security market to understand how the legislative and executive branches are tackling cybersecurity and the potential ramifications of these efforts for industry.  

FY14 Legislation Impacts on Cyber

Several legislative and executive directives could impact the commercial-off-the-shelf (COTS) vendor community in FY14; they aim to encourage the adoption of cybersecurity best practices on a voluntary basis. The underlying motivation of these directives is to spur industry and government collaboration on information sharing and defending networks.

The framework proposes to allow intelligence gathering on cyber-attacks and cyber threats, as well as address network security gaps in critical components of U.S. infrastructure, including banking, utility, and transportation networks.

NIST in collaboration with GSA, DOD, and DHS released a Request for Information (RFI) last February in order to gather feedback from industry and relevant stakeholders regarding the development of the framework, and has been holding a series of workshops to identify priority elements the framework must address.

An initial draft of the framework was publicly released on July 1, with revisions expected following the third Cybersecurity Framework Workshop, held July 10-12 in San Diego. The draft will be expanded and refined ahead of a fourth workshop anticipated in September. Additional milestones include a preliminary version due in October and a final version expected in February 2014.

Keep an eye on this – participating in stakeholder engagements and familiarizing yourself with the draft guidelines will be critical to all COTS vendors, because you need to understand how your products and solutions can enhance the framework and meet these ‘voluntary’ but critical security needs. After all, the end goal of these working groups will be to eventually bake cybersecurity standards into federal acquisitions to ensure cyber protection.

  • Presidential Policy Directive 21 (PPD-21) on Critical Infrastructure Security and Resilience replaces and updates the 2003 Homeland Security Presidential Directive 7 (HSPD-7), and was also issued on February 12, 2013 as a complement to the Cybersecurity Executive Order. PPD-21 defines what critical infrastructure is and encourages the federal government to strengthen the security and resilience of its own critical infrastructure, as outlined in the directive’s three strategic goals. It also designates sector-specific agencies (SSAs) for critical infrastructure segments, and mandates information sharing and cooperation among the SSAs, state and local organizations, and international partners.

The new policy establishes “national critical infrastructure centers” in the physical and cyber space designed to promote information sharing and collaboration, as well as ordering the State Department to work with DHS on issues of international interdependencies and multi-national ownership, and growing concerns of the global economy. However, some speculate that not enough has changed from the former Presidential Directive to be truly noteworthy.

  • The Cyber Intelligence Sharing and Protection Act (CISPA) is a bill designed to encourage voluntary information sharing between private companies and the government in order to gain insight into incoming cyber threats. In an ideal scenario, a private company like Amazon or Google would identify unusual network activity that may suggest a cyber attack and alert the government; if the government detected a threat to a private business network, it would share its findings.

The bill was originally introduced into Congress last year, but privacy concerns proved to be a major roadblock, and the bill didn’t make it to the Senate floor. The bill could meet the same fate this year, even after it was passed by the House of Representatives on April 18, 2013. The NSA PRISM program has halted any movement regarding cybersecurity legislation until at least September, if not further down the road due to increased scrutiny of private information sharing.

  • The National Defense Authorization Act (NDAA) also carries cyber provisions. One of note calls for mandatory reporting by defense contractors when there has been a successful cyber penetration. The NDAA also calls for improved monitoring and alert technologies to detect and identify cybersecurity threats from both external sources and insider threats, and it contains a provision aimed at addressing longstanding concerns over elements of the Pentagon’s supply chain, hinting that statutory requirements may be down the road. DOD is encouraged to cooperate with industry.

FY14 Federal IT Sales Opportunities in Cyber

The federal government plans to spend about $13 billion on cybersecurity in FY14, reflecting the fact that cybersecurity continues to be a strategic concern for federal agencies. Just as important, cybersecurity will benefit from bipartisan reluctance to curb spending in this high-profile area. Fiscal constraints do exist, however, and agencies will have to be circumspect in how they earmark money. The following are a small selection of programs with significant cybersecurity requirements and large allocations for new starts. It is important to understand which programs have funding and map your solutions to these programs.

FY14 Opportunities: Civilian

Funded cybersecurity opportunities within the civilian arena can be found in almost every Executive Branch agency. Below are the top three civilian programs by Development, Modernization and Enhancement (DME) funding – money used to buy new products.

  • Department of Homeland Security (DHS) National Protection and Programs Directorate (NPPD) – The Continuous Diagnostics and Mitigation (CDM) program is the agency’s largest cybersecurity investment, dedicated to continuous monitoring, diagnosis and mitigation activities to strengthen the security posture across the federal .gov domain. This investment will assist DHS in overseeing the procurement, operations and maintenance of sensors and dashboards deployed to federal agencies.
    • FY14 DME IT spend for CDM is $121.4 million
  • Department of Commerce (United States Patent and Trademark Office (USPTO)) – The Network and Security Infrastructure (NSI) investment covers the IT operations and services provided to the USPTO and external customers by the OCIO. Enhancements and upgrades of this IT infrastructure will include firewall enhancements, antivirus software, network security, data protection and compliance.
    • FY14 DME IT spend for NSI is $89.5 million
  • DHS (NPPD) – The National Cyber Security Division, through its National Cybersecurity Protection System (NCPS), operationally known as ‘Einstein’, protects federal civilian departments’ and agencies’ IT infrastructure from cyber threats. Potential FY14 requirements for this program could include intrusion prevention, intrusion detection and advanced cyber analytics.
    • FY14 DME IT spend for NCPS is $72 million

FY14 Opportunities: Defense

Generally speaking, cybersecurity opportunities within the Department of Defense can be found within major network and infrastructure programs. Below are the top three defense programs by Development, Modernization and Enhancement (DME) funding – money used to buy new products.

  • Warfighter Information Network Tactical System Increment (WIN-T): High speed, high capacity tactical communications network serving as the Army’s cornerstone tactical communications system through 2027. Developed as a secure network for video, data, and imagery linking mobile warfighters in the field with the Global Information Grid. Potential FY14 procurements include firewall enhancements, intrusion protection and detection, continuous monitoring, and encryption.
    • FY14 DME IT spend for WIN-T is $815.4 million
  • Next Generation Enterprise Network (NGEN): An enterprise network that will replace the largest intranet in the world, the Navy Marine Corps Intranet, providing secure, net-centric data and services to Navy and Marine Corps personnel. NGEN forms the foundation for the Department of the Navy’s future Naval Network Environment. HP was recently awarded the contract, potentially worth up to $3.5 billion. The entire gamut of information assurance requirements is at play here, particularly given NGEN’s heavy reliance on cloud technology.
    • FY14 DME IT spend for NGEN is $195.05 million
  • Consolidated Afloat Networks Enterprise Services (CANES):  Consolidates the Navy’s multiple afloat networks into one network. CANES replaces these existing networks with new infrastructure for applications, systems, and services and will improve interoperability along the way. The RFP is currently out with an award expected this winter.
    • FY14 DME IT spend for CANES is $195.1 million


About immixGroup Inc.

Founded in 1997, immixGroup® is a fast-growing company and a recognized leader in the public sector technology marketplace. immixGroup delivers a unique combination of services for software and hardware manufacturers, their channel partners, and government agencies at the federal, state, and local levels. immixGroup is headquartered in McLean, Virginia, close to Washington, DC and near the epicenter of the government IT community.



Darpa Refocuses Hypersonics Research On Tactical Missions

By Graham Warwick

Source: Aviation Week & Space Technology

July 08, 2013


For the Pentagon’s advanced research agency, blazing a trail in hypersonics has proved problematic. Now a decade-long program to demonstrate technology for prompt global strike is being wound down, with some hard lessons learned but no flight-test successes.

In its place, the U.S. Defense Advanced Research Projects Agency (Darpa) plans to switch its focus to shorter, tactical ranges and launch a hypersonics “initiative” to include flight demonstrations of an air-breathing cruise missile and unpowered boost-glide weapon. If approved, the demos could be conducted jointly with the U.S. Air Force, which is eager to follow the success of its X-51A scramjet demonstrator with a high-speed strike weapon program.

Darpa’s original plan for its Integrated Hypersonics (IH) project was to begin with a third attempt to fly the Lockheed Martin Skunk Works-designed HTV-2 unmanned hypersonic glider, after the first two launches in 2010 and 2011 failed just minutes into their Mach 20 flights across the Pacific. This was to be followed by a more capable Hypersonic X-plane that would have pushed performance even further.

The original plan drew sharp criticism from Boeing executives, who viewed the proposed program as a thinly veiled excuse to fund a third flight of Lockheed’s dart-like HTV-2, which they consider unflyable. In laying out its revised program plan, Darpa makes no mention of any political lobbying against the HTV-2, but acknowledges a third flight would not make best use of its resources for hypersonic research.

Instead, as the Pentagon refocuses on China as a threat, Darpa is looking to work with the Air Force to demonstrate hypersonic weapons able to penetrate integrated air defenses and survive to strike targets swiftly, from a safe distance. Air-breathing and boost-glide weapons present challenges different to each other and to HTV-2, but the agency believes the lessons learned so far will prove valuable.

Key take-aways from HTV-2, says Darpa program manager Peter Erbland, include that the U.S. “has got kind of lean” in hypersonics competency as investment has declined from the heady days of the X-30 National Aero-Space Plane, and that “we have to be careful assuming our existing design paradigms are adequate” when developing a new class of hypersonic vehicles.

The HTV-2 sprang some surprises on its two failed flights, first with aerodynamics, then with hot structures. Working out what happened “required us to mine all the competency in hypersonics that we have,” he says, and took a team assembled from government, the services, NASA, the Missile Defense Agency, industry and academia.

Erbland says the decision not to fly a third HTV-2 was influenced by “the substantial knowledge gained from the first two flights in the areas of greatest technical risk: the first flight in aerodynamics and flight performance; the second in the high-temperature load-bearing aeroshell.” Another factor was the technical value of a third flight relative to its cost. A third was the value of investing resources in HTV-2 versus other hypersonic demonstrations. “We’ve learned a lot; what is the value of other flights?” he asks.

While the Air Force Research Laboratory had two successes in four flights of the Mach 5, scramjet-powered Boeing X-51A, Darpa’s two HTV-2 flops followed three failures of the Mach 6, ramjet-powered Boeing HyFly missile demonstrator. But as is often the case in engineering, more is learned from failure than from success, and investigation of the HTV-2 incidents will result in more robust hypersonic design tools that increase the likelihood of future success, Erbland argues.

To ensure all lessons are absorbed, work on the HTV-2 will continue through early next summer “to capture technology lessons from the second flight, and improve design tools and methods for high-temperature composite aeroshells,” he says. Information from the post-flight investigation will be combined with additional ground testing to improve the models used to design load-bearing thermal structures: “how they heat up, the material properties, their uncertainties and variables, and how we use modeling and simulation to predict thermal stresses and responses.”

HTV-2 was intended to glide an extended distance at hypersonic speed—roughly 3,000 nm. in 20 min.—and required a slender vehicle with high lift-to-drag (L/D) ratio and a carbon-carbon structure to fly for a prolonged time at high temperatures. While Flight 1 in April 2010 failed when adverse yaw exceeded the vehicle’s control power, Flight 2 in August 2011 failed when the aeroshell began to degrade, causing aerodynamic upsets that ultimately triggered the flight-termination system.
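The quoted mission profile, roughly 3,000 nm in 20 min., implies a very high average speed. A quick back-of-envelope check (the speed of sound at glide altitude is an assumed round figure, not from the article):

```python
# Sanity check on the HTV-2 glide profile quoted above:
# roughly 3,000 nm covered in 20 min.

NM_TO_M = 1852.0        # metres per nautical mile (exact)
A_HIGH_ALT = 295.0      # approx. speed of sound at glide altitude, m/s (assumed)

distance_m = 3000 * NM_TO_M
time_s = 20 * 60

v_avg = distance_m / time_s        # average ground speed, m/s
mach_avg = v_avg / A_HIGH_ALT      # rough average Mach number

print(round(v_avg), round(mach_avg, 1))   # 4630 m/s, about Mach 15.7
```

An average around Mach 15-16 is consistent with a vehicle that begins its glide near Mach 20 and decelerates along the way.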

“From the first flight it was clear our extrapolation of aero design methods was not adequate to predict behavior in flight,” says Erbland. “From the first to the second flights we redid the ground testing, and rebaselined the aero using new tools. On the second flight, the changes were completely effective, even in very adverse flight conditions.” But the modifications set up the HTV-2 for failure on the second flight.

“Changes to the trajectory made it a more severe aero-thermal environment than the first flight,” he says. “We have been able to reconstruct how it failed from the limited instrumentation, and the most probable cause is degradation of the structure. Thermal stresses led to failure.” While the vehicle retained its structural integrity, temperature gradients over small areas led to local material failures that caused the upsets.

“From the second flight, we learned a lesson on how to design refractory composites, to improve our understanding of how to model hot structures under thermal load,” says Erbland. “We learned a critical lesson about variability and uncertainty in material properties. That is why we are taking time to fund the remediation of our models to account for material and aero-thermal variability.”

HTV-2 is all that remains of the once-ambitious Falcon program (for Force Application and Launch from the Continental U.S.), started in 2003 with the goal of demonstrating technology for prompt global strike. Falcon had two elements, a hypersonic cruise vehicle (HCV) and a small launch vehicle (SLV) needed to boost the cruiser into a hypersonic glide. The SLV effort helped fund Space Exploration Technologies’ Falcon 1 booster, but the HCV went through several changes.

The original HTV-1 hypersonic test vehicle was abandoned in 2006 when the sharp-edged carbon-carbon aeroshell proved impossible to manufacture. Darpa and Lockheed proceeded with the easier-to-produce HTV-2, but then departed from the original unpowered HCV concept to propose an HTV-3X testbed, with turbojet/scramjet combined-cycle propulsion. Congress refused to fund the vehicle, dubbed Blackswift, and it was cancelled in 2008, leaving two HTV-2s as the remnants of Falcon.

Now Darpa is seeking to reinvent its hypersonics focus by moving away from the global-range mission to the tactical-range one. But while an air-breathing weapon can draw directly on the X-51, boost-glide over a 600-nm range is a different vehicle from the HTV-2. “To get the performance we need to look at high L/D with robust controllability. Thermal management is a different problem to HTV-2. We need robust energy management. And affordability,” says Erbland.

Boost-glide challenges include packaging a weapon for air and surface launch. “The mass and volume constraints are different. We had a very high fineness ratio for global strike; we will have to be very innovative to get high L/D without a high fineness ratio,” says Erbland. On the other hand, “trajectory insertion velocities are lower, and the booster problem could be more tractable. The problem with global range is that orbital launch systems with the energy needed are not designed to put a vehicle on an ideal start of glide, so we have to make them fly in ways they don’t want to,” he says.

But Darpa believes its HTV-2 experience will prove useful. “It provided critical technical knowledge to enable us to design a future boost-glide vehicle capable of prompt global strike. We made huge progress in understanding what we need to do in ground-test and flight-test to design the aerodynamics and hot structure,” Erbland says. “These are lessons we would not have learned without flight test, because of the limitations with ground test. We know going forward how to use modeling and simulation and ground test to give us more confidence that we can design a successful system.”


The State Of Broadband

Only by keeping pace with the latest in regulations, competition, and technology will companies rise above low-capacity, high-priced telecom networks.

By Jonathan Feldman, InformationWeek

July 10, 2013



We all remember the bad old days of having to load data onto removable media in order to ship it to the data center. After all, it would have taken days to transmit the necessary data over slow telecom links.


Problem is, the bad old days aren’t over. Instead of shipping tapes to data centers, organizations now regularly ship entire hard drives to cloud providers. Despite tremendous advances in line speeds, it still can take a week or more to transmit very large data sets, even if your line speed is 10 Mbps. Of course, companies don’t regularly need to transfer terabytes of data over the internet, but the current level of sneakernet that’s necessary for the transfer of even a few hundred gigabytes seems a bit high for the 21st century.
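The "week or more" figure is easy to verify with simple arithmetic. A minimal sketch, where the 80% efficiency factor is an assumption standing in for protocol and congestion overhead (the article gives no such figure):

```python
# Back-of-envelope check: how long does it take to push a large data set
# over a modest broadband link? Efficiency factor is an assumed figure.

def transfer_days(data_terabytes: float, line_mbps: float,
                  efficiency: float = 0.8) -> float:
    """Days to move data_terabytes over a line_mbps link."""
    bits = data_terabytes * 1e12 * 8                  # TB -> bits (decimal units)
    seconds = bits / (line_mbps * 1e6 * efficiency)   # usable throughput
    return seconds / 86400

# 1 TB over a 10 Mbps line: roughly a week and a half
print(round(transfer_days(1, 10), 1))
```

Even a few hundred gigabytes at 10 Mbps works out to several days, which is why shipping hard drives to cloud providers persists.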

The state of broadband matters to your organization. There’s been considerable consumer interest over the past several years, culminating in an FCC plan announced earlier this year to expand broadband coverage and speeds and promote competition. IT organizations can benefit by staying in touch with those regulatory issues, as well as taking advantage of new technology trends, such as wireless broadband, and partnering with alternative providers and municipal networks that buck the status quo. There are clearly risks in doing so, but taking no action almost guarantees that enterprise IT, with pockets of presence in rural and other nonurban areas, will continue to be held back by low-capacity, high-expense networks.

There are many reasons why the state of consumer broadband should matter to enterprise customers.


Problem With The Status Quo

In June, National Cable and Telecommunications Association CEO Kyle McSlarrow called America’s broadband deployment over the last 10 years “an unparalleled success story,” alluding to the rise of cable IP networks and faster and more extensive broadband in the consumer market. He’s right by some measures. Among the G7 countries, even though the U.S. is only No. 5 in broadband penetration (see chart on previous page), it’s been making headway. But when you look at average broadband prices worldwide, the U.S. doesn’t compare favorably–service in the United Kingdom, Sweden, France, Japan, Korea, Germany, and many other industrialized countries is cheaper, on average. And when you look at broadband subscribers per 100 inhabitants, the U.S. is ranked No. 22, slightly above the Organisation for Economic Co-operation and Development average but below the Scandinavian countries, Korea, Canada, France, the U.K., and others.

As with many things, where you stand depends upon where you sit. Tony Patti, CIO for S. Walter Packaging, a century-old manufacturing company in Philadelphia, says that even in the SOHO market, significant bandwidth is for sale relatively cheaply (see chart, below). “People always want more for less, but we’re at a remarkable and revolutionary time in the history of the convergence of computing and communications,” Patti says. But the two key questions are these: Are you in the provider’s service area; and if you are, does the actual speed match the advertised speed? In major markets, the answer is: probably. But talk to someone in smaller cities and rural America, and a different story emerges.

Kris Hoce, CEO of Pardee Hospital, a 200-bed facility in Hendersonville, N.C., says the hospital’s telecom lines are “stretched” today, and when the management team looks at tomorrow’s challenges, including telemedicine and telemetry, he gets even more concerned.

Until a second competitor, Morris Broadband, entered the market a year ago, the incumbent provider was Pardee’s only option. “You’ll take whatever capacity they give you, do it on their time schedule, and you’ll pay through the nose for it,” Hoce says. Since Morris Broadband’s entry, Pardee has realized a 10% to 15% reduction in telecom costs, though it can’t always get sufficient bandwidth, he says.


National Broadband Plan

The FCC’s 376-page National Broadband Plan, while a testament to the ability of federal bureaucracy to fill large amounts of paper, stands to benefit enterprise IT over the next few years in several areas, if the agency follows through.

First, the FCC says that it will be publishing market information on broadband pricing and competition. Will this be as useful as PriceWatch and eBay are in determining what you should pay? We’re not sure. But transparency itself should help: A market where all players know what everybody’s charging tends to be one where prices dip as low as possible.

Second, the FCC says it will make additional wireless spectrum available, and it will update its rules for backhaul spectrum. President Obama has thrown his weight behind this movement, directing the National Telecommunications and Information Administration–the folks behind the broadband stimulus–to help the FCC with a plan to make 500 MHz of spectrum available by the fourth quarter of this year.

It’s unclear what the licensing procedures will be, and for which portion of the additional spectrum. Our bet: some mix of unlicensed spectrum (like 2.4 GHz, a nightmare for IT departments that want to avoid interference), some fully licensed (like 800 MHz, whose paperwork can take months or years to get processed), and some “lightly licensed” (like the 3,650-MHz band that was allocated for WiMax in 2005, which requires two or more licensees in the same region to cooperate). When additional spectrum comes online, it should revitalize the market and create product innovations, which should make broadband wireless a bit less difficult for enterprises to deploy.

The FCC also plans to improve rights-of-way procedures. Power and other companies that own poles either have undocumented or onerous agreements for anyone wanting to attach to a pole or bridge. Streamlining and standardizing this process would be welcome news to telecom market entrants and user organizations that want to bypass the telecom providers. The unanswered question is, how will the FCC “encourage” rights-of-way owners to improve these procedures?

The National Broadband Plan also stipulates longer-term (within the next decade) goals, including that 100 million consumers are able to access affordable 100-Mbps actual download speeds, 50-Mbps upload–more than 10 times faster than what most U.S. consumers can now get. More interesting to enterprise IT, the plan outlines a goal of affordable access to 1-Gbps links for “anchor institutions”–hospitals, community centers, schools, and so on. As these institutions get affordable links, other large institutions, like big companies, will also get affordable high-speed links.

The FCC doesn’t always have the authority to say how these goals will be accomplished. But in the “implementation” chapter of the National Broadband Plan, it suggests who (including the FCC) should pursue them. For example, it recommends that the executive branch create a “broadband strategy council” consisting of advisers from the White House and its Office of Management and Budget, NTIA, FCC, and other agencies. The FCC also has committed to publishing an evaluation of its progress as part of its annual 706 report, named after section 706 of the Telecommunications Act of 1996. You can track 706 reports at


Emerging Competition

Simplifying and streamlining the status quo won’t be as quick as we want it to be, but the situation isn’t bleak.

True, many of the wireline highways are owned by the same folks that own the off-ramps and have a big interest in resisting competition (the likes of AT&T, Verizon, and Qwest from the telco sector and Comcast, Time-Warner, and Cablevision from cable TV). But competition is in fact emerging.


Players like Morris Broadband serve relatively small and rural areas, catering to customers the larger players simply won’t touch. CenturyLink, a larger player, serves rural customers in 33 states. PAETEC competes in 84 of the top 100 areas, known as “metropolitan service areas,” which are anything but rural. Then there are municipal broadband projects such as LUS Fiber, a fiber-to-the-home network started by the utility in Lafayette, La., that offers business services (10-Mbps symmetric) starting at $65 a month.

It’s hard to get information out of the incumbents–we tried, but folks like Verizon said that they don’t see how consumer broadband is related to serving enterprise customers. But the conventional wisdom is that they won’t serve an area unless they can get 25 potential customers per mile. Smaller players will look at areas with five or 10 potential customers per mile. Bottom line: Whenever competitors enter a market, prices fall. In a striking irony, the incumbents opposed to broadband regulation have lobbied local and state authorities to prevent broadband buildouts by municipal entities.

In addition to the wireline broadband alternatives, consider that the airwaves are wide open. Wireless ISPs like Clear and mobile phone and 3G data providers like T-Mobile and Verizon Wireless are interesting, but your bandwidth and reliability may vary when attempting to use their business-class SOHO service. That said, back in the day of the bag phone, nobody would rely on a cell phone for anything that was hugely important, but that didn’t keep IT organizations from playing with them in noncritical areas.

We’re also interested by the services offered by the likes of Texas-based ERF Wireless, which is completely focused on serving businesses, mainly banking and oil companies. ERF’s model: Customers invest in their own wireless infrastructure to backhaul to ERF’s network and then pay an ongoing port fee to access a secured backbone. CEO Dean Cubley says ERF’s banking customers pay about half of what they were paying to incumbent providers and have about a three-year payback on their capital investment.

Jacobson of North Carolina not-for-profit NCREN says the group’s successful BTOP round 1 application (awarded $28.2 million) came from efforts by the state’s office of economic recovery. It’s going to trickle up to the hospitals, too. “All the medical schools in the state are on NCREN today,” he says, and “the nonprofit hospitals will be eligible to interconnect to us as well.”


Welcome Back To Sneakerville

Some caution is necessary. There will be no shortage of poorly conceived broadband initiatives. Savvy IT organizations will stay close to operations, leaving the speculation to investors and economic development types.

Moving beyond sneakernet will require more than just fatter pipes. “Civil engineers discovered some time ago that building more lanes on highways does not really relieve traffic problems,” says Mark Butler, director of product marketing with Internet services company Internap. “Relief comes when you use the available capacity in a more efficient manner.”

So as you keep track of the legislation and other craziness coming out of Washington, keep pace with technical realities, lest you invest in higher-speed lines only to find that your use case isn’t quite as you had planned. George Bonser, a network operator with mobile messaging provider Seven, cites cases of companies that install high-speed lines and then discover they can’t get anywhere near their theoretical limit because of the software in use. It’s a complicated matter that deserves your attention in the same way that keeping track of broadband competition, accessibility, and fairness does.





NIST seeks input on cybersecurity framework

Upcoming Cybersecurity Framework workshop this week aims for feedback from private sector on practices that can reduce the risk of cyber attacks


Cynthia Brumfield, CSO

July 09, 2013

Starting tomorrow, July 10th, in San Diego, the National Institute of Standards and Technology (NIST) will host the third, and perhaps most important, in a series of workshops aimed at developing a voluntary comprehensive cybersecurity framework that will apply across sixteen critical infrastructure sectors.

Mandated by an Executive Order (EO) issued by President Obama on February 12, 2013, the NIST-developed framework represents the first time the federal government has sought to prescribe a wide-ranging approach to protecting critical cyber assets, a tough task that Department of Homeland Security (DHS) Secretary Janet Napolitano has characterized as an “experiment.” The framework must be completed in preliminary form by October and finalized by February 2014.

During the San Diego workshop, NIST will for the first time delve into details of the emerging framework, which is based on two earlier workshops as well as formal comments NIST received in response to a public notice. To speed things along ahead of the workshop, NIST has issued three reference materials — a draft outline of what the framework might look like, a draft framework “core” that focuses on key organizational functions and a draft compendium that features existing references, guidelines, standards and practices.

Based on the recommendations of industry commenters, NIST has placed a large emphasis in the draft framework on reaching the very senior levels of management, including CEOs and boards of directors. Top “officials are best positioned to define and express accountability and responsibility, and to combine threat and vulnerability information with the potential impact to business needs and operational capabilities,” NIST states in the draft outline.

This focus on top executives has not surprisingly been praised by industry participants.

“Cybersecurity is not just a technological problem,” said Jack Whitsitt, principal analyst at energy-industry cybersecurity consortium EnergySec. “This is a business management, business maturity problem. People build what you tell them to build, people build what you fund them to build. Unless we do a better job at the business side of cybersecurity, the problems won’t go away.”

Many cybersecurity experts say that reaching that top level of management is one of the biggest challenges to ensuring adequate cybersecurity protection of critical assets. CEOs, they say, typically engage in “cybersecurity theater,” implementing hollow programs that only pay lip service to the issues.

“The reality is that most of the CEOs are relying on their trade organizations to ‘fix the problem’ for them,” one top cybersecurity consultant said. “And the trade organizations are one of the loudest voices in the echo chamber convincing themselves that this is all just a bunch of low-probability hype and a stepping stone to more regulation.”


Another challenge, at least so far as a federal framework is concerned, is the division of responsibilities among government agencies as spelled out in the EO and accompanying Presidential Policy Directive (PPD). For example, DHS has been assigned a number of tasks under the EO that seem to relate to the framework, such as defining what constitutes critical infrastructure.

Some asset owners have suggested that there are too many moving parts in the overall cybersecurity landscape and have noted rising tensions between NIST, an arm of the Commerce Department, and DHS.


“NIST and DHS aren’t doing a good job in deciding how this is going to work,” one expert noted.


But one senior government official overseeing the process said that many cybersecurity efforts in the EO and PPD just aren’t relevant to how the framework gets developed.


“The framework is supposed to work for the widest range of industries” and therefore it doesn’t matter how critical infrastructure gets defined, for example.


“DHS is making the decision that has no bearing on this framework,” he said, adding that it is likely that the list of critical infrastructure assets won’t be made public anyway.


Yet another challenge is the degree to which the framework process is being shaped by technology vendors and consultants, who far outnumber asset owners in the workshop meetings held to date. Although NIST wants to bake in cybersecurity through vendor-supplied technology, thereby ensuring that even small organizations that lack the resources to pay cybersecurity specialists are guaranteed basic protection, some asset owners balk at being force-fed technology that may better fit vendor agendas than their own safety. One telecom cybersecurity specialist said he wished that NIST would separate asset owners from vendors and consultants in the workshop sessions.


Despite these challenges, most of the participants in the process believe that NIST is on track and that the draft framework materials released for the July workshop meet expectations. However, the real action will take place at the workshop as NIST goes into greater detail on where it is headed with the framework.


With only about three months left to meet the October deadline, most of the key players are taking a wait-and-see attitude, hoping to gain a better sense of the situation after the workshop in San Diego. As one telecom industry representative said, “we have to see whether this whole process has the result we’re looking for, which is to improve our cybersecurity posture, and not some feel-good government exercise.”


Cynthia Brumfield, President of DCT Associates, is a veteran communications industry and technology analyst. She is currently leading a variety of research, analysis, consulting and publishing initiatives, with a particular focus on cybersecurity issues in the energy and telecom arenas.



North Dakota company specializes in aerial crop imagery


by Press • 9 July 2013

By: Jonathan Knutson


GRAND FORKS, N.D. — When David Dvorak launched Field of View in 2010, he foresaw a bright future for aerial crop imagery. Today, after working with farmers, agronomists and even a South American plantation manager, he’s more optimistic than ever.

“A few years ago, there was some behind-the-scenes interest in this,” says Dvorak, CEO of Grand Forks, N.D.-based Field of View.

Now, “I’m quietly confident there’s this perfect storm brewing where the precision agriculture market really takes off and the civil UAS (unmanned aircraft system) market takes off. They’re both on a trajectory to make that happen about the same time,” he says.

Field of View’s mission is to “bridge the gap between unmanned aircraft and precision agriculture,” according to the company’s website.

Its flagship product, GeoSnap, is an add-on device for multispectral cameras mounted on either manned or unmanned aircraft. Such cameras capture images in the red, green and near-infrared bands, allowing users to visualize plant stress better than they can with most other camera systems, Dvorak says.

GeoSnap takes images captured by the multispectral camera and maps them with real-world coordinates, a process known as georeferencing. That allows users to know the aerial images’ exact location on the ground.
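To make the georeferencing idea concrete, here is a deliberately simplified sketch. It assumes a nadir-pointing camera over flat ground and a small-angle conversion from metres to degrees; a real system such as GeoSnap must also account for aircraft attitude (roll, pitch, yaw) and lens distortion. All function names and numbers below are illustrative, not from the company.

```python
import math

def pixel_to_ground(px, py, img_w, img_h, alt_m, fov_deg, lat0, lon0):
    """Map an image pixel to approximate ground coordinates.

    Assumes a nadir-pointing camera over flat terrain; lat0/lon0 is the
    camera's position, alt_m its altitude, fov_deg its horizontal field of view.
    """
    # Ground footprint width from altitude and field of view
    footprint_w = 2 * alt_m * math.tan(math.radians(fov_deg) / 2)
    m_per_px = footprint_w / img_w

    # Offsets from the image centre, in metres (x east, y north)
    dx = (px - img_w / 2) * m_per_px
    dy = (img_h / 2 - py) * m_per_px

    # Metres -> degrees, small-angle approximation
    lat = lat0 + dy / 111_320
    lon = lon0 + dx / (111_320 * math.cos(math.radians(lat0)))
    return lat, lon

# The centre pixel maps straight back down to the aircraft's position
print(pixel_to_ground(640, 360, 1280, 720, 120, 60, 47.9, -97.0))
```

The hard parts Dvorak alludes to are exactly what this sketch leaves out: precise timing between camera trigger and GPS fix, and correcting for the aircraft's changing orientation.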

“It’s a very complex process. We developed a product that hopefully makes the process easier,” Dvorak says.

GeoSnap costs about $5,000 per unit, with the multispectral cameras costing about $4,000 each.

Field of View only recently began selling the add-on devices. So far, the company has sold a half-dozen, including one to NASA.

Dvorak thinks NASA will use the GeoSnap to learn more about vegetative cover on Earth, though he isn’t sure of specifics.

GeoSnap generally has drawn more interest overseas because other countries have fewer restrictions on air space, he says.



Hagel warns senators of 2014 budget dangers

By Amber Corrin

Jul 10, 2013

In a July 10 letter to lawmakers on the Senate Armed Services Committee, Defense Secretary Chuck Hagel warned of potentially dire threats to national security if Congress fails to reverse steep budget cuts for the 2014 fiscal year.

Hagel advised lawmakers that a potential $52 billion budget cut for fiscal 2014, which would be mandated under sequester spending caps imposed by the 2011 Budget Control Act, would continue to erode military readiness and weaken national defenses.

“I strongly oppose cuts of that magnitude because, if they remain in place for FY 2014 and beyond, the size, readiness and technological superiority of our military will be reduced, placing at much greater risk the country’s ability to meet our current national security commitments,” Hagel wrote to Sens. Carl Levin and James Inhofe, the committee’s chairman and ranking member, respectively. “This outcome is unacceptable as it would limit the country’s options in the event of a major new national security contingency.”

The secretary warned that “draconian actions” would be necessary to meet the budget-cut requirements. His comments stem from findings in the Strategic Choices and Management Review he directed earlier this year.

Such moves could include ongoing hiring freezes and layoffs as Defense Department officials seek to avert a second year of furloughs. Cutbacks in training and readiness could continue, and investments in areas such as research and development would also decline. DOD’s sustained efforts in acquisition reform additionally would take a hit, he said.

“The department hopes to avoid a second year of furloughs of civilian personnel, but DOD will have to consider involuntary reductions in force to reduce civilian personnel costs,” Hagel wrote. “The resulting slowdown in modernization would reduce our long-term, critically important and historic technological superiority and undermine our better buying power initiatives.”

Hagel called on members of Congress to cooperate with the Pentagon, the White House and each other to help mitigate what he deemed to be serious adverse consequences. He urged congressional support for controversial measures proposed by President Barack Obama in his 2014 budget, including slowed growth in military pay raises, increased TRICARE fees and the retirement or cancelation of lower-priority weapons programs.

Hagel also asked Congress to eliminate restrictions on military drawdown timelines and firing practices to reduce poor-performing civilian personnel, and reiterated his push for another round of the Base Realignment and Closure Act.

Training and modernization remain poised to take the biggest hits in the 10 percent across-the-board cuts. Cutbacks in training programs already in place under this year’s sequestration would have to continue or be accelerated, putting troops and citizens at greater risk, Hagel wrote. New programs would be hard-hit as well.

“DOD would be forced to sharply reduce funding for procurement, [research, development, testing and evaluation] and military construction. Indeed, cuts of 15 percent to 20 percent might well be necessary,” Hagel said. “Marked cuts in investment funding, especially if they continue for several years, would slow future technology improvements and may erode the technological superiority enjoyed by U.S. forces.”

He also warned that cuts would spill over into private industry as purchases and acquisition plans would be interrupted and costs increased.

“Defense industry jobs would be lost and, as prime contractors pull back and work to protect their internal work forces, small businesses may experience disproportionately large job losses,” Hagel wrote.


Sequestration Would Force Civilian Personnel Cuts in 2014, Hagel Says

By Eric Katz

July 11, 2013

The Defense Department is considering civilian reductions in force in fiscal 2014 to match reduced budget levels required by sequestration.

In a letter to the Senate Armed Services Committee, Defense Secretary Chuck Hagel said that while he is “fully committed” to enacting President Obama’s budget, he is preparing a “contingency plan” in case sequestration remains in effect.

“DoD is hoping to avoid furloughs of civilian personnel in fiscal year 2014,” Hagel wrote, “but the department might have to consider mandatory reductions in force.”

Hagel added the RIFs do not offer much in the way of immediate savings, but would help the department reach future budget caps. The Pentagon would have to slash $52 billion from its budget next year if Congress fails to strike a deal to end sequestration.

“While painful,” Hagel wrote, “RIFs would permit DoD to make targeted cuts in civilian personnel levels rather than the more across-the-board cuts associated with furloughs.”

Military personnel would fare better, as their funding cuts would be “disproportionately small” due to separation costs. If Congress moves forward with its plan to raise military pay 1.8 percent — rather than the 1 percent Obama called for — implementing sequester cuts would be even more difficult, Hagel said.

The Defense Department could severely trim military personnel, but it would require halting accessions, ending permanent-change-of-station moves, stopping discretionary bonuses and freezing promotions. As the Pentagon has repeatedly emphasized, continued cuts would also negatively affect maintenance, modernization and readiness.

“In sum,” Hagel said, “the abrupt, deep cuts caused by the [2011 Budget Control Act] caps in FY 2014 will force DoD to make non-strategic changes. If the cuts continue, the department will have to make sharp cuts with far reaching consequences, including limiting combat power, reducing readiness and undermining the national security interests of the United States.” 


What I learned from researching almost every single smart watch that has been rumored or announced


By Christopher Mims

July 11, 2013

Smart watches! I sure hope you like them, because literally everyone is developing one. And yet, given the vanishingly small proportion of watches that are “smart,” clearly, something is holding them back. Here are the trends that jumped out when I undertook a more or less comprehensive catalog of the forthcoming wrist-top wearables.

Smart watches are going to be big. As in physically large.

I hope you have man hands, because the average smart watch is going to have a 1.5″ display and look like one of those oversize G-shock watches that are favored by IT support guys and gym coaches. Some smart watches are actually just smartphones with a wrist band, and therefore truly gigantic.

Insufficient battery life is killing the smart watch dream.

This chart is old, but it illustrates a trend that continues to this day. (I asked the man who created it for an update, and he says none exists.) The bottom line: Moore’s law does not apply to batteries. That is, every year, we get more processing power per watt of electricity we put into a microprocessor, but battery technology is not proceeding at the same pace.

That’s a problem for a device that needs to be connected to a smartphone (via bluetooth) and/or a cell phone network. Those radios will kill your battery. (Incidentally, turning them off is the single best way to preserve the battery life of your smartphone.) And the color, back-lit, 1.5″ LCD display mentioned above? It’s not doing your smart watch battery any favors, either.

The result of all this is smart watches with only three to four days of battery life, and even that is likely to shrink as developers find new ways to make smart watches useful (and therefore force them to use their radios and update their displays more often).

Some manufacturers are talking about adding things like inductive (i.e. wireless) charging to their smart watches. That will add bulk, but dropping your watch on a charging pad every night might be way less annoying than remembering to plug it in alongside your smartphone.

Smart watches are going to come with a variety of intriguing display technologies not seen elsewhere.

Nothing begets creativity like constraints, and given the battery issues outlined above, some makers of smart watches are turning to, or have already resorted to, display technologies that require less power than traditional LCDs.

Qualcomm’s rumored smart watch, for example, supposedly uses Mirasol, a kind of reflective, full-color display that requires no power unless it’s being updated. (Mirasol displays color by refracting light like a butterfly’s wings, rather than emitting actual red, green and blue light, like an LCD.) The Pebble smart watch uses an e-paper display like that found in the Kindle and many other e-readers. And the Agent smart watch, which just raised a million dollars on Kickstarter, uses a black and white “memory LCD” produced by Sharp, which unveiled the technology in 2008 and has been trying to find a suitably mass-market use ever since.

All of the non-LCD displays represent a compromise of some kind, when compared to the bright, extra-sharp LCD displays we’ve become accustomed to on our smartphones. This will make smart watches less a “second screen” than a place to push updates like Facebook alerts and text messages. If that sounds less useful than, say, a little smartphone, well that’s one more reason smart watches have yet to take off.

Smart watches could be the next netbooks—in other words, a huge flop.

Samsung, Apple, Google, Microsoft, LG, Qualcomm, Sony—they’re all pouring money into smart watches, but so far every indication is that the devices they’re working on are at best their take on the existing smart watch concept, which frankly isn’t all that compelling. But every consumer electronics manufacturer is looking for the next iPhone or tablet, anything to stop the red ink in their PC divisions.

Or smart watches could allow for the kind of unobtrusive, always-on computing that is the promise of Google Glass.



The same constraints that are forcing smart watch designers to get creative with their displays are also forcing them to come up with something better for these things to do than save you the three seconds it takes to get your phone out and read a text message. For example, the wrist is a logical place to put the kind of RFID chips that allow “digital wallets”—just touch your watch to the payment pad, and you’re done. Or maybe your watch helps you not to forget your keys, wallet and anything else that’s critical, as you run out the door. Or even, maybe your smart watch makes it less likely you’ll be shot with your own gun. The possibilities are endless, and that’s probably what keeps backers coming back to smart watch projects on Kickstarter. Whether or not the mega-corporations rolling them out will find ways to answer these needs with their mass market products remains to be seen.

Demand for laptops is so weak that analysts have declared all of 2013 a “write-off”


By Christopher Mims @mims

July 10, 2013

Demand for laptop computers is so weak in the first half of 2013 that the analysts at IHS iSuppli have declared it virtually impossible that the overall market for laptop and desktop PCs will grow in 2013 over 2012. It’s the same death-of-the-PC story we’ve heard before, only now the infection has spread to laptops as well. The numbers:

  • 6.9% drop in laptop shipments between the first and second quarters of 2013. That’s nearly twice the 3.7% drop seen in 2002 after the dot-com bust.
  • Compare that to a 41.7% increase in laptop shipments from Q1 to Q2 of 2010. Typically, the second quarter of the year sees a sharp uptick in purchases of notebook computers, a bounce-back after soft demand in the beginning of the year.
  • 2013 will be the second year in a row in which PC shipments shrank overall. Unless a miracle happens in the second half of 2013, the PC industry is going to have to face the fact that its decade of expansion, from 2001 to 2011, is over.
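The quarter-over-quarter swings cited above can be sanity-checked with a quick calculation. The helper below is purely illustrative; the index values are hypothetical numbers chosen so the change matches the article's 6.9% figure:

```python
def pct_change(old: float, new: float) -> float:
    """Percentage change from old to new."""
    return (new - old) / old * 100.0

# Hypothetical shipment index values: a Q1 baseline of 100
# falling to 93.1 reproduces the 6.9% quarterly decline cited.
q1_2013, q2_2013 = 100.0, 93.1
print(round(pct_change(q1_2013, q2_2013), 1))  # -6.9
```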

The culprit in all of this? “Media tablets,” says iSuppli. And those are only becoming more versatile at the high end, more affordable at the low end, and more popular overall. Given those trends, could 2014 be the third year in a row that PC sales decline? It would be unprecedented, but manufacturers can’t rule it out.



Report: Use of coal to generate power rises

Miami Herald

Tribune Washington Bureau

Posted on Wed, Jul. 10, 2013

Power plants in the United States are burning coal more often to generate electricity, reversing the growing use of natural gas and threatening to increase domestic emissions of greenhouse gases after a period of decline, according to a federal report.

Coal’s share of total domestic power generation in the first four months of 2013 averaged 39.5 percent, compared with 35.4 percent during the same period last year, according to the Energy Information Administration, the analytical branch of the Energy Department.

By contrast, natural gas generation averaged about 25.8 percent this year, compared with 29.5 percent a year earlier, the agency said in its most recent “Short-Term Energy Outlook.”

With coal prices dropping and gas prices rising, the agency said it expected the use of coal to remain on the upswing, accounting for 40.1 percent of electricity generation through 2014. Natural gas would fuel about 27.3 percent.

Power plants are the single largest source of greenhouse gases that drive climate change. The growing use of coal is occurring against the backdrop of President Barack Obama’s announcement of a sweeping plan to reduce greenhouse gases, including curtailing emissions from power plants. His initiative has already sparked opposition from the coal industry, congressional Republicans and coal-state politicians.

Opponents say new regulations are unnecessary in part because utilities have relied more on natural gas, which emits less heat-trapping carbon dioxide than coal does. But the new data indicate that power plants will readily return to coal if the price of natural gas gets too high.

“Markets on their own may go in your direction for a period of time, but to ensure that we get reductions in greenhouse gas emissions in a significant, sustained way, you’re going to need government intervention,” said James Bradbury of the World Resources Institute, a Washington think tank.

The energy administration estimated that carbon dioxide emissions from fossil fuels would rise by 2.4 percent in 2013 and 0.6 percent in 2014, after falling about 3.9 percent in 2012.

“The increase in emissions over the forecast period primarily reflects the projected increase in coal use for electricity generation, especially in 2013 as it rebounds from the 2012 decline,” the report said.

In a speech last month, Obama directed the Environmental Protection Agency to propose rules by June 2014 to cut greenhouse gas emissions from power plants. A rule for new power plants is expected by September.

Coal-fired generation accounted for about 50 percent of the electricity produced in the U.S. about a decade ago. But a natural gas boom driven by hydraulic fracturing has pushed down prices, making natural gas more competitive with coal. By April of last year, coal and natural gas each produced about one-third of the country’s power.

Lower demand for coal drove down its average price, said Elias Johnson, a coal industry expert for the agency. At the same time, the price of natural gas ticked upward, buoyed by demand and somewhat reduced production.

Utilities, many of which have natural gas and coal plants, will probably toggle between the two fuels in the near term, burning the cheaper one more often.

“What is the least expensive form of generation gets dispatched first: renewables, hydro, then maybe nuclear and then coal or natural gas,” said Karen Obenshain of the Edison Electric Institute, a utility trade group in Washington.
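The dispatch order Obenshain describes can be sketched as a simple merit-order sort: cheapest marginal cost runs first. This is an illustrative toy with made-up costs, not a real grid model:

```python
# Toy merit-order dispatch. Marginal costs are invented
# illustrative $/MWh figures, not real market data.
sources = {
    "renewables": 0.0,
    "hydro": 5.0,
    "nuclear": 10.0,
    "natural gas": 30.0,
    "coal": 35.0,
}

# Sort generation sources by marginal cost, ascending.
merit_order = sorted(sources, key=sources.get)
print(merit_order)
# Renewables and hydro dispatch first; the coal-vs-gas order
# flips whenever their relative fuel prices cross.
```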

Coal is not expected to grab a 50 percent share of power generation again because new regulations curtailing mercury emissions from power plants will probably shutter many small, older coal plants, said Mark McCullough of American Electric Power, one of the country’s largest coal-fired utilities. Even with such closures, the U.S. will probably fail to sharply reduce greenhouse gas emissions by 2020, a goal set by Obama in 2009, without a comprehensive effort to address carbon dioxide pollution.

Said Bradbury, “Electricity markets are very dynamic, and while there’s been a lot of press about the success story of the benefits of natural gas, it’s important to realize that that’s temporary and it depends on gas prices staying really low, and we’re starting to see there are these thresholds where utilities will switch back to higher-carbon fuel, like coal.”


Does Wearable Tech Have A Place In The Enterprise?

Posted by Dan Swinhoe

on July 04 2013

This week saw the first Pebble smartwatches go on sale online. Sony, Acer, Google, Apple, Foxconn and Samsung are all working on their own versions. The era of wearable tech is within sight.

According to Juniper Research, almost 70 million smart wearable devices will be sold in 2017, and the market will be worth more than $1.5 billion by 2014. ST Liew, president of Acer’s smartphone group, told Pocket-Lint, “We are looking at wearable, I think every consumer company should be looking at wearable.” While that might be true, should enterprises be doing the same?

Right now wearable tech is mostly for sporty types; heart rate monitors, fancy pedometers, HUD for skiers etc. But soon the market will be flooded with a tidal wave of smartwatches and Google Glass. And while this will no doubt affect how companies collect user data, develop apps and interact with consumers, will we be seeing workers around the office or datacenter wearing them?

Rose-Tinted Google Glass?

Smartwatches probably won’t be essential to any enterprise mobility program, merely a notification tool with additional security pains to account for. But despite being banned in many places before it’s even released, Google Glass is getting plenty of people excited.

So far most of it has been on the consumer side of things. Some doubt whether it could ever be used for the enterprise, while others think it’s the best thing since sliced bread (or the Cloud at least). Chris Hazelton of 451 Research told Computerworld it would be the next step in Mobility & BYOD trends, which would eventually help drive its acceptance.

Fiberlink has jumped on board early, offering its MaaS360 platform to IT admins through the device, and says that since most EMM and MDM platforms already support Android, much of the hard work is done. Meanwhile Dito, a company that provides services for Google applications, has promised enterprise apps for Glass (AKA Glassware) by late 2013/early 2014. The company’s co-founder, Dan McNelis, explained at the E2 conference that one of its clients was looking at building information modeling, or BIM, applications, which could help construction workers on site check schematics and confirm that everything was in the right place and at the right angle.

Along with construction, Glass has been cited as a hands-free tool for utility workers while dealing with high voltage, or as a potential HUD for pilots, and possibly even real-time polling.

Though facial recognition might be banned, the core concept behind MedRef, an early Glassware app that brings up a person’s medical records instantly, highlights the potential wearable gear has within the healthcare industry. Whether it’s tracking nurses with RTLS (Real-Time Location Systems), enabling better diagnosis and delivery methods, or even streaming live from the operating table, hospitals could be wearable tech’s first home outside the sports ground.

It’s not just glasses and watches that are entering the enterprise. A smart bracelet has been developed for workers at risk of being kidnapped, sending pre-set warnings to social media and to other workers in the area, while Motorola has developed heavy-duty, customizable engineering gear tailored to industrial needs. Chinese company Geak has developed a new smartring with great potential as a security/authentication tool. I can see far more of a market for specially tailored wearable tech arising once the bluster over Glass and smartwatches has died down.

So does wearable tech have a place in business, or is it just another consumer procrastination device? I think some of it does, especially if it’s been custom-made for the purpose. But I doubt we’ll be seeing an office full of smart this and wearable that. The future success of the likes of Google Glass or any number of future smartwatches will depend entirely on the quality of the hardware and apps provided, and the imagination of those using them.

 I also agree with Hazelton’s view that BYOWD (Bring-Your-Own-Wearable-Device) will be an important factor.


Quinoa should be taking over the world. This is why it isn’t.

Washington Post

By Lydia DePillis, Updated: July 11, 2013


In the Andean highlands of Bolivia and Peru, the broom-like, purple-flowered goosefoot plant is spreading over the barren hillsides–further and further every spring. When it’s dried, threshed, and processed through special machines, the plant yields a golden stream of seeds called quinoa, a protein-rich foodstuff that’s been a staple of poor communities here for millennia. Now, quinoa exports have brought cash raining down on the dry land, which farmers have converted into new clothes, richer diets, and shiny vehicles.

But at the moment, the Andeans aren’t supplying enough of the ancient grain. A few thousand miles north, at a downtown Washington D.C. outlet of the fast-casual Freshii chain one recent evening, a sign delivered unpleasant news: “As a result of issues beyond Freshii’s control, Quinoa is not available.” Strong worldwide demand, the sign explained, had led to a shortage. A Freshii spokeswoman said that prices had suddenly spiked, and the company gave franchises the choice to either eat the cost or pull the ingredient while they renegotiated their contract.

Quinoa is a low-calorie, gluten-free, high-protein grain that tastes great. Its popularity has exploded in the last several years, particularly among affluent, health-conscious Americans. But the kinks that kept the grain out of Freshii that day are emblematic of the hurdles it will face to becoming a truly widespread global commodity and a major part of Americans’ diet. It shows the crucial role of global agribusiness, big-ticket infrastructure investment, and trade in bringing us the things we eat, whether we like it or not.

In short, it’s hard to keep something on the menu if you might not be able to afford it the next day. And the American agricultural economy makes it hard for a new product to reach the kind of steady prices and day-in-day-out supply that it takes to make it big.


A grain whose time has come

Quinoa went extinct in the United States long before upscale lunch places started putting it in side salads. Agronomists have found evidence of its cultivation in the Mississippi Valley dating back to the first millennium AD, but it faded away after farmers opted for higher-yielding corn, squash, and bean crops.


Enthusiasts started growing quinoa again in the 1980s, mostly in the mountains of Colorado. It’s not easy, though–sometimes it takes several seasons to get any harvest, since seeds can crack, get overtaken by weeds, or die off because of excessive heat or cold. In 2012, the U.S. accounted for a negligible amount of the 200 million pounds produced worldwide, with more than 90 percent coming from Bolivia and Peru.

Demand started to ramp up in 2007, when Customs data show that the U.S. imported 7.3 million pounds of quinoa. Costco, Trader Joe’s, and Whole Foods began carrying the seed soon after, and the U.S. bought 57.6 million pounds in 2012, with 2013 imports projected at 68 million pounds. And yet, prices are skyrocketing; they tripled between 2006 and 2011, and now hover between $4.50 and $8 per pound on the shelf.
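The import figures above imply a striking average annual growth rate. A rough back-of-the-envelope sketch, using only the numbers cited in the article (the code itself is illustrative):

```python
# U.S. quinoa imports, in millions of pounds (figures from the article)
imports_2007 = 7.3
imports_2012 = 57.6
years = 2012 - 2007

# Compound annual growth rate: (end / start) ** (1 / years) - 1
cagr = (imports_2012 / imports_2007) ** (1 / years) - 1
print(f"{cagr:.1%}")  # roughly 51% per year
```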

What’s driving the increase? Part of it is that Peru itself, already the world’s biggest consumer of quinoa, patriotically started including the stuff in school lunch subsidies and maternal welfare programs. Then there’s the United Nations, which declared 2013 the International Year of Quinoa, partly in order to raise awareness of the crop beyond its traditional roots.

But it’s also about the demographics of the end-user in developed countries–the kind of people who don’t think twice about paying five bucks for a little box of something with such good-for-you buzz. A few blocks away from Freshii in Washington D.C. is the Protein Bar, a four-year-old Chicago-based chain that uses between 75 and 100 pounds of quinoa per week in its stores for salads and bowls that run from $6 to $10 each (Their slogan: “We do healthy…healthier”).

The company has so far decided to absorb the higher prices, which still aren’t as much of a cost factor as beef and chicken. It will even pay a little extra to ship the good stuff from South America, rather than the grainier variety that Canada has developed.

“As much as I don’t like it–you never want to pay more for your raw materials–it’s central to our menu,” says CEO Matt Matros. “I’m pretty positive that as the world catches on to what a great product it is, the supply will go up and the price will come back down. It’ll come down to the best product for us. If we find that the American quinoa is as fluffy, then we’ll definitely make the switch.”

Cracking the quinoa code

The Andean smallholders are trying to keep up with the demand. They’ve put more and more land into quinoa in recent years; Bolivia had 400 square miles under cultivation last year, up from 240 in 2009. The arid, cool land that quinoa needs was plentiful, since little else could grow there. And thus far, that trait has made it difficult to grow elsewhere.

But that doesn’t mean the rest of the world isn’t trying. A Peruvian university has developed a variety that will grow in coastal climates. There are also promising breeding programs in Argentina, Ecuador, Denmark, Chile, and Pakistan. Washington State University has been developing varieties for cultivation in the Pacific Northwest, and in August will hold a quinoa symposium bringing together researchers from all over to talk about their work.


“To me, the imagination is the limit, and a whole lot of effort,” says Rick Jellen, chair of the plant and wildlife sciences department at Brigham Young University. “Quinoa is a plant that produces a tremendous amount of seed. So you have potential, with intensive selection, to identify variants that have unusual characteristics.”

The South American quinoa industry, and the importers who care about it, are worried about the coming worldwide explosion of their native crop. Despite a bubble of media coverage earlier this year about how strong demand is making it difficult for Bolivians to afford to eat what they grow, the boom has also lifted incomes from about $35 per family per month to about $220, raising standards of living dramatically. Now, the worry is maintaining a steady income level when production takes off around the world.

Sergio Nunez de Arco, a native Bolivian who in 2004 helped found an import company called Andean Naturals in California, likes to show the small-scale farmers he buys from pictures of quinoa trucks in Canada to prove that the rest of the world is gaining on them, and that they need to invest in better equipment. Meanwhile, he’s trying to develop awareness about the importance of quinoa to reducing poverty, so that they can charge a fair trade price when the quinoa glut comes.

“The market has this natural tendency to commoditize things. There’s no longer a face, a place, it’s just quinoa,” de Arco says. “We’re at this inflection point where we want people to know where their quinoa is coming from, and the consumer actually is willing to pay them a little more so they do put their kids through school.”

He’s even helping a couple of Bolivian farmers who don’t speak English very well fly to that Washington State University conference, so they’ll at least be represented.

“It kind of hurts that the guys who’ve been doing this for 4,000 years aren’t even present,” de Arco says. “‘You guys are awesome, but your stuff is antiquated, so move over, a new age of quinoa is coming.’”

Why isn’t the U.S. growing more of it?

So far, though, the mystery is why the new age of quinoa is taking so long to arrive.

Americans have been aware of the crop for decades, and used to produce 37 percent of the world supply, according to former Colorado state agronomist Duane Johnson. It never took off, partly because of pressure from advocates of indigenous farmers–in the 1990s, Colorado State University researchers received a patent on a quinoa variety, but dropped it after Bolivian producers protested it would destroy their livelihoods.

You don’t need a patent to grow a crop, of course. But the switching cost is extremely high, says Cynthia Harriman of the Whole Grains Council. “Can you get a loan from your bank, when the loan officer knows nothing about quinoa? Will he or she say, ‘stick to soybeans or corn?’” It even requires different kinds of transportation equipment. “If you grow quinoa up in the high Rockies, where are the rail cars that can haul away your crop? Or the roads suitable for large trucks?”

All that infrastructure costs money, and the only farmers with lots of money are in industrial agribusiness. But U.S. industry has shown little interest in developing the ancient grain. Kellogg uses quinoa in one granola bar, and PepsiCo’s Quaker Oats owns a quinoa brand, but the biggest grain processors–Cargill and Archer Daniels Midland–say they’ve got no plans to start sourcing it. Monsanto, the world’s largest seed producer, has nothing either.

Instead, their research and development dollars are focused entirely on developing newer, more pest-resistant forms of corn, soybeans, wheat, sugar, and other staples. All of those crops have their own corporate lobbying associations, government subsidy programs, and academic departments devoted to maintaining production and consumption. Against that, a few researchers and independent farmers trying to increase quinoa supply don’t have much of a chance.

“This is something where it would truly have to come from the demand side–no one wants to get into this and get stuck with all this excess inventory,” says Marc Bellemare, an agricultural economist at Duke University. And how do you determine how much demand is enough, or whether a fad has staying power? “We still haven’t fully unbundled what the decision bundle is. It’s like shining a flashlight in a big dark room.”

That’s why it’s hard for any new crop to make the transition from niche to mainstream. Products, maybe: Soy milk is ubiquitous now, after years as a marginal hippie thing, but it comes from a plant that U.S. farmers have grown for decades. An entirely new species is something else altogether. “I wouldn’t even go so far as to say that’s a non-staple that went big-time,” Bellemare says.

For that reason, quinoa prices are likely to remain volatile for a long while yet. Brigham Young’s Rick Jellen says the lack of research funding for quinoa–relative to the other large crop programs–means that even if they come up with a more versatile strain, it won’t have the resilience to survive an infestation.

“Once that production moves down to a more benign environment, you’re going to get three or four years of very good production,” he predicts. “And then you’re going to hit a wall, you’re going to have a pest come in, and it’s going to wreak havoc on the crop. I think we’re going to see big fluctuations in quinoa prices until someone with money has the vision and is willing to take the risk to invest to really start a long-term breeding program for the crop.”

Which means that if you’re looking forward to a quinoa lunch in downtown D.C., be prepared for a disappointment.


Defcon founder’s message to feds fair to some, hypocritical to others

Dis-invitation is interesting because last year Defcon opened with General Keith Alexander, director of the National Security Agency

Jaikumar Vijayan

July 12, 2013 (Computerworld)

Defcon founder Jeff Moss’ request to government agencies asking them not to attend next month’s annual Defcon hacker conference has evoked a mixed response from the security community.

Many see it as little more than a symbolic gesture meant to convey the hacker community’s discomfort over recent revelations of government surveillance activities by fugitive document-leaker Edward Snowden.

Others, though, see it as a somewhat hypocritical move by an organization that has long prided itself on providing a platform for all members of the security community to exchange ideas and share information freely.

Two researchers from the network security consulting firm Secure Ideas on Thursday announced that they would not present at Defcon as scheduled, in protest of Moss’ actions.

Moss launched Defcon 21 years ago and has overseen its growth into one of the industry’s largest hacker conferences. On Wednesday, he published a blog post in which he asked government agencies to “call a time-out” from the conference.

“For over two decades Defcon has been an open nexus of hacker culture, a place where seasoned pros, hackers, academics, and feds can meet, share ideas and party on neutral territory. Our community operates in the spirit of openness, verified trust, and mutual respect,” he wrote.

“When it comes to sharing and socializing with feds, recent revelations have made many in the community uncomfortable about this relationship,” he said in asking them not to attend Defcon this year.

The dis-invitation is interesting because it was only last year that Defcon had opened with a keynote from General Keith Alexander, director of the National Security Agency, the entity at the center of the surveillance controversy.

“Jeff Moss’s post was a statement, not an order, but it was an important one,” said Michael Sutton, a vice president of security research with Zscaler.

Moss is well respected within both the black hat and white hat communities and has strong government connections in his role as an advisor to the U.S. Department of Homeland Security (DHS), Sutton noted.

“His statement illustrates the deep disappointment of the Defcon community, who feel that they were blatantly lied to in light of the PRISM scandal,” he said, referring to Alexander’s denials last year when asked at the conference if the NSA was spying on U.S. citizens.

“Jeff is standing up for the community by saying ‘you disrespected us in our own house — we’d prefer you not visit this year’,” Sutton said.

For many at Defcon, Edward Snowden’s recent revelations of widespread NSA surveillance activities are likely to have only reinforced their suspicion of all things government, said Richard Stiennon, principal at IT-Harvest.

With Defcon, there’s always been a bit of the “young generation versus the Man,” Stiennon noted. In recent years, NSA and other three-letter government agencies have been recruiting from Defcon ranks, leading to a gradual thawing in relations between the two communities, he said. Even so, members of the Defcon community have only shown a “wary willingness” to interact with government types at best.


That willingness has likely been tested by the Snowden affair, Stiennon noted. “A group of security professionals who are aligned with doing things and creating things that are protective of security and privacy are going to find themselves at odds with the NSA. So it may be best for both sides to cool off a bit,” Stiennon noted.

Lawrence Pingree, an analyst at Gartner cautioned against making too much of Moss’ statement. From a publicity standpoint, it makes a certain amount of sense to ask federal agencies not to attend Defcon, considering the sentiments that have been aroused by Snowden’s revelations, he said.

In reality, it is unlikely that Moss will want to, or will even be able to stop government security types from attending the event if they really want to, he said.

In the end Moss is just sending a gentle reminder to the government that they are likely to be less than welcome among those at Defcon considering recent revelations about PRISM, said Robert Hansen, a white hat hacker and director of product management at WhiteHat Security.

“I don’t believe that anyone who works directly with the staff at Defcon really hates feds,” Hansen said. “What they hate is that the free and open Internet has been taken from them in some sense, and that theft is embodied in some sense by the people who are tasked with fulfilling often secret laws.”

“The only issue I see with Jeff’s announcement is that a lot of the most important, die-hard, freedom advocates work in or work directly with the military industrial complex, and it’s unfair to paint them as the enemy of hackers,” Hansen noted. “Though Jeff has never said that directly, and I don’t believe he feels that way, I worry that people less familiar with the situation would mis-represent his words.”

Others though see Moss’ stance as needlessly politicizing the annual hacker fest.

In a blog post, James Jardine and Kevin Johnson, two researchers from Secure Ideas, announced they would not present at Defcon this year, citing Moss’ statement about not wanting the government at the show as the reason.

“The basis of our decision, is that we feel strongly that Defcon has always presented a neutral ground that encouraged open communication among the community, despite the industry background and diversity of motives to attend,” the blog noted. “We believe the exclusion of the ‘fed’ this year does the exact opposite at a critical time.”

Ira Winkler, president of the Information Systems Security Association and a Computerworld columnist, said that Moss was being unfair in asking the federal government not to attend Defcon.

Much of Defcon’s popularity has stemmed from the effort put into making it a completely neutral venue for the information security community. By asking the government to stay away, Defcon has lost some of that neutrality, he said.


The surveillance activities revealed by Snowden, and that Moss alluded to in his statement, have all been found to be legitimate and vetted by all three branches of government, Winkler argued. So rather than trying to exclude government agencies, it would have been better to use Defcon as an opportunity to get more answers on the surveillance practices, he said.

“It would be better to have a legitimate discussion on the issue,” Winkler said. “Why is it legal? Why is it constitutional? Stopping a group of people from attending goes against the spirit of what Defcon is supposed to be.”

Defcon has always thrived on presenting controversial security topics and has gone out of its way to make it possible for people to do so, Winkler noted.

“Why is the government being singled out, when no group has ever been singled out and prevented from speaking?” he said.

July 6 2013




Joint Chiefs Chair: Fewer Admins Needed

Military Network Consolidation Should Help Mitigate Insider Threat

By Eric Chabrow, June 29, 2013.



A side benefit of the Department of Defense’s continuing consolidation of some 15,000 U.S. military networks will be the need for fewer systems administrators; that should make IT less vulnerable to insider threats, the chairman of the Joint Chiefs of Staff says.

Army Gen. Martin Dempsey’s remarks come as federal authorities hunt for Edward Snowden, the former systems administrator at the Defense Department’s National Security Agency who leaked classified information about NSA intelligence collection programs.

“I think systems administrators is the right place to begin to clean this up, though, because they have such ubiquitous access, and that’s how he ended up doing what he did,” Dempsey said in a June 27 speech at the Brookings Institution, a Washington think tank. “We’ve got to take a much harder look at this as we become more reliant on cyber-activity.”

Air Force Lt. Col. Damien Pickart, a Pentagon spokesman, says the military has thousands of systems administrators, but he couldn’t provide a precise number.

Dempsey, in his presentation, compared the Snowden leak to the insider attacks on American troops by allied Afghan soldiers. “You can’t prevent; you can mitigate the risk,” he said.

“You can’t stop someone from breaking the law 100 percent of the time. You can certainly increase the scrutiny in terms of their background investigation. You can reduce the number of them. You can put different degrees of oversight in place.”


Building Its Own Secure, 4G Network

Dempsey said the DoD’s previously announced network consolidation effort, known as the Joint Information Environment, would increase security and help ensure the integrity of battle systems in the face of disruption. The new environment, based on a secure cloud, will include a 4G wireless network that will provide network access for certified smart phones and tablets. “In fact, I have a secure mobile phone with me here today,” the general said, holding up a smart phone. “The phone would make both Batman and James Bond jealous.”


Mobile devices connected to military networks must meet stringent DoD guidelines [see DoD's Influence on Smart Phone Security].

Dempsey touched on a wide range of cybersecurity concerns during his hour-long presentation.

Rules of Cyber-Engagement: The chairman said the military has developed a draft of a playbook that describes how the United States should respond to a cyber-attack on the nation’s critical infrastructure by taking specific steps. Those include:

1. Gather information on the malicious code and the systems under attack. “Our first instinct will be to pull up the drawbridge and prevent the attack, that is to say, block or defend.”

2. Launch an active defense if the attack cannot be repulsed. Dempsey characterized that response as being a proportional effort to disable the attacking botnet.

3. If that fails, consult with other “higher-level” authorities in the government to determine what to do next. Any massive retaliation would require decisions by civilian leaders, he said.
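The three-step escalation ladder Dempsey described can be sketched as a simple decision procedure. The function and return strings below are illustrative assumptions for clarity, not anything from the draft playbook itself.

```python
def respond_to_attack(blocked: bool, active_defense_worked: bool) -> str:
    """Illustrative sketch of the draft playbook's escalation ladder."""
    # Step 1: block or defend -- "pull up the drawbridge."
    if blocked:
        return "attack repulsed by passive defense"
    # Step 2: proportional active defense, e.g. disabling the attacking botnet.
    if active_defense_worked:
        return "attack neutralized by proportional active defense"
    # Step 3: consult higher-level authorities; any massive retaliation
    # would require decisions by civilian leaders.
    return "escalate to civilian leadership for further options"
```

The point of the ladder is that each step is tried only after the previous one fails, with the decision to retaliate reserved for civilian leadership at the top.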

Cyber-Attack Response: Dempsey said a cyber-attack on the United States could, conceivably, be met with a conventional military response. “There is an assumption out there … that a cyber-attack that had destructive effects would be met by a cyber-response that had destructive effects. That’s not necessarily the case. I think that what [President Obama] would insist upon, actually, is that he had the options and the freedom of movement to decide what kind of response we would employ.”

Negotiating with the Chinese: Dempsey dismissed the idea that Snowden’s disclosures of the cyber-intelligence collection programs weaken America’s moral standing in cybersecurity negotiations with the Chinese. He said all nations, including the United States and China, conduct espionage in a variety of domains, including cyber, but China has developed a particular niche of stealing intellectual property.

“Their view is there are no rules in cyber, there are no laws that they are breaking, there are no standards in behavior. So, we have asked them to meet with us … in order to establish some rules of the road so that we don’t have these friction points in our relationship.”

Future of Cyber Command: The chairman envisions a day when the U.S. Cyber Command, a sub-unified command under the U.S. Strategic Command, becomes its own command. He said the current structure works, for now, but added that passage of cyberthreat information sharing legislation before Congress could change that. “If we get the kind of information sharing we need, that could be a catalyst for changing the organization, because the span and scope of responsibility will change.”

Hack-Back Opposition: The general said he opposes private companies launching their own counter-attacks against cyber-assailants – so-called hack-back attacks [see Questioning the Legality of Hack-Back]. “We don’t want private cyber-organizations conducting operations that could be perceived as hostile acts. And, if they’re perceived as hostile acts, it could lead us into conflict.”



U.S. Looks to Blunt Corporate Espionage by Chinese Firms


BEIJING—The U.S. could be signaling stepped-up prosecution of Chinese companies accused of stealing trade secrets as it filed criminal charges against one of China’s largest wind-turbine manufacturers and two of its executives, experts said.

“Maybe five years ago, it was sexier to chase drug cases than trade-secret cases,” said Benjamin Bai, a partner at Allen & Overy in Shanghai. However, “the political climate is brewing the perfect storm in the U.S. for prosecutions to increase.”

A recent law strengthening the U.S. Economic Espionage Act will likely encourage more prosecutions, said Mr. Bai, who has represented U.S. clients on intellectual-property issues.

In December U.S. President Barack Obama signed into law an amendment that allows prosecutors to seek charges against those who steal the trade secrets of not only products but also of services. The amendment was passed after a former Goldman Sachs Group Inc. computer programmer had his conviction overturned when a U.S. court ruled that the software he stole was used only internally.

The latest case, filed Thursday, involved products. U.S. prosecutors accused Sinovel Wind Group Ltd. of stealing source code for software used to control wind turbines from American Superconductor Corp., a Massachusetts-based engineering company, and then shipping four turbines equipped with the code to customers in the U.S.

“This case is indicative that American companies and the U.S. government are fed up, and can and should pursue all available legal remedies, including criminal sanctions, to put an end to trade-secret theft,” said James Zimmerman, managing partner of law firm Sheppard Mullin Richter & Hampton LLP in Beijing and a former chairman of the American Chamber of Commerce in China. “Chinese companies need to heed the warning that U.S. industry is determined to protect its core technology.”

A spokeswoman for Sinovel said Friday that the company was still studying the charges and didn’t have a formal response to the allegations. U.S. prosecutors also indicted an employee of AMSC, as the Massachusetts firm is known, and two Sinovel executives. They couldn’t be reached for comment.

A spokeswoman for China’s Ministry of Foreign Affairs said she wasn’t familiar with details of the Sinovel case. “China has been enhancing the laws for intellectual-property protection and will continue to improve laws and regulations for intellectual property and copyright,” she said.

The issue has risen in prominence in recent months as companies reported that they had been hacked by groups that appeared to have connections with the Chinese government. The Obama administration has been pressing the issue of commercial hacking with Chinese officials.


China has repeatedly said it is a victim of cyberattacks, and points to the allegations of former U.S. National Security Agency contractor Edward Snowden as proof that the U.S. hacks into Chinese computers.

In an annual survey, members of the American Chamber of Commerce in China said they were becoming increasingly concerned about intellectual-property theft in China. Thirty-four percent of respondents said intellectual-property infringements caused “material damage” to their China operations last year, up from 22% a year earlier and 9% in 2010, the survey said. One in four said they experienced a breach or theft of data or trade secrets from their China operations, the chamber said.

A U.S. commission on intellectual-property theft, headed by former Director of National Intelligence Dennis Blair and former U.S. Ambassador Jon Huntsman, issued a report in May accusing China of being responsible for as much as 80% of the intellectual-property theft against U.S. companies. The commission made several recommendations, including increasing resources for the Justice Department and the Federal Bureau of Investigation to investigate and prosecute cases of trade-secret theft.

Louis Schwartz, president of China Strategies, a consulting firm focusing on Chinese trade and investment in renewable energy, said U.S. prosecutions against China for intellectual-property theft are still rare and that he was surprised at how aggressive AMSC was in pursuing its claims against Sinovel.

“My advice for clients is to see if you can get enough compensation upfront so if you lose intellectual-property rights in China, at least you have some compensation,” he said. “The lure of the China market is so great that people tend to look the other way.”

Xiang Wang, Asia managing partner for the law firm Orrick, Herrington & Sutcliffe LLP, said the number of criminal cases his firm has handled involving corporate espionage charges leveled at Chinese companies by foreign rivals has increased by 50% over the past two years. “The number of criminal cases will certainly increase,” he said.

Although companies pay fines in civil disputes over the theft of trade secrets, the penalties often aren’t effective deterrents, said Mr. Wang, who is based in Beijing. However, someone who steals trade secrets may think twice about going to prison, he said.

A version of this article appeared July 1, 2013, on page B9 in the U.S. edition of The Wall Street Journal, with the headline: U.S. Acts Tough on Trade Secrets.


How Edward Snowden Could Derail the Global Economy

By DAVID FRANCIS, The Fiscal Times July 2, 2013

NSA leaker Edward Snowden is at it again. This time, he’s leaking to the Germans.

Over the weekend Der Spiegel magazine published a report indicating that the United States has been spying on its European allies, including Germany, France and Italy. Documents provided by Snowden indicate that the NSA is collecting data on European communications and planted bugs in EU offices in New York and Washington to detect rifts in the troubled monetary alliance.

Reactions from European leaders were swift and harsh, especially in Germany, where the protection of private correspondence is written into the constitution.

President Obama shrugged off the report, saying all nations collect intelligence. But this argument isn’t likely to fly with German Chancellor Angela Merkel, a former East German who grew up living under the intrusive eye of the Stasi and Soviet surveillance. She immediately condemned the United States, while a parliament member said Snowden should be rewarded for this information with asylum in Germany.

“The monitoring of friends — this is unacceptable. It can’t be tolerated. We’re no longer in the Cold War,” Merkel said through spokesman Steffen Seibert.

But the more troubling response, both for the United States and the European Union, came from French President François Hollande. He said that talks on a bilateral U.S.-EU trade deal should be put on hold until questions about the spying were answered.

“We can only have negotiations, transactions, in all areas once we have obtained these guarantees for France, but that goes for the whole European Union, and I would say for all partners of the United States,” he said of the talks set to begin next week. For good measure, French minister of foreign trade Nicole Bricq added, “We must absolutely re-establish confidence… it will be difficult to conduct these extremely important negotiations.”

Things could get worse. In a letter asking Ecuador for asylum, Snowden – who is still stuck at the Moscow airport – said he would release more documents that he deemed to be in the public interest. Russian President Vladimir Putin has also opened the possibility of Snowden remaining there.

There’s a lot at stake. The United States wants its economic recovery to accelerate, while Europe is desperate for economic growth. (The deal is expected to add $157 billion to the EU economy and $133 billion to the U.S. economy.)

Now, low-level European diplomats could leverage NSA’s spying to win concessions as negotiations over the deal get underway. Expect France, which has called for provisions to fund French movies and art in the deal, to be especially aggressive with anti-NSA rhetoric.

One can argue about whether Snowden’s revelations have made America less safe. But it is indisputable that he has eroded trust between the United States, its partners and their citizens. That trust matters for international trade agreements, because popular support is essential to getting trade pacts approved.

Without these pacts, international trade dries up and hundreds of billions are removed from the global economy. Put simply, these agreements eliminate barriers to doing business. For instance, a 2010 Congressional Research Service report found that exports to countries that are part of the Trans-Pacific Partnership Agreement totaled $747 billion in 2008.

The mistrust also has political implications. Take Germany. Much was known about PRISM when Obama visited Berlin two weeks ago, but according to the German media, Merkel asked few questions about it. Now that the German public is outraged by Snowden’s latest disclosure and has turned on Obama, Merkel will try to harness that outrage to win re-election in the fall, causing a further strain on German-American relations.

“This could slow down [the EU-U.S. deal] considerably,” Joerg Wolf, editor of a Berlin-based open think tank, told The Fiscal Times. “European citizens will mistrust the U.S. even more, which then would make it more difficult for EU governments to cooperate with the US in the future.”


The U.S.-EU trade pact is not the only one at risk. The pending trade deal between the United States and Ecuador has fallen apart because of Snowden. Last week, Ecuador – a country thought to be considering offering Snowden asylum – withdrew from talks, saying it feared “blackmail” if it refused to offer up the fugitive.

Senator Chuck Schumer (D-NY) refused to back down, warning Ecuador, “Your economy will pay a very big price. We should end all foreign aid, repeal trade agreements worth billions of dollars.”

Ecuador risks losing billions in exports to the United States if the deal falls apart. And while the dissolution of the deal hurts the Ecuadorian side more, the end of a two-decade old trade deal represents a diplomatic setback for both sides.

Wolf said there is potential for the same thing to happen with the U.S.-EU deal.

“Whatever U.S. public diplomacy achieved in Germany in the last four years [since George W. Bush's presidency ended], it’s gone,” Wolf said. “Any positive impact President Obama’s trip to Berlin two weeks ago might have had, it’s gone.”


Snowden’s Real Job: Government Hacker

Mathew J. Schwartz    | July 01, 2013 11:42 AM


How did the apparently low-level IT-administrator-turned-whistleblower Edward Snowden, 30, manage to gain access to details of numerous top secret National Security Agency (NSA) surveillance programs?

Simple: He wasn’t actually an IT or system administrator, although intelligence officials had repeatedly suggested he was, while also noting that the agency employs numerous contractors to help maintain its systems. Or as Gen. Keith Alexander, the director of the NSA, told ABC News last month about post-Snowden changes at the agency: “We’re now putting in place actions that would give us the ability to track our system administrators, what they’re doing and what they’re taking, a two-man rule, we’ve changed the passwords.”

In fact, Snowden himself stated in a video that his most recent job title wasn’t that of system administrator. “My name is Ed Snowden, I’m 29 years old. I worked for Booz Allen Hamilton as an infrastructure analyst for NSA in Hawaii,” he told Guardian journalist Glenn Greenwald in a video recorded in Hong Kong and broadcast after he’d asked to be identified as the source of the leaks involving Prism and other surveillance programs. Prior to that job, Snowden said, “I’ve been a systems engineer, systems administrator, senior adviser for the Central Intelligence Agency, solutions consultant, and a telecommunications information system officer.”

Many commentators read “infrastructure analyst” as NSA-speak for a system administrator role, and many news reports of Snowden’s leaks actually labeled him as being a sysadmin. But according to intelligence officials, the infrastructure analyst role refers to a position tasked with finding new ways to hack into foreign networks, to keep the NSA abreast of the signals intelligence it’s charged with gathering.

Why hasn’t Snowden’s real role been highlighted to date? Principally because government officials haven’t wanted to highlight the online espionage equivalent of breaking and entering into other countries’ networks and telecommunications equipment, according to a Sunday report in The New York Times that cited no sources.

That revelation finally explains how 30-year-old Snowden came to possess official documents relating to some of the country’s most sensitive surveillance programs, including intercepts of online audio, emails and video (Prism), traffic analysis of cell phone calls (Mainway), Internet metadata collection (Marina), and telephone content interception (Nucleon), not to mention secret court orders authorizing the surveillance programs.

Snowden said he took a job in March as a contractor at Booz Allen Hamilton — reportedly taking a pay cut — to gain access to the documents that he’s since leaked.

“My position with Booz Allen Hamilton granted me access to lists of machines all over the world the NSA hacked,” Snowden told The South China Morning Post prior to leaving Hong Kong for Moscow last week. “That is why I accepted that position about three months ago.”

Snowden now remains in limbo — his U.S. passport has been revoked — in the transit area of Moscow’s Sheremetyevo airport.

Regardless, expect the leaks to keep on coming. Greenwald has said that Snowden leaked thousands of documents, of which at least dozens are newsworthy.

Most recently, the Guardian Sunday released documents that it said detailed NSA operations against 38 embassies and missions, which were labeled as targets. That target list included not just Middle Eastern countries, but also U.S. allies such as France, Greece, Italy, Mexico, Japan and South Korea. Detailed interception methods included bugs planted in fax machines used to transmit official cables between the European Union embassy in Washington and EU headquarters in Brussels, as well as antenna intercepts and taps in networking cables.


Why the US doesn’t use cyber-weapons to attack its enemies more often

By Kevin J. Delaney    @kevinjdelaney    June 30, 2013


The US government doesn’t like to talk about it, but it has developed an arsenal of cyber-weapons that can be used to attack adversaries. Why doesn’t it deploy computer worms and other technologies for disrupting enemies’ networks and computer-controlled physical infrastructure more often in conflicts around the world?

Mike McConnell, vice chairman at government contractor Booz Allen Hamilton and former head of the National Security Agency and US director of National Intelligence, says the US has the best capabilities in the world for cyber-attacking and “can do some pretty significant damage if we choose to.”

But the government hesitates because it’s wary of making itself an even bigger target for cyber-attacks against the US, according to McConnell. Speaking at the Aspen Ideas Festival (co-organized by Quartz’s sister publication The Atlantic), he elaborated:

“Let’s say you take an action. We depend on this stuff more than anyone else. We’re more vulnerable than anybody else in the world. If we could put a map up here of the world with the US on center and we put bandwidth on top of it, it’s a bell curve. Most of the communications in the world flow through the United States and we are the biggest user and beneficiary. So there’s a great hesitancy to use anything in a cyber context because it’s relatively easy to punch back in a pretty aggressive way.

So every discussion I’ve ever participated in is ‘You’ve got some options. Well, let’s just hold those aside and consider other options.’ But we could do major damage.”

McConnell, who’s credited with building US capabilities for fighting cyber threats, notes that the purpose of the US Cyber Command unit of the Defense Department is both offense and defense. “And the offense will get a lot of attention,” he adds.

The US reportedly has used cyber-weapons against Iran’s nuclear enrichment efforts as part of a program code-named “Olympic Games.” The Stuxnet computer worm developed by the US and Israel reportedly succeeded in disrupting the centrifuges Iran was using to purify uranium in its pursuit of nuclear weapons. (Retired US general James Cartwright is now reportedly the target of a Justice Department investigation into the leaking of information about Stuxnet to the press.)


Cyber-Attackers Constantly Hitting Gas Utilities in 2013


By Robert Lemos | Posted 2013-07-01


Unknown attackers have targeted the Internet-connected systems of natural-gas companies, using brute-force attacks to attempt to access the companies’ business and process-control networks, according to a report published last week by the Industrial Control Systems Cyber Emergency Response Team (ICS-CERT).

The incidents, which occurred in January and February, were first reported to the ICS-CERT, a component of the U.S. Department of Homeland Security, in late February, the group stated in its quarterly public report on cyber threats. Following the initial report and a subsequent warning from the ICS-CERT, more critical infrastructure companies came forward with news of other incidents.

“The companies reporting this activity operate gas compressor stations across the Midwest and Plains states within the US, although some of the attempts reported were solely against business networks,” the report stated. “While none of the brute force attempts were successful, these incidents highlight the need for constant vigilance on the part of industry asset owners and operators.”
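The report does not say how the utilities detected the intrusion attempts. A common approach to spotting brute-force logins is to count authentication failures per source address inside a sliding time window; the function, threshold and window below are illustrative assumptions, not anything described by ICS-CERT.

```python
from collections import defaultdict

def flag_brute_force(events, threshold=10, window=60):
    """Flag source IPs with `threshold` or more failed logins in any
    `window`-second span.

    `events` is an iterable of (timestamp, source_ip, success) tuples.
    """
    failures = defaultdict(list)  # ip -> recent failure timestamps
    flagged = set()
    for ts, ip, success in sorted(events):
        if success:
            continue  # only failed attempts count toward the threshold
        times = failures[ip]
        times.append(ts)
        # Drop failures that have aged out of the sliding window.
        while times and ts - times[0] > window:
            times.pop(0)
        if len(times) >= threshold:
            flagged.add(ip)
    return flagged
```

In practice a detector like this feeds an alerting pipeline rather than a block list, since, as the sources quoted below note, attackers rotate addresses quickly.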

The last attack occurred on February 23, according to the report. Yet, while ICS-CERT said no new attacks had been detected, it is unlikely that the attacks have stopped altogether, Tommy Stiansen, chief technology officer and co-founder of threat-intelligence firm Norse, said in an e-mail interview.

“Today all public facing IP addresses are attacked on a regular basis, but the questions are really by whom and how targeted and sophisticated are the attacks,” he said. “While there may be an element of failure to report, it may be that some of these installations are compromised but admins remain unaware due to the stealthy nature of the compromise.”

Recent research published by security firm Trend Micro found that Internet-connected industrial-control systems are frequently targeted by online attackers. The company’s researchers set up fake industrial control systems, made them appear valuable and logged 39 attacks over 28 days against the spoofed systems, the company stated in its report.

While the U.S. has called out China for attacks against sensitive industries, the attacks detected by Trend Micro came from Internet addresses in 14 different nations. IP addresses in China accounted for about a third of the attacks, while Laos and the United States came in second and third, respectively.

The experiment, which occurred in 2012, underscores that attackers are continuously probing these important systems. While the ICS-CERT reportedly informed industry members of the specific IP addresses that were involved in the attacks, creating block lists based on such quickly changing attributes does not work very well, Norse’s Stiansen said.

“The use of IP block lists described in the report often gives admins a false sense of security,” Stiansen said. “Today cyber criminals can setup and launch attacks using botnets and other compromised hosts, quickly changing the IP address and obfuscating the location of the actual attackers.”
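Stiansen’s point can be illustrated with a toy deny-list check: a static block list matches only addresses seen in past attacks, so the same operator relaying through a freshly compromised host sails straight past it. The addresses below are drawn from the documentation-reserved ranges of RFC 5737, not real attackers.

```python
# IPs observed in earlier attacks (illustrative, RFC 5737 documentation range).
BLOCK_LIST = {"192.0.2.10", "192.0.2.11"}

def is_blocked(source_ip: str) -> bool:
    """Static deny-list check: matches only previously observed attackers."""
    return source_ip in BLOCK_LIST

print(is_blocked("192.0.2.10"))    # True  -- old address, blocked
print(is_blocked("198.51.100.7"))  # False -- fresh botnet relay, not blocked
```

This is why defenders tend to treat IP block lists as one weak signal among many rather than a perimeter control.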


Obama: U.S. will give allies info on spying claims

Seattle Times


AP White House Correspondent

Originally published July 1, 2013 at 11:04 AM | Page modified July 1, 2013 at 3:44 PM

Facing a European uproar over more U.S. eavesdropping claims, President Barack Obama argued Monday that it’s no surprise that governments spy on each other but said the United States will provide allies with information about new reports that the National Security Agency bugged European Union offices in Washington, New York and Brussels.

The latest revelations were attributed in part to information supplied by former NSA contractor Edward Snowden. Obama on Monday also said the U.S. has held “high-level” discussions with Russians to get Snowden out of a Moscow airport and back to the United States to face criminal charges.

Obama, in a news conference with Tanzanian President Jakaya Kikwete, pushed back against objections from key allies over a report in the German news weekly Der Spiegel that the United States installed covert listening devices in EU offices. He suggested such activity by governments is not unusual.

“We should stipulate that every intelligence service – not just ours, but every European intelligence service, every Asian intelligence service, wherever there’s an intelligence service – here’s one thing that they’re going to be doing: they’re going to be trying to understand the world better and what’s going on in world capitals around the world,” he said. “If that weren’t the case, then there’d be no use for an intelligence service.

“And I guarantee you that in European capitals, there are people who are interested in, if not what I had for breakfast, at least what my talking points might be should I end up meeting with their leaders. That’s how intelligence services operate,” Obama added.

European officials from Germany, Italy, France, Luxembourg and the EU government itself say the revelations could damage negotiations on a trans-Atlantic trade treaty between the EU and the United States. Agreeing to start those talks was one of the achievements reached at meetings last month in Northern Ireland between Obama and the European members of the Group of Eight industrialized economies.


Obama said the NSA will evaluate the claims in the German publication and will then inform allies about the allegations.

At the same time, he tried to reassure allies such as German Chancellor Angela Merkel, French President Francois Hollande and British Prime Minister David Cameron that he relies on personal relationships, not spying, to determine what other leaders have on their minds.

“I’m the end user of this kind of intelligence,” he said. “And if I want to know what Chancellor Merkel is thinking, I will call Chancellor Merkel. If I want to know what President Hollande is thinking on a particular issue, I’ll call President Hollande. And if I want to know what, you know, David Cameron’s thinking, I call David Cameron. Ultimately, you know, we work so closely together that there’s almost no information that’s not shared between our various countries.”

Obama’s remarks came shortly after Hollande demanded on Monday that the United States immediately stop any eavesdropping on European Union diplomats.

Obama also said law enforcement officials in the U.S. and Russia were working to find a way to get Snowden back to the United States, where he is charged with violating U.S. espionage laws. The U.S. does not have an extradition treaty with Russia. Moreover, Russia has claimed Snowden is not technically on its soil because, while he is in the transit terminal of the Moscow airport, he has not passed through immigration. The U.S. has revoked his passport.

“We are hopeful that the Russian government makes decisions based on the normal procedures regarding international travel and the normal interactions that law enforcement has,” Obama said.


Identifying the Top Threat Actors

Ex-FBI Investigator on New Targets

by Jeffrey Roman, July 1, 2013.


Cyber-attacks continue to hamper organizations, says former FBI investigator Shawn Henry. And the actors waging the attacks are targeting organizations for more than just fraud, he says.

The three main groups launching cyber-attacks, Henry says: organized crime, foreign intelligence services and terrorists. And while these groups haven’t changed much over the past 12 months, their techniques have.

“They’ve become more capable,” Henry says during an interview with Information Security Media Group [transcript below]. “They’ve become more sophisticated.”

As organizations work to improve their defenses, threat actors have reacted by becoming more flexible and adaptable, he adds.

“The reality is: The offense outpaces the defense, so they’ve been able to adapt and to overcome, even against what we would consider to be some of the most resilient defenses,” Henry says.


Threat Actors’ Targets

The financial services sector continues to be the most-targeted by organized crime, which aims to pilfer sensitive information that can quickly be monetized, Henry says.

“We’ve also seen some denial-of-service attacks against networks where there might be groups or individuals that are looking to make some type of a social or political statement,” he says (see Are DDoS Attacks Against Banks Over?).

Foreign intelligence services, on the other hand, target multiple sectors, from government to manufacturing and energy to communications, in order to extract data that can be shared with industries in their specific countries, Henry says.

The other main threat: terrorist organizations, which seek to disrupt critical infrastructure and cause harm to, in particular, the United States, he says.

Henry, who left the Federal Bureau of Investigation in March 2012, is now the president of cybersecurity firm CrowdStrike. While at the FBI, he oversaw international computer crime investigations involving DDoS attacks, bank and corporate breaches and state-sponsored intrusions. Over the course of his 24-year career, Henry worked in three FBI field offices, as well as in the bureau’s headquarters. He oversaw the posting of FBI cyber-experts in police agencies around the world, including the Netherlands, Romania, Ukraine and Estonia.


FBI Background

TRACY KITTEN: Tell us about the work that you did with the FBI.

SHAWN HENRY: I worked with the FBI for 24 years and had a number of different positions. In my last position, I was responsible for all cyber-investigations worldwide, as well as criminal investigations and critical incident response. On the cyber-side, I focused primarily on breaches into networks by criminal groups, organized crime groups, terrorist organizations and foreign intelligence services. That included exfiltration of data where the adversary was reaching into a network to pull out data that they see of value, whether it be intellectual property, research and development, corporate strategies, financial data, as well as denial-of-service attacks against networks and other types of breaches where an adversary is looking to wreak some havoc on an organization or on the victim network.

KITTEN: When did you join CrowdStrike?

HENRY: I left the FBI in March of 2012 and joined CrowdStrike the following week, so April 2012. I’ve been there about 13 months now.


Evolving Cybersecurity Landscape

KITTEN: How would you say the cybersecurity landscape has evolved or changed in the last 12 months?

HENRY: It’s an interesting question. I don’t think the landscape has actually changed much. I think the exact same threats that were here when I left the Bureau are still here. What has changed is the awareness of the private sector. There’s a lot more that’s occurred here in the media that has gone out publicly. People have become aware of the threats. Organizations have begun to recognize the impact that they face and the real damage that can be inflicted, and that’s not been out publicly for many years in my service in the government. That really is the most significant change, the awareness. But the activities have not significantly changed.


Top 3 Threats Facing Banks

KITTEN: What would you say are the top-three threats banking institutions face?

HENRY: The financial services sector is probably, in my opinion, among the best protected sectors regarding networks. What we see primarily facing the financial services sector is theft of PII, personally identifiable information; primarily organized crime groups who are targeting networks, trying to steal data which they can very quickly monetize. They do that regularly. They target not only the corporate networks, the financial networks, but individuals as well, trying to capture credentials, usernames and passwords so that they can access accounts. We’ve also seen some denial-of-service attacks against networks where there might be groups or individuals that are looking to make some type of a social or political statement. They recognize that western society and the United States as a whole relies substantially on the financial services sector, so it really is seen as a target of the west, as a symbol of the west and the prosperous United States of America. It’s oftentimes a target of those types of groups.


Threats to Other Sectors

KITTEN: What about other sectors, including government?

HENRY: They’re similar; they’re not the same as what the financial services sector faces. As it relates to government and other sectors, there are oftentimes foreign intelligence services that are looking to pilfer data which they can then share with their industries in their countries, so that they have some type of a competitive advantage. Certainly, the financial services sector is not immune to that. They do get breached by foreign intelligence services that are looking for financial strategies, are interested in mergers and acquisitions, and are interested in partnership deals the financial services sector might be facilitating or enabling. But they’re not the primary threat. It’s those organized crime groups. The foreign intelligence services are hitting every sector in the country: government, military, defense contractors, manufacturing, energy and communications. It really cuts across all sectors. The other group that’s a significant threat is terrorist organizations that seek to potentially disrupt critical infrastructure and to cause harm to the United States.


Threat Actors

KITTEN: How have the actors who are waging some of these attacks changed in recent years?

HENRY: I don’t know that they’ve changed drastically. The same types of groups that I put into three different buckets – organized crime, foreign intelligence services and terrorists – remain primarily the same. We’ve seen these hacktivist groups which I would really kind of put in the terrorist bucket, but the groups themselves haven’t changed. Their capabilities have changed. They’ve become more capable; they’ve become more sophisticated. They’ve had to become flexible and they’ve had to adapt their capabilities as defenses have gotten better and as organizations have become more aware or more resilient in their defense. But the reality is the offense outpaces the defense, so they’ve been able to adapt and to overcome even what we would consider to be some of the most resilient defenses.


Nation-State Attacks

KITTEN: Would it be fair to say that attacks backed by nation-states are posing greater worries today?

HENRY: Again, I think it really depends. It depends on who you are. If you’re a defense contractor that’s developing certain military capabilities for the next-generation war fighter, a nation-state is a significant worry to you. If you’re a financial organization, you’re probably more concerned about the organized crime group because that’s the thing that might impact your bottom line. If there’s a $10 million loss, it’s going to be on the balance sheet. People are going to see it, maybe the media. It’s going to pose a risk to the organization’s reputation. That’s going to pose a risk to their operations and customer confidence sometimes. It really depends on who you are as to what the greatest worry should be to you.

Some have said that nation-states, because they’re so closely tied to the U.S. economy, would not necessarily take destructive actions, where a crime ring might attack a company and pose some type of a threat to data, to destroying data, in exchange for some type of a monetary reward. We’ve actually seen extortions where companies have had their networks breached, and the attackers have then contacted the company and said, “We’ll be happy to turn your data over for a $150,000 consulting fee. And if you don’t, then we’re going to destroy your data.” It really depends on who the organization is [regarding] what the greatest threat to them is.

KITTEN: Would you say that the lines that divide these groups are blurring?

HENRY: We have seen some overlap. It’s not always clear. It used to be clearer in the past, but I think that the capabilities of some of these organized crime groups are such that they actually approach the maturation level of foreign intelligence services. They’re very, very capable; they’re not just kids [moving] around on the network. These are organized, methodical and well-practiced, so I do think that there’s a bit of a blur. It’s also not always clear that the groups and some of the individuals in the groups aren’t crossing lines perhaps, working for the government on one hand and then perhaps on the weekends moonlighting and doing some work for themselves.

International Investigations

KITTEN: Do you see international investigations improving?

HENRY: Every cyber-investigation for the most part has some international nexus. There’s something that either originates or ends internationally or transcends an international point. From an information-sharing perspective, it requires good coordination and good lines of communication, both in the private sector and in the government sector. In terms of identifying who the adversaries are, there needs to be what I call actionable intelligence sharing, where government-to-government they’ve got to share indicators that will help to identify who the adversary is and if they can arrest them or take some type of action to thwart the attack.

From the private sector perspective, there needs to be sharing even within the same companies that have international capabilities across many countries. They need to share information, actionable intelligence, so that they can better defend themselves and provide a better defense. I do think it’s improving. Again, the awareness piece is really important, the fact that more organizations have visibility into these types of attacks and they have a sense of understanding about what the impact is. That encourages better sharing of actionable intelligence.


Information Sharing Challenges

KITTEN: What are some of the challenges facing information sharing?

HENRY: There are a couple things. One, I think the sharing between the government and the private sector is still not as robust as it needs to be. There are a lot of reasons for that. One is that the lines are not clearly drawn as to exactly what companies need and what governments need. There’s a national security perspective sometimes, so it’s difficult to share classified information. That’s a bit of a challenge. Companies still have some concerns that by sharing information they’ll be revealing unnecessarily to the public that there’s a problem on their network, causing some type of a lost confidence potentially with their client base. All in all, I think it is getting better. There’s still a long way to go, but it’s certainly one of the bigger challenges.

The last piece I’ll add is the concern people have about privacy. What are we sharing? I would argue that there’s no need to share content. You don’t need to share Word files; you don’t need to share the content of e-mails; you don’t need to share spreadsheet information. What you need to share is a lot of the technical data, what we call indicators of compromise: the signatures of malware and the types of information that would help to identify how an attack occurred and who might have launched an attack, but that do not compromise the actual content of data. But that privacy piece is rightfully a concern for many people, both public and private, and it does cause some consternation when you talk about information sharing, because people just need to be educated about what that information really is.
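The content-free record Henry describes could be sketched as a simple structure. The field names, helper function, and sample values below are illustrative assumptions, not any real sharing standard; the point is only that everything shared is technical metadata, never file or message content:

```python
# A minimal sketch of a "content-free" threat indicator record: a malware
# signature plus attack metadata, with no document or e-mail content.
# Field names and values here are hypothetical.
import hashlib
import json

def make_indicator(sample_bytes, source_ip, technique):
    """Build a shareable indicator from a malware sample and attack metadata.

    Only a hash of the sample leaves the organization, never the sample's
    contents or any victim data.
    """
    return {
        "sha256": hashlib.sha256(sample_bytes).hexdigest(),  # signature, not content
        "source_ip": source_ip,                              # where the attack came from
        "technique": technique,                              # how the attack occurred
    }

record = make_indicator(b"malware-sample-stub", "203.0.113.7",
                        "spearphishing attachment")
print(json.dumps(record, indent=2))
```

A receiving organization could match the `sha256` value against files on its own network to detect the same compromise, which is the "actionable" part of actionable intelligence sharing.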


DDoS Attacks

KITTEN: What can you tell us about what’s likely going on behind the scenes?

HENRY: Let me first say that I do not have any inside information about what’s occurring. Any of my comments relate specifically to my observations from the outside looking in. But I think that there’s certainly a cause for concern any time there’s some disruption of service; or when people have difficulty accessing their networks, there’s a cause for concern. I would think that the FBI would be looking quite closely to try to determine where the attacks are coming from. I think that they would likely be working with international partners, both in the intelligence community as well as in the law enforcement community, to try and identify what the sources of attacks are. Once you can determine where those attacks are coming from, you can take law enforcement action to disrupt the networks that are launching those attacks and actually disrupt the people that are causing those attacks. Through the execution of search warrants or arrest warrants and the like, that’s going to be an action that’s going to help mitigate the threat, by actually taking the bad actors off of the playing field.


New defense cyber policies are in the works


By Amber Corrin

Jul 02, 2013


DOD’s cyber strategy from 2011 is due for an update.


The Defense Department is working on a new plan for operating in cyberspace, policy that would update the official strategy first released two years ago this month.

Army Maj. Gen. John Davis said that while significant progress has been made in the military’s cyber domain so far, there remains work to be done and that two years in real time could equal 20 years in cyberspace’s boundless landscape.

“Senior leaders in the department and beyond the department understand that cyber is a problem [and] cyber is important,” Davis said at an industry event in Baltimore on June 28. “They’ve made cyber a priority, and there is a sense of urgency.”

In July 2011, then-Deputy Defense Secretary William Lynn rolled out the defense strategy for operating in cyberspace, which encompassed five strategic pillars. They included the establishment of cyberspace as an operational domain and plans to ready forces to go with it; the introduction and employment of new operational concepts on DOD networks; cross-agency partnerships, including with the Homeland Security Department; the build-up of cyber defenses, coordinated with allies and international partners; and the capitalization of U.S. resources, including technology, workforce and rapid innovation.

Davis highlighted progress over the past two years, including the launch of service cyber components operating under U.S. Cyber Command, joint cyber centers at the combatant commands, military orders for cyber operations, an interim command and control framework, cyber force structure and training exercises.

As a result, Davis said, the U.S. military has been able to refine its role in cyberspace, as well as the partnerships that support it. That role is shared with agencies like DHS and the Justice Department, he added, as well as the private sector and global collaborators.

“We have three main cyber missions, and three kinds of cyber forces will operate around the clock to conduct those missions,” Davis said.

He noted that national forces will act to counter cyber attacks on the U.S., while a second, larger group of combat mission forces will help execute military operations and integrate digital capabilities into the DOD arsenal. The largest set of cyber operators will operate and defend critical Pentagon networks.

“We will deter, disrupt and deny adversary cyberspace operations that threaten vital U.S. interests when approved by the president and directed by the secretary of defense,” he said. “If a crippling cyber attack is launched against our nation, the Department of Defense must be ready for an order from the commander in chief to act.”


Can DIA become the Defense Innovation Agency?

By Amber Corrin

Jul 01, 2013



DIA Director Lt. Gen. Michael Flynn, shown here speaking at a meeting in 2012, says DIA has to ‘break down legacy walls’ to emerge as an innovation leader. (Army photo)

“Innovation” is pretty big as far as government buzzwords go. Often found hand-in-hand with “doing more with less,” the idea of innovation is one that seems to excite leaders and frequently surfaces in briefings and talking points.

But what is actually getting done? Who is really out there taking innovation for a spin?

The intelligence community might not immediately seem like the leader, but if the Defense Intelligence Agency has anything to say about it, it is in the driver’s seat.

“The agency was established to understand the plans of the adversary. In the process of that, we became an organization that focuses on planning,” said Dan Doney, DIA’s chief innovation officer. “Planning and sticking to the plan are the opposite of innovation. Innovation is deviating from the plan, rapidly moving it and changing. When it comes to innovation, we haven’t had a great reputation. Put that in the past.”

Today, DIA is moving forward with what officials call an entirely new approach — one that eradicates traditional silos, aggressively seeks outside input and quickly implements new ideas. The objective is to avoid strategic surprise and push forward with a new way of conducting the government’s intelligence business after more than a decade of war.

“We have to break down legacy walls,” DIA Director Lt. Gen. Michael Flynn said at the agency’s June 27 Innovation Day at its Washington headquarters. “Innovation is not just about technology — it’s about a lot of things. Pay close attention to how this organization has adjusted and what we’re working toward. Training, education and professional development are a big deal. We can’t lose the gains from the last decade.”

The biggest area of focus in DIA’s new approach is information sharing. The year-old Intelligence Community IT Enterprise (ICITE) is designed to link the 17 intelligence agencies and significantly reduce IT costs.

Among DIA’s contributions to ICITE is the Innovation Gateway, an online information-sharing environment that forges easier connections between the intelligence community and industry. The goal is to help the agencies identify and adopt better solutions, particularly those that might not have surfaced without such a system. “No more middle man,” Doney said. “No more cloudiness, no more special relationships where you need to know the right person to know what we’re looking for.”


Reaching out to industry

Users access the gateway via DIA-granted public-key infrastructure credentials. Much of the environment uses a wiki-style format, which is fitting for a system designed to free people from the constraints of standardization.


“There are some tricks associated [with] making a coherent environment that isn’t coherently planned, but the first place you show up to…is a wiki, where you’re made aware of the capabilities of the space,” Doney said. “Underneath that is a description for how a technology provider participates in the space…and they’re able to offer their technologies within the space, they’re able to link to a set of core services” for others to access.

To protect against potential theft of intellectual property, the system has safeguards in place in the form of what Doney called stakeholder management authorization.

“One of the advantages is that providers host their own [areas], so they maintain complete control over their own capabilities, technologies and configurations,” Doney said. “They’re not constrained by other people’s ways of doing things.”

The gateway is still in its early stages, but officials expect initial operating capability in December. Reaching critical mass in terms of users will be the key to the project’s success and progress, Doney said. He likened getting there to the early days of Wikipedia, when the popular online resource relied on passionate early adopters to generate content.

“The first phase is the hardest,” he said. “It takes action and energy to get over the initial barriers.”


Measuring results

Another tricky aspect of the Innovation Gateway is measuring success, particularly by traditional government models, officials said.

“There’s not going to be one way to look at it and either you passed or failed. It’s going to be all-encompassing,” said Janice Glover Jones, DIA’s CIO. She added that productivity is one key marker, but success writ large will be a little tougher to define. “And even if you fail, you get lessons learned.” Doney, too, indicated that the focus should be on the less tangible goals.

“Innovation is a means, not an end,” he said. “We’re driven by mission, so the bottom line is how much better are we able to execute on our mission, not whether or not we’re the best innovators in the world. It’s a mistake many folks make to measure the means instead of the ends.”

Still, he pointed to his “stretch goals” as crucial milestones, such as the 30-day transition requirement that reforms discovery, evaluation, integration and acquisition. Other goals are characterized by the existence of cross-functional teams of analysts, collectors and technologists who work together seamlessly; rapid access to tools and capabilities; and the availability of mission-relevant data.

The goals illustrate the agency’s wider drive to get back on track after the wars in Iraq and Afghanistan, which drastically changed the business of government.

“We’ve exercised the wrong muscles. We have to go back to a place where we exercise the right muscles and think about how we conduct business,” Flynn said. “We have to allow for failure…. It’s a risk, but that’s OK. The return on investment on risk is so good.”


NIST Unveils Draft of Cybersecurity Framework

Executives Given Key Role in Voluntary Framework

By Eric Chabrow, July 3, 2013


The cybersecurity framework, ordered by President Obama, will emphasize the importance of senior executives in managing programs to secure their enterprises’ information systems and assets, according to a draft of the cybersecurity framework released by the National Institute of Standards and Technology.

“By using this framework, these senior executives can manage cybersecurity risks within their enterprise’s broader risks and business plans and operations,” says the draft dated July 1, but made public a day later.

In February, Obama issued an executive order directing NIST, working with the private sector, to develop a framework to reduce cybersecurity risks that the mostly private operators of the nation’s critical infrastructure could adopt voluntarily [see Obama Issues Cybersecurity Executive Order].

NIST concedes much more work must be done by the time the final version of the framework is issued next February. Among the areas NIST identifies that need to be addressed in the framework are privacy and civil liberties standards, guidelines and practices as well as helpful metrics for organizations to determine their cybersecurity effectiveness.

“We want to provide something that has flexibility, that can be implemented by different sectors,” Donna Dodson, chief of NIST’s computer security division, said in an interview with Information Security Media Group prior to the draft’s release [see Fulfilling the President’s Cybersecurity Executive Order]. “We want it to be specific in other ways so that we are sure we are working to reducing cybersecurity risks in the critical infrastructure.”


5 Core Cybersecurity Functions

The framework, according to the draft, will revolve around a core structure that includes five major cybersecurity functions, each with its own categories, subcategories and informative references. The five functions are Know, Prevent, Detect, Respond and Recover.

The Know function, for instance, would include a category entitled “know the enterprise risk architecture” with subcategories of “understand corporate risk tolerance” and “identify risk assessment methodologies,” as well as others. An information reference, in this instance, would link to guidance such as NIST Special Publication 800-53: Security and Privacy Controls for Federal Information Systems and Organizations and ISO 31000: Risk Management.

The framework also will include three implementation levels that reflect organizational maturity in addressing cybersecurity. Incorporated into the framework will be a user’s guide to help organizations understand how to apply it as well as a compendium of informative references, existing standards, guidelines and practices to assist with specific implementation.
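The nested core structure the draft describes (functions containing categories, which contain subcategories and informative references) could be modeled as plain mappings. The sketch below is illustrative: the names come from the article's "Know" example, but the data layout itself is an assumption, not NIST's schema:

```python
# Hypothetical model of the draft framework's core structure. Only the
# "Know" function is filled in, using the category, subcategories, and
# informative references named in the article; the layout is illustrative.
FRAMEWORK_CORE = {
    "Know": {
        "know the enterprise risk architecture": {
            "subcategories": [
                "understand corporate risk tolerance",
                "identify risk assessment methodologies",
            ],
            "informative_references": [
                "NIST SP 800-53: Security and Privacy Controls",
                "ISO 31000: Risk Management",
            ],
        },
    },
    "Prevent": {},
    "Detect": {},
    "Respond": {},
    "Recover": {},
}

def informative_references(function, category):
    """Return the guidance documents linked to one category."""
    return FRAMEWORK_CORE[function][category]["informative_references"]

# An executive assessing "know the enterprise risk architecture" would be
# pointed at the linked guidance documents:
for ref in informative_references("Know", "know the enterprise risk architecture"):
    print(ref)
```

Structuring the core this way shows why the draft calls it a guide rather than a manual: each subcategory is a capability to assess, and the informative references point to existing standards rather than prescribing controls directly.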


Framework as a Guide, Not Detailed Manual

NIST says the framework should not be seen as a detailed manual, but as a guide to help executives, managers and staff to understand and assess the cybersecurity capabilities, readiness and risks their organizations face, as well as identify areas of strength and weakness and aspects of cybersecurity on which they should productively focus.

Some 240 entities including major technology and security vendors, trade groups, local and state governments, not-for-profit organizations and individuals this past spring submitted to NIST their ideas on IT security best practices to incorporate into the framework. NIST held a workshop in late May in Pittsburgh, where it reviewed the submissions and started to create the framework. Another workshop is scheduled for July 10-12 in San Diego, where the framework will be refined.

“Many comments advised that the cybersecurity framework would not be effective unless the very senior levels of management of an organization were fully engaged and aware of the vulnerabilities and risks posed by cybersecurity threats and committed to integrating cybersecurity risks into the enterprise’s larger risk management approach,” according to the draft.

“Time and again, comments reflected that these senior executives, including boards of directors, need to integrate and relate cybersecurity concerns and risks to critical infrastructure to the organization’s basic business and its ability to deliver products and services,” the draft says. “It is clear that these officials are best positioned to define and express accountability and responsibility, and to combine threat and vulnerability information with the potential impact to business needs and operational capabilities.”



Funding is up in the air for FAA control tower upgrade

The NextGen GPS air control plan, which would cut airline delays and fuel consumption, costs $40 billion that sequestration puts at risk.

LA Times

By Hugo Martín

7:53 PM PDT, June 30, 2013


Air travel in the future will be faster, cleaner and less expensive if the Federal Aviation Administration’s $40-billion overhaul of the nation’s air control system is completed.

That’s a big if.

With the federal sequestration fight in Washington, FAA officials say funding for the seven-year project could be in jeopardy.

The plan, known as NextGen, replaces outdated radar-based technology with global positioning systems and digital communications to modernize the country’s air control system.

By allowing pilots to fly more direct routes and giving air traffic controllers more accurate and up-to-date information, the system is expected to cut airline delays 41% by 2020, compared with the delays without NextGen, according to a new report by the FAA.

The efficiencies in the system are also forecast to save 1.6 billion gallons of fuel and cut 16 million metric tons of carbon dioxide emissions, with $38 billion in cumulative benefits to airlines, the public and the FAA, the report said.

A key component of the system is that air traffic controllers using GPS will get more precise information on planes’ locations and speeds, allowing controllers to better manage the 7,000 or so planes in the air at any given time, according to the FAA. Because the current radar system is slower and less precise, controllers must add a bigger safety cushion of separation between planes.

In a recent speech, FAA Administrator Michael Huerta slammed lawmakers for failing to reach an agreement on future spending plans.

“Because of the financial uncertainty, we can hope for the best, but we have to plan for the worst,” he said. “This is not a sustainable course of action, and it’s no way to run a government.”


Good news for Internet junkies who fly out of Los Angeles International Airport: You’ve got plenty of flights equipped with wireless Internet to choose from.

Fliers departing from LAX to San Francisco International Airport or John F. Kennedy International Airport have the greatest number of daily domestic flights offering Wi-Fi, according to a new study by a travel site.

Between LAX and San Francisco, fliers can choose from 31 daily flights that offer Wi-Fi, the study found. Between LAX and JFK, air travelers can pick 27 daily flights with Wi-Fi service, according to the study.

“Wi-Fi is not going away,” said John Walton, the site’s director of data. “Passengers want it and will pay for it.”

Overall, 38% of the domestic flights in the U.S. offer Wi-Fi, a number that has been growing 5% to 7% a year, he said. But the rate will likely surge as more airlines install satellite-based Wi-Fi on more of their fleet, Walton said.

“In the next quarter, I imagine we will see a situation where satellite Wi-Fi should be rolling out in big numbers,” he said.

• Airlines in the U.S. lose or damage about 140,000 bags a month, or about three for every 100,000 passengers.

It’s a relatively small loss rate. Still, several companies hope to capitalize on the frustration travelers feel waiting by the baggage carousel only to realize their luggage didn’t make it onto their flight.

Next week, Los Angeles company GlobaTrac plans to begin shipping a palm-sized device that travelers can toss into their bags to track luggage via the Internet or a smartphone app. The device, called Trakdot, sells for $50, plus additional fees.

Meanwhile, European aerospace company Airbus announced last month that it is producing luggage with built-in technology that allows passengers to track their bags. The luggage even includes a built-in scale to tell whether it is exceeding the maximum weight limits.

No word on the cost of the high-tech suitcase, but Airbus is reportedly considering letting airlines rent the bags to passengers.


Another potential roadblock for UAS integration in the USA


by Gary Mortimer • 1 July 2013


As I watch the UAS story in America unfold from afar, it seems more and more like a game show. Phil from The Amazing Race has just thrown up another potential roadblock for the teams.

Last Thursday the Senate Appropriations Committee approved the 2014 transportation funding bill, which directs the FAA to halt the rulemaking process until privacy concerns have been addressed.

This process is expected to take at least one year. The bill first has to pass the full Senate and the House.

I have said it before: if you want to start out in the UAS industry and you are from the USA, it is probably best to move overseas. Your ideas will not be subject to ITAR, and a viable commercial market exists.

Let’s hope test site selection continues in parallel with this process, along with all the other bits and pieces other countries seem to have managed more than five years ago.

Look out 2047, here we come. After the break: scenes from next week’s episode.

From Senate Appropriations Committee:

The development of unmanned aerial systems [UAS] offers benefits in a wide variety of applications, including law enforcement and border patrol, precision agriculture, wildfire mapping, weather monitoring, oil and gas exploration, disaster management, and aerial imaging. The UAS industry also presents an opportunity for substantial domestic job growth.

The FAA is taking important steps toward integrating UAS into the national airspace, including implementing a UAS test site program to help the agency gather critical safety data. The expanded use of UAS also presents the FAA with significant challenges. The Committee is concerned that, without adequate safeguards, expanded use of UAS by both governmental and nongovernmental entities will pose risks to individuals’ privacy.

The FAA has recognized the importance of addressing privacy concerns by requiring that UAS test sites have privacy policies in place before test flights begin. However, as the FAA looks to integrate UAS into the national airspace, a more comprehensive approach to privacy may be warranted. The United States Constitution and various Federal and State privacy laws apply to the operation of UAS, but in consideration of the rapid advancement of technology in this area, the Committee questions whether current laws offer sufficient protections for individuals.

FAA’s oversight and regulatory authority over the national airspace places the agency in a position to work with other agencies on addressing privacy concerns. To that end, the Committee directs the FAA to collaborate with other Federal agencies in evaluating the impact that broader use of UAS in the national airspace could have on individual privacy.

Furthermore, the Committee includes bill language that prohibits the FAA from issuing final regulations on the integration of UAS into the national airspace until the Secretary submits a report detailing the results of such collaboration. The Committee expects this report to address the application of existing privacy law to governmental and non-governmental entities; identify gaps in existing law, especially with regard to the use and retention of personally identifiable information by both governmental and non-governmental entities; and recommend next steps in how the FAA or other Federal agencies can address the impact of widespread use of UAS on individual privacy. The Committee directs the FAA to submit this report to the House and Senate Committees on Appropriations not later than 1 year after enactment of this act.


Some states may lose Air National Guard flying units

By Brian Everstine

Staff writer

Jun. 26, 2013 – 06:00AM |

Air Force


F-16 Fighting Falcons from the Colorado Air National Guard arrive at a training base in northern Jordan as part of an exercise June 6. Budget realities may force the Air National Guard to shut down some aircraft squadrons.

Budget realities could force the Air National Guard below its guiding principle of one flying unit per state, and state leaders could be OK with that.

During last year’s budget deliberations, the Guard and state leaders drew a line on cuts, focusing on its “capstone principle” of one unit capable of flying missions per state, while the Air Force was targeting the component for cuts.

But the realities of sequestration and an uncertain budget future may mean the closure of some aircraft units, although state leaders say they would agree to the cuts as long as there are enough assets available regionally to respond to natural disasters.

“If they don’t have (the assets), they are interested in making sure that at least regionally, they can access it very quickly,” said Heather Hogsett, the director of the homeland security committee at the National Governors Association.

Currently, all states have an aircraft unit, although there has been a push to increase the number of Air National Guard units with missions such as cyber and intelligence, surveillance and reconnaissance, which could replace flying units.

The idea came up during the third public meeting of the National Commission on the Structure of the Air Force on June 26. The congressionally mandated commission, created in last year’s defense spending bill, is made up of members nominated by the president and Congress. It will produce a report next year on the future force structure needs of the service.

Members of the commission spoke critically of the need for each state to have a flying unit. Les Brownlee, a former acting undersecretary of the Army, asked state leaders directly: “Why does a governor need an F-16?”

Maj. Gen. Tim Orr, the adjutant general of Iowa, said the 132nd Fighter Wing in his state was one of two units that lost their F-16s under last year’s spending bill. He said other parts of the wing — including medical units, maintenance, security forces, other support units — were crucial to his state, along with having pilots and their F-16s able to deploy for the Air Force.

“It’s the other capabilities in the wing that are crucial,” he said. “In (2008) floods, we used all personnel to accomplish the mission on the ground.”

Brownlee responded by saying the state has Army National Guard units that would be more effective on the ground in disasters than fighter pilots.

“We had so much flooding that it took the whole Air National Guard and Army National Guard,” Orr said. “That’s not uncommon.”

He said the need for fighters in the Guard is because “we’re the reserve of the Air Force. We have to have that same capability and capacity.”

State leaders are currently meeting with officials from the Federal Emergency Management Agency to look at the top five possible catastrophes that could occur, and then see what assets would need to be available to respond, Orr said. The report is expected this fall and could help drive debates on the allocation of Air National Guard resources.

Brownlee also highlighted the recent decision to have Air Force Reserve crews work under the Air National Guard in response to homeland disasters and under the direction of a Guard dual status commander. This overlap in abilities raises the question, “Is it time to think of a hybrid of the Reserve and Guard?” he said.

Leaders on Capitol Hill and in the Pentagon should start having that discussion and begin a national debate on if that is possible and the right thing to do, Orr said.


Wi-Fi That Sees Through Walls

Thomas Claburn


June 29, 2013


Researchers at MIT’s Computer Science and Artificial Intelligence Laboratory have come up with a way to create a low-power, portable device that can see through walls using Wi-Fi signals.

Technology of this sort, similar in concept to radar or sonar, has existed for years and relies on radio waves and other parts of the electromagnetic spectrum. It is used mainly in law enforcement and military applications, ideally only when the law allows.

Wall penetration systems have become common enough that the U.S. Department of Justice last year funded a market survey of what’s known as “through-the-wall sensors,” or TTWS.

Security products maker Camero-Tech, for example, offers its Xaver line of through-wall imaging devices for defense and law enforcement applications. But with prices at about $9,000 for the handheld Xaver 100 and $47,500 for the 7 lb. Xaver 400, these aren’t consumer products.

The legality of TTWS technology is sufficiently unclear that ManTech Advanced Systems International, the company that prepared the market survey, recommends those planning to use TTWS equipment seek legal advice in advance.

In 2001, the U.S. Supreme Court ruled that the use of thermal imaging to monitor what’s going on inside a private home violates Fourth Amendment protection against unreasonable searches. But as the ability to see through walls reaches the civilian market, this legal boundary is likely to be tested again.

There is at least one consumer TTWS device on the market already: STI’s Rex Plus, an $80+ device that can be placed against a wall or door and sounds an alarm when someone approaches from the other side.

Dina Katabi, a professor in MIT’s Department of Electrical Engineering and Computer Science, and graduate student Fadel Adib propose wider civilian use of the technology through a simple, affordable device like a mobile phone, equipped with two antennas and a receiver.

In an email, Katabi suggested the technology, which she calls WiVi, can be used for virtual reality and gaming, without requiring the user to remain in a specific area in front of a sensor. She also says the technology could be used for personal safety.

“For example, if I am walking at night in an isolated area and suspect that someone is following me, hiding behind a fence or around a corner, I can then use WiVi to detect that person and alert myself to the person’s movement,” she said.
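The core idea behind this kind of Wi-Fi sensing, cancelling reflections from static objects so that only moving reflectors register, can be illustrated with a toy model. This is a hypothetical simplification for intuition only; WiVi's actual processing uses multiple transmit antennas and interference nulling, and is far more involved than frame differencing.

```python
# Toy sketch of motion detection by cancelling static reflections.
# Hypothetical simplification of the through-wall sensing idea:
# reflections from walls and furniture are constant frame to frame,
# so subtracting successive snapshots leaves only energy contributed
# by moving reflectors (e.g., a person walking).

def detect_motion(frames, threshold=0.5):
    """Return True for each pair of successive frames whose
    difference energy exceeds the threshold."""
    results = []
    for prev, cur in zip(frames, frames[1:]):
        diff_energy = sum((c - p) ** 2 for p, c in zip(prev, cur))
        results.append(diff_energy > threshold)
    return results

# Static clutter (walls, furniture) is identical in every frame...
clutter = [2.0, -1.5, 0.7, 3.1]
static_frames = [clutter[:] for _ in range(3)]

# ...versus a frame where one reflection shifts (a person moving).
moving_frames = [clutter[:], [2.0, -1.5, 1.9, 3.1]]

print(detect_motion(static_frames))  # no motion between static frames
print(detect_motion(moving_frames))  # motion detected
```

The frame-differencing step is why a static scene is "invisible" while even small movements stand out, which also explains Katabi's point that today's output reveals motion rather than faces or body parts.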

Katabi says WiVi can be used for “privacy-preserving monitoring,” such as tracking the movements of elderly people or children without actually having them on camera.

In time, however, improvements in Wi-Fi-based sensing may require a reexamination of the privacy implications of making walls effectively transparent.

“Today the technology does not show body parts or the face of the person,” said Katabi. “Hence it is naturally anonymized. However, as we improve the technology it will start giving higher resolution images for things that one cannot see because they are behind a wall. This will raise privacy related questions. As a society, we still have time to look at these issues and ensure our society has the right policies by the time such high-resolution version of the technology becomes available.”

That future already has been contemplated: University of Tokyo researchers have developed paint that blocks Wi-Fi signals.

Wi-Fi-based sensing appears to be a particularly active area of research at the moment. At the University of Washington, researchers have developed a related technology, WiSee, a Wi-Fi-based gesture sensing system that duplicates the functioning of sensor-based motion detection systems like Leap Motion and Microsoft Kinect without the sensing area limitations.



Statement by Secretary Hagel on the 40th Anniversary of the All-Volunteer Force

DoD Public Affairs

July 1, 2013


Forty years ago today, the Department of Defense implemented one of the most consequential and far-reaching changes in the history of America’s armed forces with the establishment of the all-volunteer force.

In commemorating the birth of our modern military, we recognize and thank the millions of men and women and their families who have served our country — on active duty, in the reserves, and in the National Guard. Skeptics and detractors claimed an all-volunteer force could not be sustained, but these remarkable Americans proved otherwise. They helped win the Cold War, stood against aggression in the Persian Gulf, kept peace in the Balkans, and put their lives on the line in Iraq and Afghanistan in the years since 9/11. They choose to serve our country because they believe in a cause that is bigger than themselves. They’ve always put the interests of the nation first, as have their families. It’s because of their sense of duty that the American military remains the most trusted institution in our society today.

Our all-volunteer force faces challenges. It will need to rebalance, adapt, and evolve, as it has over the last 40 years. America’s all-volunteer force will continue to make our military the strongest, most capable, and most respected fighting force in the history of the world.



What Are We Going to Do About GPS Jamming?


By Bob Brewin July 1, 2013


Not much, based on this updated report from the Department of Homeland Security.

DHS prepared a classified report on Global Positioning System vulnerabilities in November 2012 and the unclassified version, released last week, leaves much to worry about, including the fact that “Detecting, locating and disabling sources of GPS disruption remain a challenge.”

The department suggests manual backups for GPS, which I imagine includes old-fashioned compasses and maps, but observed that “human skills for using manual techniques could erode due to lack of training and practice as GPS becomes more ubiquitous.”

GPS signals sit at the core of the Federal Aviation Administration’s Next Generation Air Transportation System, provide timing signals for wired and wireless networks, guide precision munitions, help mariners navigate tough harbor approaches and are key to precision farming operations.

But nowhere in the report does DHS suggest an automatic back-up system for the simple reason that one does not exist, even though the Department of Transportation’s John A. Volpe National Transportation Systems Center warned about the dangers of GPS jamming and called for development of an automatic back-up system in a report published 13 years ago.

The Volpe report suggested a terrestrial backup GPS system based on an improved version of the WW II Long Range Navigation System, known as Loran, but the United States abandoned Loran due to the manning costs incurred by the Coast Guard, which literally blew up the tower of the Port Clarence, Alaska, station in 2010.


South Korea, which has a lot of experience with GPS jamming by North Korea, plans to start installing a Loran system in 2016 with full operation planned by 2018 — a better approach than a compass or map.


Government Executive Defense Contractors and Military Personnel Can’t Fill In for Furloughed Civilians


By Charles S. Clark

July 1, 2013


During furlough days set to begin July 8, Defense Department managers may not “borrow military manpower” nor step up assignments to contractors to make up for idled civilian employees, a Pentagon official directed on Friday.

Planning around civilian furloughs and “total force management” requires that “component heads, installation commanders and line managers shall take steps to manage workload, but must ensure that borrowed military manpower is not used to compensate for work resulting from a civilian furlough,” F.E. Vollrath, assistant Defense secretary for readiness and force management, wrote in a memo to the joint chiefs, all undersecretaries and major departmental directors.

Borrowing labor “would be inconsistent with the secretary’s intent and the department’s commitment to protect the viability of the All-Volunteer Force,” he continued. “Additionally, in accordance with the department’s statutory requirements, contractors are prohibited from being assigned or permitted to perform additional work or duties to compensate for the workload/productivity loss resulting from the civilian furlough,” Vollrath wrote.

The policy on contractors was welcomed by the American Federation of Government Employees, whose national president J. David Cox Sr. on Monday issued a statement crediting Vollrath’s position, even though “Congress should have repealed sequestration months ago because it was a failed tactic never intended to be enacted, and Secretary [Chuck] Hagel never should have imposed furloughs on the Department of Defense’s reliable and experienced civilian workforce when there is ample room for reductions in service contract spending that is supposed to be temporary in nature.”

Cox recapped a Monday meeting with Vollrath, during which the union “pressed him to ensure that AFGE’s direct conversion concerns — i.e., when work performed by civilian employees is given to contractors illegally or to military personnel inappropriately — are resolved expeditiously. I asked him to investigate in particular a direct conversion at Anniston [Ala.] Army Depot where core depot maintenance workload is being illegally privatized,” Cox said.

Also on Monday, the Pentagon comptroller released updated fiscal 2013 budget numbers detailing planned cuts under sequestration, as requested in May by Senate Armed Services Committee Chairman Carl Levin, D-Mich., and Ranking Member James Inhofe, R-Okla. The new report lays out line items totaling some $41 billion in reduced spending for fiscal 2013, noting that the cost of preparing the new estimates was $38,000.

Levin was traveling and unavailable for comment.





Members of Top Nine Software Teams Move Forward from DARPA’s Virtual Robotics Challenge

by Staff Writers

Washington DC (SPX) Jul 01, 2013


The DARPA Robotics Challenge (DRC) was created with a clear vision: spur development of advanced robots that can assist humans in mitigating and recovering from future natural and man-made disasters. Disasters evoke powerful, physical images of destruction, yet the first event of the DRC was a software competition carried out in a virtual environment that looked like an obstacle course set in a suburban area.

That setting was the first proving ground for testing software that might control successful disaster response robots, and it was the world’s first view into the DARPA Robotics Challenge Simulator, an open-source platform that could revolutionize robotics development.

Disaster response robots require multiple layers of software to explore and interact with their environments, use tools, maintain balance and communicate with human operators. In the Virtual Robotics Challenge (VRC), competing teams applied software of their own design to a simulated robot in an attempt to complete a series of tasks that are prerequisites for more complex activities.

Twenty-six teams from eight countries qualified to compete in the VRC, which ran from June 17-21, 2013. DARPA had allocated resources for the six teams that did best, but in an interesting twist, good sportsmanship and generosity will allow members of the top nine teams, listed below, to move forward:


1. Team IHMC, Institute for Human and Machine Cognition, Pensacola, Fla. (52 points)

2. WPI Robotics Engineering C Squad (WRECS), Worcester Polytechnic Institute, Worcester, Mass. (39 points)

3. MIT, Massachusetts Institute of Technology, Cambridge, Mass. (34 points)

4. Team TRACLabs, TRACLabs, Inc., Webster, Texas (30 points)

5. JPL / UCSB / Caltech, Jet Propulsion Laboratory, Pasadena, Calif. (29 points)

6. TORC, TORC / TU Darmstadt / Virginia Tech, Blacksburg, Va. (27 points)

7. Team K, Japan (25 points)

8. TROOPER, Lockheed Martin, Cherry Hill, N.J. (24 points)

9. Case Western Reserve University, Cleveland, Ohio (23 points)


The top six teams earned funding and an ATLAS robot from DARPA to compete in the DRC Trials in December 2013 (DARPA is also funding several other “Track A” teams to construct their own robot and compete in the Trials). The Trials are the second of three DRC events, and the first physical competition.


In a demonstration of good sportsmanship, Jet Propulsion Laboratory, which also has a DARPA-funded Track A effort with its own robot, decided to merge its two efforts and offer the bulk of the resources it earned in the VRC to other teams. DARPA split the freed resources between the next two teams:

+ The robot associated with the JPL win and some funding now goes to TROOPER (Lockheed Martin).

+ Additional funds are being allocated to a newly formed team of Team K and Case Western. That team, now known as HKU, will use an ATLAS robot generously donated to it by Hong Kong University to participate in the DRC Trials in December.


Thus, in total, seven teams with ATLAS robots and DARPA support will be going to the DRC Trials, where they will compete with other teams with their own robots.

VRC teams were evaluated based on task completion and effective operator control of the robots in five simulated runs for each of three tasks (15 total timed runs) that addressed robot perception, manipulation and locomotion.

The tasks included: entering, driving and exiting a utility vehicle; walking across muddy, uneven and rubble-strewn terrain; and attaching a hose connector to a spigot, then turning a nearby valve. To simulate communications limitations in a disaster zone, the VRC imposed a round trip latency of 500 milliseconds on data transmission, and varied the total number of communications bits available in each run, from a high of 900 megabits down to 60 megabits.
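The communications constraints described above can be sketched in a few lines: every operator exchange pays a fixed 500-millisecond round trip, and each run has a hard budget of total bits (900 megabits down to 60 megabits in the article). The class and method names below are illustrative assumptions, not DARPA's interfaces.

```python
# Hedged sketch of VRC-style communications limits: a fixed round-trip
# latency per exchange plus a hard per-run bit budget. Illustrative
# only; the actual VRC infrastructure is not modeled here.

LATENCY_S = 0.5  # 500 ms round-trip latency per exchange

class ConstrainedLink:
    def __init__(self, budget_megabits):
        self.bits_left = budget_megabits * 1_000_000
        self.elapsed = 0.0

    def exchange(self, payload_bits):
        """Send a command / receive telemetry; returns False once
        the run's bit budget is exhausted."""
        if payload_bits > self.bits_left:
            return False
        self.bits_left -= payload_bits
        self.elapsed += LATENCY_S
        return True

# A 60-megabit run supports far fewer 1-megabit exchanges than a
# 900-megabit run, so teams had to economize on telemetry.
tight = ConstrainedLink(60)
count = 0
while tight.exchange(1_000_000):
    count += 1
print(count, tight.elapsed)  # 60 exchanges, 30.0 seconds of latency
```

The budget variation is what forced competing teams to trade off operator oversight against autonomy: at 60 megabits, constant video-style telemetry is simply not affordable.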

To conduct the VRC, DARPA funded the Open Source Robotics Foundation to develop a cloud-based simulator that calculates and displays the physical and sensory behaviors of robots in a three-dimensional virtual space, in real time. The simulator allowed teams to send commands and receive data over the Internet to and from a simulated ATLAS robot: information very similar to what would be sent between a physical robot and its operator in the real world.

“The VRC and the DARPA Simulator allowed us to open the field for the DARPA Robotics Challenge beyond hardware to include experts in robotic software. Integrating both skill sets is vital to the long-term feasibility of robots for disaster response,” said Gill Pratt, DRC program manager.

“The Virtual Robotics Challenge itself was also a great technical accomplishment, as we have now tested and provided an open-source simulation platform that has the potential to catalyze the robotics and electro-mechanical systems industries by lowering costs to create low volume, highly complex systems.”


Immigration deal would boost defense manufacturers

Washington Post

By Matea Gold, Published: July 1


The border security plan the Senate approved last week includes unusual language mandating the purchase of specific models of helicopters and radar equipment for deployment along the U.S.-Mexican border, providing a potential windfall worth tens of millions of dollars to top defense contractors.

The legislation would require the U.S. Border Patrol to acquire, among other items, six Northrop Grumman airborne radar systems that cost $9.3 million each, 15 Sikorsky Black Hawk helicopters that average more than $17 million apiece, and eight light enforcement helicopters made by American Eurocopter that sell for about $3 million each.
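A back-of-the-envelope tally of the named items, using the per-unit figures reported above, shows why watchdogs see a significant windfall (the Black Hawk figure is an average, so the total is approximate):

```python
# Approximate cost of the brand-name items the bill requires, using
# the per-unit prices reported in the article. Illustrative arithmetic
# only; actual contract values would be set in procurement.

items = {
    "Northrop Grumman airborne radar": (6, 9_300_000),
    "Sikorsky Black Hawk helicopter":  (15, 17_000_000),
    "American Eurocopter light heli":  (8, 3_000_000),
}

total = sum(qty * unit for qty, unit in items.values())
for name, (qty, unit) in items.items():
    print(f"{name}: {qty} x ${unit:,} = ${qty * unit:,}")
print(f"Listed-item total: ~${total:,}")  # ~$334,800,000
```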

The legislation also calls for 17 UH-1N helicopters made by Bell Helicopter, an older model that the company no longer manufactures.

Watchdog groups and critics said that these and other detailed requirements would create a troubling end-run around the competitive bidding process and that they are reminiscent of old-fashioned earmarks — spending items that lawmakers insert into legislation to benefit specific projects or recipients. In the past several years, Congress has had a moratorium on earmarks.


The language was included in a $46 billion border security package the Senate approved last week as part of a comprehensive immigration bill. The so-called border surge — an additional $38 billion in spending — was added in the final week of negotiations to attract more GOP support for the measure, which passed with 68 votes, including 14 from Republicans.

The legislation would spend $30 billion over the next decade to hire more than 19,000 new Border Patrol agents, an undertaking that would double the size of the force and that many immigration experts consider wasteful and unnecessary.

The measure also would devote $7.5 billion to build 350 miles of fencing along the U.S.-Mexican border and $4.5 billion to buy new border technology. The legislation would have to be fully implemented, along with electronic visa and employment verification systems, before immigrants could receive green cards.

Sens. Bob Corker (R-Tenn.) and John Hoeven (R-N.D.), who co-sponsored the plan, said the provisions were aimed at assuaging the concerns of Republicans who are wary about creating a path to citizenship without tougher border measures.

“I was just trying to work with our caucus to get as many of our guys to participate,” Hoeven said.

That approach did not win over holdouts such as Sen. Tom Coburn (R-Okla.), who said: “Taxpayer funds should enhance border security, not provide border stimulus for contractors. Unfortunately, the Senate bill does exactly that.”

The list of equipment included in the legislation was drawn from a technological needs assessment developed by the U.S. Customs and Border Protection agency in 2010, according to a senior Department of Homeland Security official, who spoke on the condition of anonymity to describe the internal process. Agency staff members compiled the list at the request of Homeland Security Secretary Janet Napolitano after she stopped a virtual-fence project that was plagued by cost overruns and delays.

Border Patrol officials provided the list to congressional staffers who had asked what the agency needed to effectively control the border.

In separate interviews last week, Corker and Hoeven said they decided to add the list to the legislation to help win over GOP senators who did not trust Napolitano to carry out a border plan.

The two senators noted that the proposal would allow Napolitano to substitute equivalent brands of technology as long as she notified Congress within 60 days. “If they want to buy something better, they can,” Corker said.

But critics said that because the measure prescribes specific products, the agency probably would not seek alternatives. “Lawmakers have put their thumb on the scale for particular products and technologies and that is hard for an agency to ignore,” said Steve Ellis, vice president of the nonpartisan Taxpayers for Common Sense, which scrutinizes federal spending.

The $4.5 billion set aside for technology would be a boon for defense contractors, who are looking for opportunities as the United States continues to reduce its presence in Afghanistan.

The parent corporations of the companies that manufacture the products listed in the bill and their employees have given nearly $11.5 million to federal candidates and campaigns since 2009, according to the nonpartisan Center for Responsive Politics. About half of that came from Northrop Grumman.

Neither Corker nor Hoeven has received substantial donations from the companies or the defense sector overall.

“We’re proud of our long partnership with the Department of Homeland Security and are honored they have repeatedly chosen to acquire our helicopters for their important missions,” said Ed Van Winkle, law enforcement sales manager for American Eurocopter. “We stand ready to produce and deliver additional aircraft customized to Customs and Border Protection requirements should Congress authorize and fund their procurement.”

Representatives of Northrop Grumman, Sikorsky and Bell declined to comment.

Most of the equipment required by the legislation is identified by category, not by brand. Among other items, the bill calls for 4,595 unattended ground sensors, 104 radiation isotope identification devices and 53 fiber-optic tank inspection scopes — and specifies how many should be deployed in each Border Patrol sector. It also requires the purchase of four new drones, on top of 10 unmanned aircraft that the Border Patrol already owns.

The items listed by name were identified that way on the border agency’s wish list, according to Senate staff members involved in drafting the plan, who discussed the process on the condition of anonymity. They said the proposal would not override contracting rules that require competitive bidding.

But government watchdogs said it would be difficult to have an open bidding process for equipment identified by brand and model.

“The agency is statutorily required to buy the specific items from the listed vendors,” said Scott Amey, general counsel for the Project on Government Oversight, an independent group that works to expose overspending and corruption. “I’m unsure how an agency could hold a competition.”

One big-ticket item on the list is the VADER radar system, an airborne technology operated from drones that Northrop Grumman developed for the Pentagon’s research arm. The Border Patrol has been testing one of the systems on loan from the Defense Department to detect migrants attempting to cross the border illegally, officials said. This year, the agency received $18.6 million to buy two of the radar systems, and the immigration bill would add six more.

The Black Hawk helicopters required under the plan include five of the latest high-tech models with digital cockpits. As for the American Eurocopter aircraft, the patrol would be required to add eight AS-350 models to the 85 it already has in its fleet.

The legislation spells out how new border patrol agents would be deployed, requiring the agency to assign 38,405 officers to the U.S.-Mexican border by Sept. 30, 2021.

The Border Patrol employs a record 21,000 agents, up from about 10,000 in 2004. In its most recent budget request, the department did not seek new agents.

Many experts on border security say that doubling the force is impractical and a poor use of resources and that the money could be better spent on workplace inspections or the E-Verify system that employers can use to check the citizenship of applicants.

“There is a lot in this border security plan that is fighting the last war,” said Doris Meissner, who was a top Clinton administration immigration official.

Homeland Security officials are confident that they can recruit and train the surge of agents required under the bill. Spokesman Peter Boogaard said the measure would “build on this administration’s historic border security gains.”

Hoeven and Corker said they settled on hiring 20,000 agents in large part because the number fell midway between proposals from other GOP senators.

“I wish I could tell you it was scientific,” Corker said, adding, “We felt like this was something that would get the job done.”

Alice Crites contributed to this report.


What They Told Us: Reviewing Last Week’s Key Polls

Rasmussen Reports

Saturday, July 06, 2013


Americans still share the values enshrined in the Declaration of Independence 237 years ago and remain wary of too much government. It’s clear, too, that many aren’t happy with the government they’ve got.

Eighty-one percent (81%) believe “all men are created equal.” Ninety-two percent (92%) agree that all men are “endowed by their Creator with certain inalienable rights, among them life, liberty and the pursuit of happiness.” Seventy-two percent (72%) believe “governments derive their only just powers from the consent of the governed.”

But just 25% of voters think the federal government today has that consent.

More Americans than ever (63%) think a government that is too powerful is a bigger danger in the world today than one that is not powerful enough.

Thirty-eight percent (38%) believe the U.S. Constitution doesn’t put enough restrictions on what government can do. Still, 56% think the foundational document shouldn’t be tampered with, and another 33% believe it needs only minor changing.

Just 47% now believe the United States is a nation with liberty and justice for all, the lowest level measured in six years. Still, 77% say if they could live anywhere in the world, it would be the United States.

Scott Rasmussen’s weekly newspaper column notes that “only one-in-four voters today thinks our government has the consent of the governed.” He adds, “That’s a clear call for our government to change its ways and re-earn the trust of those it is supposed to serve. Those are the kind of attitudes that make the Political Class nervous. The fact that we expect more comes from the fact that we as a nation still embrace the Spirit of ’76.”

So how are representative democracy and our constitutionally guaranteed system of checks and balances working these days?

The U.S. Supreme Court finished its term with big decisions on voting rights, affirmative action and same-sex marriage. Following those rulings, public approval of the Supreme Court has fallen to the lowest level ever recorded in more than nine years of polling.

Just seven percent (7%) of voters think Congress is doing a good or excellent job. This marks the sixth consecutive survey that shows Congress’ positive ratings in single digits. Sixty-five percent (65%) believe that no matter how bad things are, Congress can always find a way to make them worse.

Republicans gained the edge over Democrats on the Generic Congressional Ballot for the week ending June 30, but this is the second time in the past three weeks that support for both sides was below 40%, something that hasn’t happened since June 2009. This is the 11th straight week that the parties have been separated by two points or less.

At the same time, voters like the balance of Congress having final review over the Executive branch’s decisions when it comes to the environment. The Environmental Protection Agency sent revised greenhouse gas regulations to the White House this past week, but 51% of voters think all EPA regulations should require congressional approval before they can be implemented. That’s up eight points from 43% in late December.

Speaking of the Executive branch, 52% of voters think it is good for the nation that the American people know more about the National Security Agency’s domestic surveillance programs, and 70% believe the phone and e-mail surveillance programs may have inappropriately violated the privacy of innocent Americans. But as for Edward Snowden, the whistleblower who disclosed the programs, just eight percent (8%) regard him as a hero. Thirty-two percent (32%) now consider him a traitor. Most think he’s either somewhere between hero and traitor (34%) or that it’s too early to tell (23%). Those numbers reflect a slightly more negative view of Snowden compared to two weeks ago.

Forty-three percent (43%) rate the NSA’s secret surveillance of Americans’ phone and e-mail communications as a serious scandal. Thirty percent (30%) view it as an embarrassing situation but not a scandal, while 20% say it’s no big deal.

The White House announced late Tuesday that it will delay implementation of the employer mandate, a key portion of President Obama’s national health care law. As the date for implementation draws near, support for state implementation of the health care law is declining. Just 41% of voters now would like their governor to support implementation of the law, while 48% want their governor to oppose it. This marks a complete reversal from January when 47% wanted their governor to support implementation of the law and 39% were opposed.

As projected by the Rasmussen Employment Index, the number of new jobs in June – announced by the government yesterday – slipped slightly from the month before but remains near the highest levels of recent years. Sixty-six percent (66%) of Americans know someone who is out of work and looking for a job. But that’s the lowest it’s been since the Great Recession. Americans are evenly divided as to whether the labor market is better than it was a year ago and also divided as to whether the unemployment rate will be better or worse a year from now.

An increasing number (35%) of Americans think now is a good time to sell a home in their area, but belief that owning a home is a family’s best investment is down to 47%, the lowest level measured in the past year.

Consumer and investor confidence remain near recent highs.

Despite these signs of economic improvement, confidence in the president’s handling of the economy continues to fall. Thirty-five percent (35%) of voters now say Obama is doing a good or excellent job in this area, down from a recent high of 48% in December.

The president had a challenging month of June, and his total Job Approval rating fell two points to 47%. That ties his lowest approval rating since December 2011. The current rating is more in line with his job approval during most of his first term in office.

In other surveys this week:

– For the second week in a row, 30% of voters say the country is heading in the right direction.

– Americans consider blacks more likely to be racist than whites and Hispanics.

– Americans continue to strongly believe private sector workers work harder for less money and have less job security than government workers do.

– Only 26% of voters rate the performance of public schools in America today as good or excellent. Sixty-one percent (61%) think most textbooks are chiefly concerned with presenting information in a politically correct manner, little changed from surveys for the past three years. 

– Just 25% think most high school graduates have the skills needed for college. Only 22% think high school graduates have the necessary skills to enter the workforce.

– Americans continue to see the Fourth of July as one of the nation’s most important holidays. It ranks second only to Christmas.

– Most Americans are aware that the Fourth of July celebrates the signing of the Declaration of Independence.

Remember, if it’s in the news, it’s in our polls.

June 29 2013




N.S.A. Leak Puts Focus on System Administrators

NY Times


Published: June 23, 2013

As the N.S.A., some companies and the city of San Francisco have learned, information technology administrators, who are vital to keeping the system running and often have access to everything, are in the perfect position if they want to leak sensitive information or blackmail higher-level officials.

“The difficulty comes in an environment where computer networks need to work all the time,” said Christopher P. Simkins, a former Justice Department lawyer whose firm advises companies, including military contractors, on insider threats.

The director of the N.S.A., Gen. Keith B. Alexander, acknowledged the problem in a television interview on Sunday and said his agency would institute “a two-man rule” that would limit the ability of each of its 1,000 system administrators to gain unfettered access to the entire system. The rule, which would require a second check on each attempt to access sensitive information, is already in place in some intelligence agencies. It is a concept borrowed from the field of cryptography, where, in effect, two sets of keys are required to unlock a safe.
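The "second check on each attempt" the general describes can be reduced to a simple authorization gate. The sketch below is purely illustrative (it is not the NSA's actual mechanism, and the names are hypothetical): an access request succeeds only when at least one person other than the requester has signed off, so no administrator acts alone.

```python
def authorize(requester: str, approvals: list[str]) -> bool:
    """Two-man rule sketch: require a second, independent sign-off.

    The requester's own approval never counts, so access needs at least
    one distinct co-signer -- two people involved in every attempt.
    """
    return any(approver != requester for approver in approvals)

# A lone administrator cannot unlock sensitive data by themselves:
authorize("admin1", ["admin1"])   # denied: self-approval doesn't count
authorize("admin1", ["admin2"])   # granted: a second person signed off
```

In practice such a check would sit in front of the credential store or key-management service, mirroring the two-keys-to-one-safe idea from cryptography that the article mentions.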

From government agencies to corporate America, there is a renewed emphasis on thwarting the rogue I.T. employee. Such in-house breaches are relatively rare, but the N.S.A. leaks have prompted assessments of the best precautions businesses and government can take, from added checks and balances to increased scrutiny during hiring.

“The scariest threat is the systems administrator,” said Eric Chiu, president of Hytrust, a computer security company. “The system administrator has godlike access to systems they manage.”

Asked Sunday about General Alexander’s two-man rule, Dale W. Meyerrose, a former chief information officer for the director of national intelligence, said, “I think what he’s doing is reasonable.”

“There are all kinds of things in life that have two-man rules,” added Mr. Meyerrose, who now runs a business consulting firm. “We’ve had a two-man rule ever since we had nuclear weapons. And when somebody repairs an airplane, an engineer has to check it.”

John R. Schindler, a former N.S.A. counterintelligence officer who now teaches at the Naval War College, agreed that the “buddy system” would help. “But I just don’t see it as a particularly good long-term solution,” he said.


“Wouldn’t it be easier to scrub all your I.T.’s for security issues,” he asked, “and see if there is another Snowden?”

The two-man rule “has existed in other areas of the intelligence community for certain exceptionally sensitive programs where high risk was involved,” he said, “but it’s not a standard procedure.”

Mr. Meyerrose and Mr. Schindler both said that software monitoring systems can also help, though they can be evaded by a knowledgeable systems administrator. The biggest issue for government and industry, they said, is to vet the I.T. candidates more carefully and to watch for any signs of disillusionment after they are hired.

“It’s really a personal reliability issue,” Mr. Meyerrose said.

Insiders of all types going rogue have become a problem for the government and industry over the last decade. One of the most prominent is Pfc. Bradley Manning, who downloaded a vast archive of American military and diplomatic materials from his post in Iraq and gave it to WikiLeaks. But there have been others, including scientists and software developers who stole secrets from American companies where they worked and provided them to China.

Now the spotlight is on the system administrators, who are often the technology workers with the most intimate knowledge of what is moving through their employers’ computer networks.

Beyond their store of technical manuals to keep the system running, administrators at intelligence agencies can have access to specific top secret programs without necessarily being cleared for them, as other intelligence agents must be.

If they can get into one part of the network with credentials for what is called “root access,” they can get into almost everything else. They are known as the “super user.”

Since 9/11, the vast majority of I.T. experts in the intelligence world have worked for private contractors, and the Snowden case has set off a new debate about whether the government could have more control of the workers if they were direct employees.

“This is a dirty little secret that’s being revealed,” said Robert Bigman, a former chief information security officer at the Central Intelligence Agency. “When you log on with a root account, it doesn’t know if you’re a staff employee or a contract employee. It just knows you’re root. You’re known as a superuser. You have all privileges.”

At a New Jersey pharmaceutical firm in early 2011, a former I.T. administrator gained access to the company’s system, deleted several files — including those that tracked customer orders — and froze the firm’s operations for several days, causing $800,000 in damages. Prosecutors called it a revenge attack after the company, Shionogi, announced layoffs. The administrator, Jason Cornish, pleaded guilty in August 2011.

And in 2008, a network administrator for the city of San Francisco named Terry Childs found out that he was about to be laid off and modified the city’s network in such a way that only he held the password. He refused to hand it over for 12 days, effectively disabling everything from the mayor’s e-mail to the city’s payroll records.

Reuters has reported that Mr. Snowden had made many posts anonymously on an online forum, including one in 2010 in which he seemed critical of technology companies cooperating with government surveillance programs.

Mr. Schindler, the former N.S.A. counterintelligence officer, said that while a person’s political views are not considered in terms of security clearances, the reviews may need to be expanded to include Twitter posts and other online comments that could yield clues to a job candidate’s thinking.

He said the N.S.A. could also do what Soviet officials did after one of their cipher clerks defected in 1945.

“Their response wasn’t to crack down on code clerks, but to make them happier,” Mr. Schindler said. “They boosted their pay and gave them more reasonable hours, and they got no-knock privileges with the boss to keep them happy.”

Mr. Simkins, the former Justice Department counterespionage lawyer, said that it is “more difficult than it sounds” to address threats posed by rogue insiders.

“At the end of the day, there’s no way to stop an insider if the insider is intent on doing something wrong,” he said. “It’s all about mitigating.”


U.S. worried about security of files Snowden is thought to have

Washington Post

By Ellen Nakashima and Greg Miller, Published: June 24

The ability of contractor-turned-fugitive Edward Snowden to evade arrest is raising new concerns among U.S. officials about the security of top-secret documents he is believed to have in his possession — and about the possibility that he could willingly share them with those who assist his escape.

It’s unclear whether officials in Hong Kong or in Russia, where Snowden fled over the weekend, obtained any of the classified material. A spokesman for the anti-secrecy group WikiLeaks, which has been assisting the former National Security Agency contractor, strenuously denied reports that foreign governments had made copies of the documents.

“This rumor that is being spread is a fabrication and just plays into the propaganda by the administration here that somehow Mr. Snowden is cooperating with Russian or Chinese authorities,” spokesman Kristinn Hrafnsson said in a phone interview Monday.

Nonetheless, in 2010 and 2011, WikiLeaks published hundreds of thousands of sensitive U.S. documents it obtained from Army Pfc. Bradley Manning, and co-founder Julian Assange suggested in a teleconference call with reporters Monday that the group was interested in gaining access to the documents Snowden had obtained.

“In relation to publishing such material, of course WikiLeaks is in the business of publishing documents that are supposed to be suppressed,” Assange said. He declined to say whether Snowden had shared any of the material.

The NSA has teams of analysts scouring systems that they think Snowden may have accessed, officials said. Analysts are seeking to retrace his steps online and to assemble a catalogue of the material he may have taken.

“They think he copied so much stuff — that almost everything that place does, he has,” said one former government official, referring to the NSA, where Snowden worked as a contractor for Booz Allen Hamilton while in the NSA’s Hawaii facility. “Everyone’s nervous about what the next thing will be, what will be exposed.”

Glenn Greenwald, the Guardian columnist who has published a series of stories based on documents provided by Snowden, said he has exercised discretion in choosing what to disclose. Snowden, too, has said he was selective about what he turned over.

“I know that he has in his possession thousands of documents, which, if published, would impose crippling damage on the United States’ surveillance capabilities and systems around the world,” Greenwald told CNN. “He has never done any of that.”

The Guardian, Greenwald said on “Meet the Press” on Sunday, has withheld “the majority of things that he gave us pursuant not only to his instruction, but to our duty as journalists.”

Over the past several weeks, The Washington Post and the Guardian have published articles and portions of documents that describe two major surveillance programs. One, called PRISM, deals with the interception of e-mail and other Internet content of foreign terrorism suspects thought to be located overseas. The other involves the amassing of a database of Americans’ phone call records — numbers dialed and received, length of call, but no content — which can be searched for a specific phone number when there is “reasonable, articulable” suspicion of a terrorist plot or activity associated with the number.

A former senior U.S. official said that the material that has leaked publicly would be of limited use to China or Russia but that if Snowden also stole files that outline U.S. cyber-penetration efforts, the damage of any disclosure would be multiplied. The official, like others in this article, spoke on the condition of anonymity because the official wasn’t authorized to discuss the matters on the record.

U.S. officials said their assumption is that China and Russia have copied the materials that Snowden took from classified U.S. networks but that they have no way to confirm those countries had done so.

“That stuff is gone,” said a former senior U.S. intelligence official who served in Russia. “I guarantee the Chinese intelligence service got their hands on that right away. If they imaged the hard drives and then returned them to him, well, then the Russians have that stuff now.”

Hrafnsson said such assertions are unfounded. In his call with reporters, Assange insisted that neither the Chinese nor the Russians had debriefed Snowden during his stops in their territories.

Asked about the security of the trove of material thought to be on Snowden’s laptop and the possibility that it could fall into the wrong hands, Assange said: “Mr. Snowden’s material has been secured by the relevant journalist organizations prior to travel.” Asked if he could elaborate, he said, “I’m afraid I cannot.”

The damage assessment being conducted by U.S. officials is expected to take “a few months, at best,” said a senior intelligence official. “We’re looking for all of the information that was disclosed, and assessing the damage it may have caused in terms of national security sources,” he said.

A second senior intelligence official said there were concerns that disclosure of U.S. surveillance methods would make it easier for terrorist groups to avoid detection. “The more material that gets made public the more capability we lose,” the official said.

Already, several terrorist groups in various regions of the world have begun to change their method of communication based on disclosures of surveillance programs in the media, the official said. He would not elaborate on the communication modes.

“It’s frustrating,” he said. “Because if they find some other method to communicate, we go dark. And we miss dots. That’s not something we’re particularly excited about.”

Anthony Faiola in London and Philip Rucker and Julie Tate in Washington contributed to this report.


How Edward Snowden’s encrypted insurance file might work


By Zachary M. Seward @zseward June 26, 2013

Now we have a bit more clarity on what Edward Snowden meant last week when he said, “The US government is not going to be able to cover this up by jailing or murdering me. Truth is coming, and it cannot be stopped.”

It turns out Snowden has given copies of the files he purloined from the US National Security Agency, his former employer, to “many different people around the world,” according to Glenn Greenwald, the Guardian journalist who first published some of the materials provided by Snowden. But the files are encrypted, so the people who have the documents can’t read them. “If anything happens at all to Edward Snowden, he told me he has arranged for them to get access to the full archives,” Greenwald told the Daily Beast.

How might that work? Snowden could be using any of a number of complicated cryptographic gambits.

Cryptography is a gatekeeper. It allows us to check our bank accounts, sign into email, and browse Facebook without worrying that any of that data can be intercepted by others (the NSA surveillance revealed by Snowden notwithstanding).

The simplest form of cryptography that Snowden and his allies could be employing would involve one person keeping an encrypted copy of the files and someone else holding the key necessary to decrypt it. But that method is vulnerable, relying on the trustworthiness of the person who has the key, and it doesn’t sound like what Snowden has done.

More complex, more secure, and more interesting would be a form of “secret sharing.” Essentially, the files can only be unlocked if each member of a group shares his portion of the encrypted information; or, alternatively, if several people are given encrypted portions and a combination of, say, any three of them is sufficient to unlock the files. For instance, this can be illustrated by representing the data shared by three people as intersecting planes; the point where they intersect represents the secret that is unlocked when they’re shared.
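The first, all-or-nothing variant (every holder must contribute their portion) can be sketched with nothing more than XOR and random pads. This is an illustration of the concept only, not Snowden's actual arrangement; the "any three of several" threshold variant would instead require a polynomial scheme such as Shamir's secret sharing.

```python
import secrets
from functools import reduce

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def split_secret(secret: bytes, n: int) -> list[bytes]:
    """Split `secret` into n shares; ALL n are required to reconstruct it."""
    pads = [secrets.token_bytes(len(secret)) for _ in range(n - 1)]
    final = reduce(xor_bytes, pads, secret)  # secret XORed with every pad
    return pads + [final]

def recover_secret(shares: list[bytes]) -> bytes:
    """XOR all shares back together; the random pads cancel out."""
    return reduce(xor_bytes, shares)

key = b"decryption-key"
shares = split_secret(key, 3)
assert recover_secret(shares) == key
# Any subset smaller than all three shares is indistinguishable from
# random bytes, so no single holder learns anything about the key.
```

Distributing such shares to different people in different jurisdictions is what makes the arrangement an "insurance file": the encrypted archive is useless until the holders pool their portions.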

It’s obviously more complicated than that, and there are other methods Snowden could be using. How well he knows advanced cryptography also isn’t clear, but he’s receiving assistance from Wikileaks, which is skilled in it. Wikileaks has also used its own form of “insurance files” in the past.

Snowden remains in diplomatic limbo in the transit area of Sheremetyevo airport in Moscow. As to what his insurance file contains, beyond what has already been revealed by the Guardian, Washington Post, and South China Morning Post, we have yet to find out; Greenwald told the Daily Beast that just the non-encrypted material Snowden gave him consists of “thousands of documents.”


Dozens of Security Clearance Reports Falsified

Office of Personnel Management IG Investigates Top Contractor

By Eric Chabrow, June 24, 2013. Follow Eric @GovInfoSecurity

The federal government has identified dozens of cases of alleged falsification of reports submitted by investigators examining individuals being considered for security clearances.

In testimony at a Senate hearing June 20, the inspector general of the Office of Personnel Management said there are 20 cases in which investigators – federal employees and contractors – were either found guilty or were about to plead guilty to falsifying security clearance reports. The office also is investigating dozens more cases, he said.

The hearing was called in the wake of revelations that former National Security Agency systems administrator Edward Snowden, who worked as a contractor for Booz Allen Hamilton, leaked information about top secret intelligence-gathering programs.

In a criminal complaint dated June 14, federal authorities have charged Snowden with theft of government property, unauthorized communications of national defense information and willful communications of classified communications intelligence information to an unauthorized person. News reports on June 24 say Snowden left Hong Kong over the weekend for Moscow and could be heading to Ecuador, where he would seek political asylum.

At that joint hearing of two subcommittees of the Homeland Security and Governmental Affairs Committee, OPM Inspector General Patrick McFarland confirmed the IG is conducting a criminal investigation of USIS, the largest contractor that conducts security-clearance investigations for the government.

Sen. Claire McCaskill, the Missouri Democrat who co-chaired the hearing, said the government knows Snowden received his clearance from USIS. “We know that their investigation encompasses the time he received his clearance,” said McCaskill, chair of the Subcommittee on Financial & Contracting Oversight. “We need answers.”

McFarland declined to provide details at the hearing on the IG’s investigation of USIS.

USIS Unaware of Criminal Probe

In a statement issued after the hearing, USIS said it was never informed by the inspector general that the company was under a criminal investigation, adding that it complied with a January 2012 IG subpoena for records and has cooperated fully with the government’s civil investigation.

USIS said in the statement: “Questions were raised as to whether USIS had conducted the initial background investigation, or a periodic reinvestigation, for the security clearance of Edward Snowden. USIS conducts thousands of background investigations annually for OPM and other government agencies. These investigations are confidential, and USIS does not comment on them.”

Susan Ruge, the IG’s associate counsel, told the Washington Post that USIS’s review of Snowden’s security clearance ended in April 2011 and that the IG began its probe of the firm later that year.

No Margin for Error

As of last year, about 3.5 million federal employees and 1.1 million government contractors held top security clearances. “Given the increasing number of folks with access to that information, we have a real problem on our hands if we can’t get this right,” said Sen. Jon Tester, D-Mont., who co-chaired the hearing with McCaskill. “There is no margin for error.”

Most individuals holding top security clearances don’t have the type of access to computer files that Snowden had. For example, military pilots who transport sensitive materials must gain security clearances.

Vetting individuals for top-secret security clearance is costly, topping $4,000 for each instance, according to the Government Accountability Office. A statement issued by McCaskill and Tester said OPM’s security clearance and background investigations cost the federal government about $1 billion in 2012; that annual cost is expected to rise to $1.2 billion by 2014. They said OPM spends 46 percent of its funds on the contractors who perform investigations, adding that about 75 percent of all field investigators are contractors, an estimated 4,600 out of 6,200 in total.

Falsifications Uncovered

McFarland, in his testimony, told the panel that the IG office had uncovered a number of falsifications of security-clearance investigation reports by federal employees and contractors who reported interviews that never occurred, recorded answers to questions that were never asked and documented records checks that were never conducted.

Since 2006, McFarland said, 18 background investigators and record searchers have been convicted of falsifying records, with a 19th investigator pleading guilty last month and a 20th investigator expected to plead guilty this month. The IG said his office is engaged in fabrication cases against nine other background investigators, with another 36 cases pending.

Of the 18 investigators convicted, 11 were federal employees and seven were contractors. Because of a lack of federal funding, McFarland said the IG’s priority is to investigate federal employees, so these 18 cases do not necessarily reflect the rate of falsification based on employers. “If that background investigation is not thorough, accurate and reliable, then all other decisions made related to the issuance of the security clearance are suspect,” he said.

The decision to grant a security clearance is made by government-employed adjudicators and not the background investigators.

Antiquated System Hindering Process

The way the federal government investigates security-clearance candidates makes the process ripe for possible falsification by some investigators, says Evan Lesser, who has closely followed the government’s security-clearance apparatus since co-founding an employment site in 2002.

Lesser, the site’s managing director, characterizes investigators more as data collectors because they’re required to stay close to an OPM script they’re given in their questioning. He says investigators go into the field with pencil and paper rather than mobile computers or smartphones. The responses they receive are entered into an ancient DOS-based computer system, he says.

“If you look at some of the cases that have been brought against investigators who had falsified information, you often heard that they were under the gun and moving quickly in trying to get cases completed,” Lesser tells Information Security Media Group.

Background investigators are inundated with new guidance and regulations, he says. “Like any group of people, you’ve got your bad apples. They’re definitely a good group of people, no more or less patriotic than any other government employee or contractor. But their situation is somewhat unique by the fact that they do deal with antiquated technology, they do have high work loads and they’re pressured to get things done quickly.”

Age a Possible Factor

Lesser says the 30-year-old Snowden’s age may have played a significant role in his receiving top-secret clearance.

“The younger the clearance applicant, the less financial history they have, the less foreign travel and foreign connections they have,” Lesser says. “For older people, there’s a lot more data. For younger people, there’s just not a whole lot out there.

“It’s entirely possible that the USIS investigator who talked to Edward Snowden did his or her job 100 percent perfectly and by the book [but] they weren’t able to get a whole lot of information about and from Mr. Snowden.”


U.S. energy companies seen at risk from cyber attacks: CFR report

WASHINGTON | Wed Jun 26, 2013 12:11am EDT


(Reuters) – U.S. oil and natural gas operations are increasingly vulnerable to cyber attacks that can harm the competitiveness of energy companies or lead to costly outages at pipelines, refineries or drilling platforms, a report said on Wednesday.

The energy business, including oil and gas producers, was hit by more targeted malware attacks from April to September last year than any other industry, said the Council on Foreign Relations (CFR) report, citing data from a Houston-based security company, Alert Logic.

Cyber attacks on energy companies, which are increasing in frequency and sophistication, take two main forms, the CFR report said. The first kind, cyber espionage, is carried out by foreign intelligence and defense agencies, organized crime, or freelance hackers.

These parties covertly capture sensitive corporate data or communications with the goal of gathering commercial or national security intelligence. U.S. energy companies are subject to frequent and often successful attempts by competitors and foreign governments to access long-term strategic plans, bids tendered for new drilling acreage, talks with foreign officials and other trade secrets, the report said.

A campaign against U.S. energy companies by hackers based in China, called Night Dragon by McAfee, a leading security company that is part of Intel Corp, began in 2008 and lasted into 2011. The campaign stole gigabytes of material, including bidding data in advance of a lease auction. One unidentified energy company official believes his company lost a bid in a lease auction because of the attack, the CFR report said.

Many companies are either unaware of similar attacks or are afraid to disclose them for fear of upsetting investors, it said.

“That’s too bad because it makes it harder for Washington to help them and it also makes it harder for the public to be aware of what threats are out there,” said Blake Clayton, a fellow in energy and national security at CFR and a co-author of the report.

The second main cyber risk to energy companies is the disruption of critical businesses or physical operations through attacks on networks.

“This has a lower probability but potentially higher cost,” said Clayton.

The Stuxnet virus, said to have been created by the United States and Israel to attack Iran’s nuclear program, is an example of a campaign that ended up escaping from its intended target at the risk of causing harm to a U.S. company. Chevron Corp said late last year it had been infected by Stuxnet, but said without elaborating the virus was quickly controlled.

An attack dubbed Shamoon last year on Saudi Aramco, Riyadh’s state oil company, ultimately disabled some 30,000 computers. The company said the attack was aimed at stopping oil and gas output at the biggest OPEC crude exporter.

Oil production was apparently unaffected, but damage could have been more severe had the attack penetrated further into the network, the report said.

Hackers from a group called “Cutting Sword of Justice,” suspected to be insiders, claimed responsibility for the attack, which was believed to have been delivered using a USB drive.


(Reporting by Timothy Gardner; Editing by Matt Driskill)


Drones evolve into a new tool for ag


Laurie Bedord 05/10/2013 @ 7:21am

Loss of pilots over hostile territory fueled the birth of the drone in 1918. Nearly a century later, the word still carries a negative connotation.

“I think the first thing you imagine when you hear the word drone is this image of a predator operating in Afghanistan with an assault weapon or missile strapped underneath,” says Rory Paul, of Volt Aerial Robotics.

Yet, these devices, which Paul refers to as unmanned aerial systems (UAS), have the potential to be more than just spies in the sky. A passionate advocate for the use of this technology in agriculture, Paul has been working to change that image from foe to friend.

“UAS are something more intelligent than just a target drone that is shot down by antiaircraft artillery,” he notes.

Though he’s been working to introduce the concept to agriculture since 2006, it’s only recently that the tides have turned.

“It is an area that has been ignored, but there’s a change in the focus that has happened in the last few months,” says Paul. “The big defense players are slowly turning their eyes toward the potential these devices hold for ag applications.”

The greatest possibilities, he believes, are in aerial imagery and data acquisition. He also thinks size won’t matter when reaping the benefits. “These systems will be operated by small and large farmers to acquire data when they want it,” he says.

Agent for change             

Despite the potential value UAS bring to agriculture, there are still challenges to navigate. Their use falls under the jurisdiction of the Federal Aviation Administration (FAA), which is developing rules and a plan to allow commercial use of UAS in national airspace by 2015; such use is currently strictly prohibited.

As both a full-scale, instrument-rated private pilot and a model airplane enthusiast, Roger Brining has been flying model remote-controlled (RC) aircraft for recreational use under the rules and safety guidelines of the Academy of Model Aeronautics (AMA) since the 1970s.   

“Model aircraft have successfully and safely coexisted with full-scale planes for years,” he says. “The FAA has worked with the AMA for decades to ensure that safety is maintained. The catch is that all of these guidelines and safety programs specifically exclude any commercial use of RC aircraft. Once we get into a company charging a farmer to take aerial imagery, this becomes a commercial use.”

“There is no question there will be FAA regulations on what can and cannot be done with sUAS,” says Brining, who farms in Great Bend, Kansas. “This is a must for their safe coexistence with manned aircraft.”

Gathering information              

Universities also see this tool as an ally for many in the industry.

“UAS can reduce equipment wear and tear as well as labor and fuel costs to get highly precise data you wouldn’t necessarily be able to gather going through the field at ground level,” says Kevin Price, professor of agronomy and geography at Kansas State University.

For over a year, Price, along with Deon van der Merwe, an associate professor at Kansas State’s college of veterinary medicine, has been collaborating to explore how unmanned technology can play a role in ag missions.

They have uncovered a wide range of uses with the help of two units: a RiteWing Zephyr II and a DJI S800 Spreading Wings hexacopter.

For example, they’re working with professors who do crop breeding with literally thousands of plant crosses. Seeds are planted in patches, and larger fields are filled with thousands of patches.

“Professors walk the field looking at each patch and its phenology, which is the way the plant looks – its height and shape,” Price says. This information is then used to estimate yields.

“Every patch is harvested, and seeds of the different varieties are weighed,” he continues. “It can take up to 1,500 hours of labor to get one phenotype, which is a bottleneck for moving the genetic breeding program along.”

To speed up the process, he looked to a spectroradiometer, which measures the energy coming off the plants in thousands of wavelengths.

“With two wavelengths – red and near-infrared – we can explain over 80% of the variability in yields on these thousands of phenotypes. If we can take those two wavelengths and build them into a camera, we can fly a field, take an image, and project the yield on every plot in minutes. We can ignore the bad plots and not have to collect that data. It’s going to save millions of dollars in research time.”
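
Price doesn’t name the index in the article, but the standard way to combine red and near-infrared reflectance is the normalized difference vegetation index (NDVI). A minimal sketch of the plot-screening idea, using hypothetical per-plot reflectance values, might look like this:

```python
import numpy as np

# Hypothetical mean reflectances for four breeding plots (one value per plot).
red = np.array([0.08, 0.12, 0.30, 0.10])   # red band
nir = np.array([0.55, 0.48, 0.32, 0.60])   # near-infrared band

# NDVI: healthy vegetation reflects strongly in near-infrared and absorbs
# red, so vigorous plots score close to 1 and poor plots near 0.
ndvi = (nir - red) / (nir + red)

# Flag plots worth harvesting; skip the weak ones so no labor is spent
# collecting their data.
keep = ndvi >= 0.4
print(ndvi.round(3))
```

Here the third plot scores near zero and would be dropped, which is the “ignore the bad plots” shortcut Price describes.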

Beyond the crop, he can see UAS counting cattle, checking for water in the pond, or determining if blue-green algae that can kill livestock is present.

Other jobs that once took hours, if not days, are reduced to minutes. “We mapped an area of about 640 acres in 18 minutes,” he says.

The camera system they have, which is a Canon S100 converted to color infrared, takes a picture every four seconds.

“This provides us a lot of coverage, and the more coverage, the better,” he says.
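
The back-of-the-envelope numbers are easy to check: an 18-minute flight with a shot every four seconds yields a few hundred overlapping frames. (The figures below are simply the ones quoted in the article, not a specification.)

```python
flight_minutes = 18      # time quoted to map roughly 640 acres (one square mile)
shot_interval_s = 4      # the converted camera fires every four seconds

# Number of frames the mosaic software will have to stitch together.
images = flight_minutes * 60 // shot_interval_s

# Rough coverage rate for the flight.
acres_per_minute = 640 / flight_minutes

print(images)                        # 270
print(round(acres_per_minute, 1))    # 35.6
```

That works out to about 270 images per section, which matches the “hundreds of images” fed into the mosaicking step described next.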

Agisoft, a Russian photogrammetry program, stitches together the hundreds of images taken to create a mosaic. All of the fields over a 640-acre area are then pulled together into one large image at 1-inch resolution.

“I’m looking at individual plant leaves now,” Price says. “Images are going to get even better in the near future with the new camera systems coming out.”

Rugged business

Finding a device rugged enough to take some abuse without causing people to lose valuable equipment like cameras is another area his team is looking at.

The RiteWing Zephyr II is made of expanded polypropylene, which is high-grade engineering foam that has an elastic nature, allowing it to regain its shape.

“It is rugged and flies stable,” Price notes. “It also has fewer parts, which means less breakage.”

On a typical aircraft, the rudder and movable flaps called ailerons on the rear edge of each wing make a plane turn to the left or right. Moving flaps called elevators on the tail make the nose of the plane go up or down.

“The RiteWing Zephyr II has elevons, which combine the functions of the elevator and the aileron,” Price explains.

Besides fewer moving parts, other advantages include less mass, less cost, less drag, and faster control response.

To date, Price says they have spent about $25,000 on their equipment. However, through trial and error, he believes they could build a unit for less than $2,000.

“We tell farmers to expect to pay around $5,000, because they have to buy a radio, which is about $400,” he notes.

Taking flight

As the new owner of a RiteWing Zephyr XL (similar to the Zephyr II but with an 81-inch wing span), Brining has tried several forms of aerial imagery in the past.  

“I have used custom-flown, traditional aerial imagery,” he says. “The resolution was poor. The costs were extremely high (around $3.50 per acre per flight), and they didn’t get the flights accomplished in a timely enough fashion to meet my objectives.”

What intrigued him about this technology is its incredible flexibility, speed, and low cost for flights combined with a very high-resolution final product.

“I think the new system will let me make tactical decisions, which have all been done strictly based on ground scouting and sampling,” Brining adds.

He estimates the entire system will cost $5,000 to $7,000.

In the first year, he wants to get the system operational, learn to process images, and use the flights as a tool so his agronomists can make better use of their time by scouting the right parts of the field.

“I think it will also be extremely helpful in locating leaks in my subsurface drip-irrigation systems,” he adds.

Crunching the numbers

According to The Economic Impact of Unmanned Aircraft Systems Integration in the United States, precision agriculture is one of the markets with the largest potential for this technology. The document, which was released by the Association for Unmanned Vehicle Systems International (AUVSI), estimates that in just two years, the economic impact of ag-related spending in all 50 states will be nearly $2.1 billion, creating more than 21,500 jobs.

“UAS are tools to apply precision agriculture techniques,” says Chris Mailey, vice president of Knowledge Resources for AUVSI. “They are a key piece in the future of precision agriculture.”

However, he cautions, realizing those gains will be tied to certain factors.

“State incentives, like a tax incentive or a test site, may bring more jobs to one state vs. another,” he notes. “Legislation – positively and negatively – may affect the adoption of these devices in the short term. I believe that once any technology – no matter what it is – is proven out, it is much easier to get larger adoption.”

Every year that sales of the technology are delayed, the report says, the U.S. stands to lose $10 billion.

Safety first

With more than 18,000 airports, 750 air traffic control facilities, and 4,500 air navigation facilities, national airspace is a huge system to manage. There are more than 238,000 general aviation aircraft in the system at any time. Maintaining safety in the sky is the FAA’s top mission.  

Mailey says it should be the mission of the ag industry, as well.

“Safety is going to be paramount,” he says. “UAS are just like a tractor or a truck. If you take your eyes off of it or do something not very smart, it can be a very, very dangerous device.”

The hybrid Samsung Ativ Q laptop has Windows 8 and Android Jelly Bean (hands-on)

CNET Editors’ Take

June 20, 2013 12:00 PM PDT

LONDON — It’s a hybrid device, transforming from a Windows 8 slate into a full-fledged laptop in one quick motion. That’s not all, though — at the tap of an icon it’ll boot into Android Jelly Bean, letting you swipe around all of your favorite apps from the Google Play store.

It’s packing the latest Intel Haswell Core i5 processor, with a 13-inch display boasting a whopping 3,200×1,800-pixel resolution. Samsung has yet to announce pricing or availability, but make sure to keep checking CNET for all the latest news.

The Ativ Q’s standard form is a rather chunky 13-inch slate. Lift the display up at the back, though, and it reveals a keyboard underneath. The screen folds up and locks into place behind the keys, allowing you to type and swipe in much the same way you would on any other touch-screen laptop.

It’s the same converting technique we saw on Sony’s Vaio Duo, but Samsung has taken it a step further. Rather than converting just between laptop and slate, the screen can lie flat, lifted above and parallel to the keyboard. I’m not entirely sure why you’d want to keep it in that position — rather than have it lying flat in tablet mode — but I guess it’s useful to have the option.

More helpful though is the option to flip the screen all the way over, turning it, essentially, into a tablet on a stand. The screen automatically rotates as you turn it over. Samsung reckons this mode is best to use for presentations in meetings or just for watching video when you don’t want the keyboard exposed.

Samsung also includes its S Pen stylus for handwritten notes or doodling when you’re bored. It’s the same smaller variety you’ll get on the Galaxy Note 10.1, meaning it’s not particularly comfortable to hold, but does at least slot neatly into the base.

The converting motion seemed fairly smooth in my brief hands-on time, although I worry that over time the small hinges might not be able to put up with much punishment. That’s particularly important, as the rear stand is actually where the CPU is housed. While that’s an interesting space-saving design, it potentially puts the delicate components at more risk from knocks and bumps. We’ll give it a full stress test when we get it in for review.

Samsung’s Ativ-apalooza

It’s a 13-inch machine with physical proportions roughly the same as any other 13-inch ultrabook’s. It’s 14 millimeters thick, which is satisfyingly skinny; it’s small enough to slide easily into most backpacks; and at around 1.3kg (2.8 pounds), you won’t struggle to carry it around for long periods either.

The keyboard’s keys are quite small, so it might not suit those of you with hands the size of continents. There’s no touch pad like you’d find on a normal laptop either. You’ll have to make do with the little trackpoint in the middle of the keys, or just use the touch screen.

The 13.3-inch display boasts an astonishing 3,200×1,800-pixel resolution, which is the highest pixel count we’ve seen on a 13-inch machine. It gives it a pixel density of 275 pixels per inch — better than the 9.7-inch Retina iPad’s 264ppi, and a massive amount for such a big display.
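
The density figure follows directly from the Pythagorean diagonal of the pixel grid; a quick sanity check (the slight gap from the quoted 275ppi presumably comes from rounding of the panel’s exact diagonal):

```python
import math

width_px, height_px = 3200, 1800   # Ativ Q panel resolution
diagonal_in = 13.3                 # quoted screen size in inches

# Pixels along the diagonal, then pixels per inch of that diagonal.
diagonal_px = math.hypot(width_px, height_px)
ppi = diagonal_px / diagonal_in

print(round(ppi))   # 276
```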

Unsurprisingly then, everything looked absolutely pin-sharp. It’s bright, too, and colors seemed rich and vivid. I wasn’t able to spend a lot of time with the screen, so I’ll leave my final judgment for the proper review, but suffice to say I’m excited to see my own high-resolution photos on it.

Windows 8 and Android Jelly Bean

Like James Bond’s gadget guru, Samsung’s own Q has a trick up its sleeve. It’s first and foremost a Windows 8 laptop, but at the tap of a tile, it can boot into Android Jelly Bean. At first glance this might seem a bit of a gimmick, but when you take a moment to think about it, it actually makes a lot of sense.

If you’re struggling to decide whether to splash your cash on a Windows laptop, a Windows tablet, or an Android slate, Samsung might have just the solution in the Ativ Q.

You can work in Windows 8 as on a regular laptop, taking advantage of the familiar Windows layout and Office tools. When the busywork is done, switch over to Android to play touch-screen games and casually swipe around any of the hundreds of thousands of apps from the Google Play store. Even if that doesn’t appeal, the sheer number of additional Android apps goes a fair distance toward making up for the lack of good titles in the Windows Marketplace.

The Android portion runs on the same Intel Core i5 processor as the Windows portion, so performance should be satisfyingly zippy. The chip is the latest Haswell silicon from Intel, which promises better battery life than its predecessor — again, I’ll test this properly in the review.

Switching between the two operating systems is a simple task of hitting a tile on the Windows 8 desktop. The switch takes a couple of seconds at most and thankfully doesn’t require a restart. Going back to Windows is simply a matter of hitting the Windows home key beneath the display. The operating systems also share file folders, meaning anything you save in your gallery in Android will be available in your pictures folder when you return to Windows.

Interestingly, Samsung hasn’t tried to apply any of its usual software additions over the top of Android. Instead, you get the pure vanilla Jelly Bean experience. It’s the full-fat version of Windows onboard, too, rather than the hobbled Windows RT version, which was specifically designed for tablets. That means you’re able to install any normal desktop software.

You can’t blame the Ativ Q if it struggles to understand itself. It’s at once a Windows tablet, an Android tablet, and a Windows laptop. Samsung evidently hopes this combination will be the perfect solution for people who want a device to suit both work and play.

Whether that’s really the case or if it’s actually a confusing hash of products that should remain separate entities remains to be seen. I’m certainly looking forward to finding out which it is, though.


B-1 school sees pilots losing currency during sequestration

by Airman 1st Class Peter Thompson

7th Bomb Wing Public Affairs


6/24/2013 – DYESS AIR FORCE BASE, Texas — A unit responsible for producing elite B-1 Lancer weapons officers has had its budget cut nearly in half due to sequestration.

The 77th Weapons School at Dyess Air Force Base, Texas, took this as a challenge and opportunity to maximize efficiency.

“We have taken notice of the mentality that we need to have at this time,” said Lt. Col. Brian Vlaun, 77th WPS director of operations. “Efficiency is closely linked to a call to find innovative solutions, which applies to us by maximizing the training we have in our syllabus.”

Without funding to supply flying hours for the squadron, instructors at the 77th WPS won’t be able to complete their yearly proficiency requirements, which allow them to fly as instructors, aircraft commanders or mission leads.

“Based on the last time each of our instructors flew, their currencies will last them through the middle of July,” said Lt. Col. Jonathan Creer, 77th WPS commander. “Bottom line is we won’t be able to fly.”

The weapons squadron and the U.S. Air Force Weapons School at Nellis Air Force Base, Nev., were forced to cancel a full six-month course, which normally produces a batch of officers trained in applying multiple weapons platforms and strategies to the battlefield.

When the first class of 2013 was cut short in May, the squadron took a day to evaluate their situation and plan steps they would take to maximize efficiency with their almost non-existent budget.

The weapons school has four main objectives to work toward until they regain funding. Their first and primary objective is to update their syllabus.


Normally, the 77th WPS takes the short amount of time between classes to assess their syllabus and make adjustments as needed. This revision process is now the squadron’s main priority.

The first class of fiscal year 2014 will learn from the enhanced legacy syllabus. The second class of the year will be the first at the 77th WPS to train using the B-1 Bomber’s newest upgrade, Sustainment-Block 16, which is the most advanced hardware and software enhancement to date for the bomber.

The squadron’s second objective is to upgrade their tactics, techniques and procedures manuals.

“We will work with the 337th Test and Evaluations Squadron to develop new manuals that will be used to employ the upgraded aircraft,” Creer said.

Their third goal is to simultaneously provide cadre and instructors to support combat operations, and operational testing and training across the B-1 fleet. Instructors have transitioned their efforts from teaching classes to providing their expertise to 9th Bomb Squadron operators as they prepare for future deployments.

“We have the expertise available to help other squadrons with their academics and training,” Vlaun said. “It is our duty to provide for the B-1 community.”

To meet their fourth and final objective, 77th WPS instructors are prepared to resume producing the Air Force’s best weapons officers, and to regain their currencies, once funding is available.

Looking forward, the 77th WPS’s intent is to create opportunities to grow and learn from their present situation. Their outlook is that even though they have a minimal budget, they are still operational.

“I tell my guys all the time, this is not a throttle back, it’s a change in roles and duties,” Creer said. “We challenge them to do better, exceed their expectations, think critically, receive and give criticism, communicate more effectively and become problem solvers.”


Air Show Turnout is among lowest in history

Lack of military planes, not crash, seen as reason.

Dayton Daily News

By Barrie Barber

June 25, 2013

DAYTON — Attendance at the Vectren Dayton Air Show dropped sharply over the weekend, bringing just 23,000 people through the gates in the lowest turnout in the show’s recent history.

Air Show General Manager Brenda Kerfoot could not say whether the number of spectators was the lowest in the 39 years the show has been in Dayton, but turnout was less than half of last year and less than a third of what it was in 2009 and 2010.

Last year a withering, record-setting heat wave shrank turnout to 47,000 as the Navy’s Blue Angels performed. This year weather wasn’t a big factor, but there was no jet team because of federal budget cuts, and Saturday’s tragic crash may have been a factor in Sunday’s smaller-than-usual crowds.

The show, which often draws 70,000 people or more in a weekend, turned tragic Saturday when stunt performer Jane Wicker, 44, and pilot Charlie Schwenker, 64, were killed after the 1941 Stearman biplane they were in crashed while Wicker was performing her signature aerial wing-walking act near show center at Dayton International Airport.

Kerfoot attributed the sharp attendance decline to the scrubbed appearance of the Air Force Thunderbirds and no other active military aircraft on the grounds or in the sky because of federal budget cuts known as sequestration. The air show filled the lineup with well-known civilian acrobatic aerial acts.

“We really think the low numbers are an effect of sequestration (and) that it had nothing to do with the crash,” she said.

The National Transportation Safety Board has launched an investigation into the biplane accident that could take six months to a year. A preliminary report on the crash is expected this week.

John Cudahy, president of the International Council of Air Shows based in Leesburg, Va., said air shows nationwide have had fewer spectators this season because of the no-show of popular military jet teams and solo aircraft demonstrations.

“I think that the lack of military assets at air shows around the country have been a drain on attendance, not just in Dayton but at a number of shows this year,” he said.

More than 60 air shows with an estimated $400 million economic impact have been canceled because of sequestration, according to Cudahy. The Dayton/Montgomery County Convention & Visitors Bureau has figured the Dayton Air Show has a $3.5 million economic impact with 70,000 visitors.

“The air show is very iconic for our region,” said Jacquelyn Powell, president and CEO of the Convention & Visitors Bureau. “Many people when they think of Dayton, they think of the Dayton Air Show.”

Like Kerfoot, Cudahy does not think the fiery crash before thousands of people contributed to the fall in attendance.

“Historically, that has not been the case,” he said of air show crashes impacting attendance. “Crashes that have happened in the past, when there was still a day left in the air show,” were not a big factor in attendance.

Kerfoot said Dayton Air Show leaders haven’t reached a decision on the size or scope of the 2014 show with the Blue Angels as the headline act.

The Pentagon has not decided whether military jet teams and planes will fly at air shows next year, but the Blue Angels have booked appearances in 35 places in 2014.

“I think that’s a No. 1 factor we need to determine before we determine what kind of show to do,” Kerfoot said. “I think the (Dayton Air Show) Board (of Trustees) and the community support for an air show is very strong. We have a fund balance to weather bad years, but obviously you can’t keep doing that.”

She did not release how much money the air show has on hand to cover potential losses.

Navy spokesman Lt. John Supple said the Blue Angels are eager to fly in 2014. “We’re hoping the show stays on,” he said. “The Blue Angels are itching to get out there and perform.”

Cudahy said his organization has pushed Congress to let military planes return to air shows. “I think the military recognizes they cannot suspend their interaction with the American public indefinitely,” he said.

Air show officials do not know yet if last weekend’s show made or lost money while accounts continue to be tallied, Kerfoot said. The show spent about $1 million this year, or around a third less than normally budgeted, in anticipation of fewer spectators.

Vectren, an Evansville, Ind.-based energy company that has sponsored the show since 2001, and Cincinnati-headquartered supermarket giant Kroger signed three-year deals last year to sponsor the air show through 2015.

“We wanted to be a part of what was at that time a premier event for the Miami Valley and we believe that’s still the case,” Vectren spokeswoman Chase Kelley said Monday. The air show crash and jet team no-shows have not affected the sponsorship, she said.

“Those are the exceptions rather than the rule and we still want to be a partner,” Kelley said.



The U.S. Military Aircraft That Flew in Paris


by Brendan McGarry on June 24, 2013


PARIS — Aviation enthusiasts were quick to spot the few American-made military aircraft that did fly at this year’s Paris Air Show, 17-23 June.

No fixed-wing plane currently operated by the U.S. military took to the skies. Drone-maker General Atomics brought a new Predator B, better known by its Air Force designation, MQ-9 Reaper. But the unmanned vehicle remained grounded.

Two other U.S.-made planes flew: the World War II-era P-38 Lightning fighter and the C-121 Super Constellation transport, both made by the predecessor of Lockheed Martin Corp. But those types of propeller-driven craft completed their final military missions decades ago.

The only aircraft in U.S. service today that flew at the event was an export version of the UH-60 Black Hawk helicopter, made by Sikorsky Aircraft Corp., part of Hartford, Conn.-based United Technologies Corp.

The U.S. fighter fleet, including the F-15, F-16, F-18 and F-35, was entirely absent.

The U.S. drastically scaled back its presence at the world’s biggest international air show, as the Defense Department froze spending on such events amid federal budget cuts. The move allowed European and Russian arms makers to take center stage.

Still, Pentagon officials and U.S. company representatives attended the event to capitalize on upcoming opportunities in locations such as Eastern Europe, the Middle East and the Asia-Pacific region.

United Technologies Corp.’s Sikorsky brought the S-70i to the show to market the chopper to potential international customers, especially Poland. The country next year plans to pick a firm to build as many as 70 combat support helicopters in a potential $3 billion deal that’s among the biggest opportunities on the international rotorcraft market.

United Technologies Corp.’s Sikorsky is competing for the order against AgustaWestland, part of Rome-based Finmeccanica SpA, and Eurocopter, part of Leiden, Netherlands-based European Aeronautic Defence & Space Co.

General Atomics plans to sell an unarmed version of its Predator unmanned system to the United Arab Emirates and other countries in the Middle East as part of a plan to boost international sales, a vice president said.

The drone, called the Predator XP, is equipped with radar and sensors to offer wide-area surveillance but not weapon systems such as laser-guided bombs or air-to-ground missiles, according to Christopher Ames, director of international strategic development for General Atomics Aeronautical Systems Inc., based near San Diego.


The company made an effort to display a new Predator B at the show, Ames said. “I’m told we’re one of the only U.S. companies displaying an actual aircraft,” he said. “We worked hard to make it happen.”

The classic planes were also brought to the show by the private sector. The P-38 is actually the restored White Lightnin’ aircraft owned by Red Bull GmbH, the Austrian company that makes the popular energy drink. The C-121 “Connie” is owned by the luxury Swiss watchmaker Breitling SA.



China’s Fighters, Drone Look Like U.S. Aircraft


by Brendan McGarry on June 20, 2013


PARIS — China’s models of military planes at the Paris Air Show bear a resemblance to U.S. aircraft, drawing attention to rising concern in the Defense Department that the country is using cyber espionage to obtain sensitive defense technology.

The state-run Aviation Industry Corporation of China had a large exhibit of military and civilian models of aircraft at the show, held outside Paris at the historic Le Bourget airfield.

The display included three fighters and a drone: a single-seat version of the FC-1, a single-engine fighter built for the Pakistani air force and designated JF-17; a dual-seat variant of the FC-1 in development; the dual-seat, twin-engine L-15 trainer; and an unmanned system called Wing Loong.

The fighters looked like the F-16 made by Lockheed Martin Corp. and the drone bore resemblance to the MQ-1 Predator made by General Atomics Aeronautical Systems Inc., without the inverted tail. The Chinese drone is designed as a low-altitude craft that can fly up to 16,500 feet and loiter for 20 hours.

A spokesman from the Beijing-based corporation was quick to note that the FC-1 is “a lot cheaper than the F-16,” though he declined to provide a figure. The man gave a brief overview of the systems to Military.com but declined to be named, citing corporation policy.

Notably missing from the exhibit was any display of the J-20, China’s classified stealth-fighter program.

During talks this month at an estate in Rancho Mirage, Calif., President Barack Obama reportedly warned the new Chinese President Xi Jinping that cyber attacks against the U.S. threaten the two countries’ strategic relationship. Xi insisted China is also the victim of computer hacking.

Obama faced pressure to raise the issue after the recent leak of a classified section of a Defense Department report showed that designs for the most advanced U.S. weapons have been compromised by suspected Chinese hackers. The list of weaponry includes the F-35 Joint Strike Fighter, the Littoral Combat Ship, and the Aegis Ballistic Missile Defense System, among others.

The Pentagon in its latest annual assessment of China’s armed forces for the first time blamed China directly for targeting its computer networks. The attacks were focused on extracting information, including sensitive defense technology.

“In 2012, numerous computer systems around the world, including those owned by the U.S. government, continued to be targeted for intrusions, some of which appear to be attributable directly to the Chinese government and military,” it states. “The accesses and skills required for these intrusions are similar to those necessary to conduct computer network attacks.”

That document also concluded that the People’s Liberation Army, or PLA, considers the strategy of “information dominance” a critical form of defense against countries that it views as “information dependent,” such as the U.S.

China called the accusations “groundless” and “not in line with the efforts made by both sides to strengthen mutual trust and cooperation,” according to a May 9 article published on the state-run website, “People’s Daily Online.” The country is a “victim itself of cyberattacks,” it states.

A Chinese espionage group since 2006 has stolen hundreds of terabytes of information from at least 141 companies across 20 major industries, including aerospace and defense, according to a February report from Mandiant, a closely held company based in Alexandria, Va., which sells information-security services.


Where Has All The Nuke Waste Gone?


 James Conca, Contributor

6/24/2013 @ 11:30AM

A funny thing happened on the way to our high-level nuclear waste dump. Most of America’s high-level nuclear waste is no longer high-level. This is a good thing. Unfortunately, on paper it’s still called high level waste. On legal paper, no less. The bar code says it’s high-level waste so that’s what it is, right?

Wrong. Various processes have changed the nature of this waste over the last 50 years. However, various human laws only consider where it came from and what it was a long time ago, not what it is now.

What happens when human law collides with natural law?

First, it always ends up costing us a lot of money. Second, we always take too long to change the human law, which is why it ends up costing us a lot of money.

So, the bar code says it’s high-level waste. What does that mean?

There are four general categories of nuclear waste in the United States (figure below): commercial spent nuclear fuel (SNF), high-level nuclear waste (HLW) from making weapons, transuranic waste (TRU) also from making weapons, and low-level radioactive waste (LLW) from many things like the mining, medical and energy industries. A minor amount of other radioactive wastes are sprinkled among these categories.

Nuclear and radioactive waste comes in four different flavors, defined in the text, that are treated and disposed of in different ways for very different costs. However, most of the high-level waste (HLW) is no longer high-level, a technical distinction, but one that is worth $200 billion.

SNF is the hottest waste, primarily from two isotopes, Cs-137 and Sr-90, both with approximately 30-year half-lives, making the waste high-level for less than 200 years. Similarly for HLW – it’s the Cs-137 and Sr-90 that make it hot, although not so much as SNF. LLW is not very hot at all. TRU waste runs the gamut from low-level to high-level; its classification is determined primarily by the amount of plutonium, while its level of hotness is again determined by the amount of Cs-137 and Sr-90.
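
The “high-level for less than 200 years” claim follows from simple exponential decay. A sketch using the roughly 30-year half-lives cited above:

```python
def fraction_remaining(years, half_life=30.0):
    """Fraction of a radionuclide left after `years`, given its half-life."""
    return 0.5 ** (years / half_life)

# After 200 years (more than six half-lives), less than 1% of the original
# Cs-137/Sr-90 remains, which is why the waste stops being "hot" on that
# timescale.
print(round(fraction_remaining(200), 4))

# After a couple of half-lives (about 60 years), only a quarter is left.
print(fraction_remaining(60))   # 0.25
```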

The details get a little complicated, but HLW refers to waste with high levels of radioactivity that was generated from reprocessing nuclear fuel from weapons reactors to make atomic, and then nuclear, weapons. HLW is defense waste, and much of it is nasty, gooey, watery sludge with the consistency of peanut butter or week-old pudding.  Tricky to handle.

This is very different from commercial spent nuclear fuel (SNF) that has even higher levels of radioactivity but comes from commercial power reactors and is dry and solid. Easy to handle.

TRU waste is a combination of debris, cements and sludge, a real mish-mash of materials that just has enough plutonium in it to call it TRU, but not enough Cs-137 and Sr-90 in it to call it HLW. Not so tricky to handle.

TRU and HLW tank waste are both generated in the reprocessing of spent fuel from a weapons reactor (not a commercial power reactor), but are differentiated by when in that process they were generated (figure below; definitely click on this one to see the detail). HLW is generated in the early steps of the process that remove the fission products, i.e., the pieces left over when the uranium or plutonium nucleus splits, or fissions, particularly the Cs-137 and Sr-90. TRU is generated in the following steps that separate and concentrate the Pu for weapons, and has much less Cs-137 and Sr-90.

The HLW and the TRU sit at various Department of Energy sites around the country in liquid, sludge and solid forms, but most of the HLW is in those famous old humongous tanks up at the Hanford site that hold 57 million gallons of it, recently made more famous by a few leaks. Not any threat to human health and the environment, but enough to get everyone upset (The First Amendment Bombs Nuclear Energy By Accident).

The strange thing is, we long ago removed much of the Cs-137 and Sr-90 from these HLW waste tanks, and the rest has been through a couple of half-lives, so there’s not enough Cs-137 and Sr-90 left to make it HLW anymore, at least in reality, scientifically. Instead, most of it is now TRU waste (see figures). But it’s still legally HLW.
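That “couple of half-lives” point is just exponential decay arithmetic. A minimal sketch: the 30-year half-life figure comes from the text above, while the 60-year elapsed time is a round-number assumption for illustration.

```python
# Fraction of a radioisotope remaining after t years:
#   N/N0 = (1/2) ** (t / half_life)

def fraction_remaining(t_years: float, half_life_years: float) -> float:
    """Return the fraction of the original isotope still present."""
    return 0.5 ** (t_years / half_life_years)

# Cs-137 and Sr-90 both have roughly 30-year half-lives.
# After two half-lives (~60 years, an assumed age for the tank waste):
print(fraction_remaining(60, 30))   # 0.25 -- three-quarters has decayed away
```

With most of the isotopes removed up front and only a quarter of the remainder surviving, the tanks' actual heat falls well below the HLW threshold the article describes.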

High-Level Waste (HLW) and transuranic waste (TRU) in tanks at the Hanford site were originally defined by which step they came from in the reprocessing of weapons fuel to make nuclear weapons. The steps that removed most of the Cs-137 and Sr-90 produced HLW, while the steps that removed most of the Pu produced TRU. However, we have since removed so much Cs-137 and Sr-90 from the HLW, and the rest has decayed away so much, that the HLW tanks now are TRU. Source: the Department of Energy.

What we have here is a classic bureaucratic knot of conflicting definitions that we need to fix in order to stop spending money just treading water.

The President’s Blue Ribbon Commission, which included the new Secretary of Energy Dr. Ernest Moniz, already laid out a plan to correct this mess in a series of well-crafted recommendations (Chris Helman, Forbes; BRC Report to President Obama). First, put SNF in interim storage for decades. This allows it to be separated from the defense HLW. Then the other recommendations can lead us down a reasonable path.

But we have to call the waste what it is. I know that requires a hefty bureaucratic lift since many groups are ideologically wedded to the past. But, come on, please don’t say we can’t even define anything anymore by what it actually is.

The multitude of laws and orders developed over the last 50 years has given us all the language and solutions we need, from the Ronald W. Reagan National Defense Authorization Act to the Nuclear Waste Policy Act, from wording by the House Armed Services Committee to the Nuclear Regulatory Commission’s 10 CFR Part 61, to various DOE Orders. We just have to revisit them and merge them into a coherent set of definitions.

Changing laws and agreements is very difficult these days, but it is still a lot easier and cheaper than ignoring reality and continuing to treat as HLW waste that is no longer high-level. Physically and chemically treating TRU as though it’s HLW is expensive and unnecessary. The difference is about $200 billion, a lot of money to spend on a legal technicality.

This is more than an academic exercise since we’re, you know, broke.


What’s the matter with Microsoft?

Experts search for meaning in three Microsoft U-turns

Gregg Keizer

June 25, 2013 (Computerworld)


Microsoft’s had a tough year, and it’s not even half over.

The software giant has retreated from flubs in licensing, the design of its flagship Windows operating system and most recently, innovations it wanted to bake into its new game console, the Xbox One.

On Wednesday, in fact, the highlight of the BUILD developers conference opening keynote will certainly be Windows 8.1, an upgrade that Microsoft casts as a customer feedback-driven refresh, but that some outsiders see as a reversal, even a repudiation, of its first-pass design.

What’s going on? Is the company’s decision-making suddenly fundamentally flawed? As the PC industry goes through its largest-ever slump, is it so desperate that it’s trying to milk revenue wherever it can by forcing change — even when it knows customers will rebel? Has it taken to hauling up the white flag at the first sign of resistance rather than toughing it out, as the old Microsoft might have?

Companies make mistakes all the time, sometimes crippling ones that drag them under. But if the organization is large enough, robust enough, it survives and learns. Ford weathered the Edsel, Coca-Cola New Coke, Netflix its Qwikster, Apple the 1985 ousting of Steve Jobs, 2010’s Antennagate and last year’s Maps fiasco. But the pace of Microsoft’s missteps and the resulting turnarounds — three in the span of four months — is unusual.


In March, Microsoft retreated from a sweeping change in its licensing for retail copies of Office 2013, giving way after customers complained that they’d be labeled lawbreakers for trying to move the software from one machine to another. In late May, Microsoft revealed some of the changes slated in Windows 8.1, including the restoration of something very close to the iconic Start button. And last week, Microsoft quickly backed off Xbox One plans that would have nixed sales of used games and required the console to “phone home” daily to Redmond’s servers.


Too focused on money?

Industry analysts and other experts had all kinds of answers for the questions raised by Microsoft’s miscalculations. Some saw a company blinded by a desire to squeeze the last dollar out of customers, or one that thought aping Apple would be a winning strategy. Others faulted it for not anticipating what, in hindsight at least, was guaranteed blowback.

“People don’t like revocations of the physical rights they assign to property, even when we’re talking about software licenses, not software ownership,” said Wes Miller, an analyst with Directions on Microsoft, of the Office 2013 and Xbox One used-game errors. “We have an essence of tangibility, a feeling of ownership, when we buy a floppy disk or buy a CD, or even download a file.”

But by restricting that ownership, and doing that suddenly, Microsoft stepped into a morass when it told customers they couldn’t move Office to a different PC or said that Xbox games could not be resold. It violated that feeling of ownership, which customers interpreted as stealing something rightfully theirs.

“People don’t adjust well to change when that change means less rights and freedom than before,” Miller said, using words that could have been spoken by Boston radicals like Samuel Adams in 1774.

Others echoed Miller on the difficulty of changing behavior and Microsoft’s apparent belief that customers would willingly accept change, as evidenced in statements by Microsoft executives that Windows 8 users would quickly grow comfortable with its far-reaching alterations.


Users don’t like change

“It’s very hard to make monumental business changes in this day and age,” said Peter LaMotte, an analyst with Levick, a Washington, D.C.-based strategic communications consultancy. “People are used to certain functionality, certain interfaces, and it’s very difficult to take those things away from them.”

In other words, Microsoft either didn’t view those rights, implied or not, in the same way as did customers — a failure of one kind — or ignored evidence to the contrary, an error of quite another dimension.

Philip Morton, a senior practitioner in gaming at Foolproof, a U.K. user interface (UI) and user experience (UX) design consultancy, wasn’t sure which it was — though he leaned toward the latter — but he was certain Microsoft screwed up on the Xbox One.


People will accept change, Morton said, if it’s clear there are benefits to doing so that outweigh the burden of the change itself. While that may read as obvious, Microsoft either forgot it or, knowing it, plunged ahead anyway.

“Microsoft had a carrot and a stick, but it was all stick and no carrot,” Morton said of the Xbox One plans, which were pitched as a way to simplify sharing games within a family or group of friends, and to make a customer’s game library available from any Xbox console. “Xbox has been successful despite Microsoft, not because of it,” he said. “[With the Xbox One] there was too much Microsoft in the Xbox. Too much of the traditional Microsoft had a say in that decision. They thought more of their business requirements and what the business wants than what the customers want. They didn’t communicate any benefit to the changes, and treated customers like criminals.”

Forcing changes onto customers — not, for instance, giving them an option, as Microsoft has by maintaining traditional “perpetual” licenses even as it pushes Office 365 subscriptions — was the final straw, said Morton.


Blindsided by backlashes

Nor did Microsoft seem to anticipate the backlashes to any of the three changes — another failure, said experts.

“Consumers are more vocal now,” said Miller, citing social media’s amplification of complaints. “In each case, Microsoft made a bold statement, but then had to rescind it after a Twitter outcry.”

LaMotte of Levick concurred. “If you release something with new functionality, consumers are prone to give feedback fast and furious. That’s the benefit and the downside of social media.”

That was especially true in the Xbox One affair because of the unique nature of the gaming community. Gamers are passionate about what they want, identify personally with the software, much more so than, say, users of Windows or Office. And they’re already organized, so to speak, because of the way many network to play online.

They’re also a different demographic group, LaMotte argued, one that likes to complain.

“Gamers love to share their opinion and share their disgust,” LaMotte said. “But Millennials are especially vocal about what they don’t like. It’s almost as if the movement picked up steam just to make Microsoft reverse the decision, no matter what an individual thought. People who grew up in the 60s or 70s, 80s and 90s, they had things to rebel against. Millennials don’t. So they find things to rebel against.”

Patrick Moorhead, principal analyst with Moor Insights & Strategy, saw Microsoft’s moves differently. “Sometimes companies believe that they’re smarter than everyone else,” he said. “Apple made that strategy feasible by being successful without soliciting consumer reaction. I think Microsoft saw that and said, ‘That’s how we can be successful, too.’”


Microsoft isn’t Apple

Moorhead found hints of Microsoft’s mimicry in its relatively recent penchant for secrecy, a change itself from decades of being far more transparent. “Microsoft could have done a better job [in these cases] by asking people beforehand,” Moorhead asserted. “But they’ve become more isolated, more … insular … as it relates to people who they used to get feedback from, like analysts and the press.”


What works for one company, Apple for example, doesn’t necessarily work for another, like Microsoft. “Their initial reaction [to critics] of Windows 8 was that ‘We know better,’” said Moorhead. “There was no admission that they’d made a mistake or flexibility whatsoever.”


That didn’t go down well.

On the Xbox One, at least, Morton thought that Microsoft’s miscue may have stemmed from incorrect assumptions of the market. In its initial presentation of the Xbox One, Microsoft focused on the device’s non-gaming traits, particularly its television viewing features. “They build up a house of cards with an incorrect assumption of who would be the purchaser,” Morton said.

Sony, which portrayed its new console, the PlayStation 4, as the anti-Xbox One, pitched its hardware to gamers, as a game machine, with the hope that others in the family would use it, too. Microsoft’s mistake was taking the opposite tack.


Lessons learned?

Even with the missteps, several of the experts said, there’s evidence that Microsoft has learned lessons. Some encouraged Microsoft not to give up on its long-term strategy, even in the face of the three failures.

“They were shooting for the future,” said Miller, of the original Xbox One and Windows 8 decisions. “And I agree with them. They had to do the changes.” It’s inevitable, he said, that games will go all digital, all served via downloads, and that Microsoft’s Windows 8 shift to emphasize mobile was necessary to stay relevant.

Moorhead believed Microsoft has improved its responses to faux pas, even in the last few months. “I do get a sense recently that Microsoft’s taken a softer tone, and admitted that they didn’t get it right,” said Moorhead, referring to the Xbox One and Windows 8 retreats. “The addition of the Start button [to Windows 8.1] was at least some admission that they’re not perfect.”

But Miller wondered what the reaction to Microsoft’s moves meant in the long term, and not just for the Redmond, Wash. developer. “The world may not be as ready for cloud services as some might want them to be,” Miller said, pointing to Xbox One. “If [Xbox One and Office] are indicative of Microsoft’s longer-term goals, are they achievable? And will consumers follow?”

None of the experts dared predict the exact nature of the future, but pointing to the pain of change, some cautioned other companies to learn from Microsoft’s experiences. “It’s the times in which we live,” said LaMotte. “If you’re going to allow the world to beta test your products, you’d better be ready for the feedback.”




Highlights of Obama’s plan to cut carbon

Washington Post

June 25, 2013

President Obama unveils a broad plan aimed at curbing climate change and its impacts in a speech Tuesday at Georgetown University.

President Obama’s full climate action plan (PDF)


Curbing carbon pollution

• Directs the EPA to establish carbon pollution standards for both new and existing power plants.

• Promises $8 billion in loan guarantees for fossil fuel projects.

• Directs the Interior Department to permit 10 gigawatts of wind and solar projects on public lands by 2020.

• Expands the president’s Better Building Challenge, helping buildings cut waste to become at least 20 percent more energy efficient by 2020.

• Sets a goal to reduce carbon pollution by at least 3 billion metric tons cumulatively by 2030 through efficiency standards set for appliances and federal buildings.

• Commits to developing fuel economy standards for heavy-duty vehicles.

• Aims to reduce hydrofluorocarbons, highly potent greenhouse gases.

• Directs agencies to develop a comprehensive methane strategy.

• Commits to forests and other landscape protection.


Preparing for climate change

• Directs agencies to support local investment to help vulnerable communities become more resilient to the effects of global warming.

• Establishment of flood-risk reduction standards in the Hurricane Sandy-affected region.

• Will work with the health-care industry to create sustainable, resilient hospitals.

• Distribution of science-based information for farmers, ranchers and landowners.

• Establishment of the National Drought Resilience Partnership to make rangelands less vulnerable to catastrophic fires.

• Climate Data Initiative will provide information for state, local and private-sector leaders.


Leading global efforts to address climate change

• Commits to expanding new and existing initiatives, including those with China, India and other major emitting countries.

• Calls for the end of U.S. government support for public financing of new coal-fired power plants overseas.*

• Expands government capacity for planning and response.

*Except for efficient coal plants in the poorest countries, or for plants using carbon capture.



Will Congress Let USAF Abandon the Global Hawk?

Defense News


Jun. 24, 2013

By ARAM ROSTON


The Global Hawk has provided high-altitude, long-endurance ISR for the Air Force since the late 1990s, but the service says it no longer needs the unmanned aircraft.

June is the start of the rainy season in the South Pacific, six months of storms that come in fast and unpredictable. And when the wind starts blowing, that takes its toll on U.S. intelligence-gathering far off in North Korea.

A substantial amount of the intel on the Hermit Kingdom comes from the three massive Global Hawk unmanned surveillance planes based at Andersen Air Force Base in Guam. Because of special flight restrictions, the Global Hawks can’t fly over thunderstorms, nor, without a way to see the clouds ahead, can they go around them. So last year, whenever a hint of bad weather arose on the route the Global Hawk was assigned from Guam, the missions were canceled; the UAVs were grounded for an entire month, says a source with knowledge of the operation.

This susceptibility to South Pacific cyclones is adding new energy to the political hurricane raging in Washington over the future of the expensive UAVs.

It’s been a year and a half since the Air Force said it no longer needs the Global Hawk. The service argued that the UAVs, each built for more than $200 million, don’t do their jobs as well as the time-tested U-2 manned spy plane. So the Air Force wants to take the entire fleet of 18 Global Hawks and park them in the “boneyard” — the aircraft storage facility at Davis-Monthan Air Force Base, Ariz. That’s the functional equivalent of throwing 135 tons of the world’s most advanced robotic flying machines into the trash heap.

Now the battle lines are forming in what may be an epic contracting war. On the one side, swinging hard, is Global Hawk-maker Northrop Grumman. It has some powerful arguments, and it has members of Congress who say the Air Force needs to fall in line. On the other side is the Air Force, fighting to keep the U-2, which was built by Lockheed Martin.



At 70,000 feet, a U-2 pilot flying northwest along the boundary of North Korean airspace can turn his head to the right, and through the visor of his spacesuit he will see the silhouette of Earth’s curvature. Then he will see a silent green phosphorescent flash before the sky suddenly goes dark.

They call that flash “the terminator.” No U-2 pilot ever forgets it. Until just two years ago, the U-2 program itself — the workhorse of high-altitude intelligence, surveillance and reconnaissance for 60 years — was due to be terminated, too.

For a time, the Global Hawk versus U-2 debate revolved around age. The U-2, its critics said, was of a different era, before UAVs. After all, any pilot flying the U-2 now wasn’t even born when the program started back in 1955.

But now, as one Air Force pilot points out, “This is not your grandfather’s U-2.” For example, today’s U-2S jets have pressurized cockpits, although the pilots still wear spacesuits in case anything goes wrong.

Lockheed Martin’s Robert Dunn said the U-2S has a long way to go before it needs to be decommissioned. “The airplanes we are flying today are certified to 75,000 flight hours. The average airframe is 14,000,” he said.


If the U-2 is the aging champion, then in the other corner of the ring is the upstart Global Hawk. A feat of modern engineering, the autonomous plane can fly for 32 hours straight when conditions are right. That’s far longer than the U-2, though not as high and with a smaller payload.

Ironically, the now-costly Global Hawk program was birthed during the cutbacks of the Clinton years. The Air Force was enthusiastic about its huge, high-flying UAV, and it pushed for more and more capacity for the planes. The first operational lot, the Block 10s, couldn’t carry enough weight, so the next generation was bigger and more ambitious. It was about more sensors, more power, more payload.

Initially pitched as a $35 million aircraft, costs ballooned over the years by 284 percent, according to the Congressional Research Service. Much of that was due to the Air Force’s shifting requirements. (It’s now estimated at about $220 million per plane including development costs.)

The Air Force, for a time, was the Global Hawk’s biggest cheerleader, although the history has been complex and sometimes contradictory.

In early 2011 for example, the Defense Department’s director of operational test and evaluation said “the system was not operationally effective for conducting near-continuous, persistent ISR operations.”

Then, in June 2011, shortly before the Global Hawk was fielded, Air Force officials certified the project as “essential to national security.” It was meant to ensure that Congress continued to fund the program, but the proclamation would begin to haunt the service just months later.



In January 2012, the Air Force announced a drastic turnaround: It would terminate the Global Hawk program.

It provoked a firestorm — and a heavy public advocacy campaign on Capitol Hill by those who support the plane. Like many major modern weapons, its subcontractors are widely distributed across the United States, ensuring a broad base of political support. Northrop Grumman’s website notes that all but 15 states manufacture some part of the Global Hawk.

Experts were confounded that the Air Force had changed its mind so quickly.


And Congress put its foot down.

In the 2013 National Defense Authorization Act signed earlier this year, Congress told the Air Force it would have to fly the Global Hawks it had already (16 plus two being built) through the year 2014. The service “shall maintain the operational capability of each RQ-4 Block 30 Global Hawk unmanned aircraft system belonging to the Air Force or delivered to the Air Force.”

And to make sure no Global Hawk went on to the boneyard, the act was specific: No money “may be obligated or expended to retire, prepare to retire, or place in storage an RQ-4 Block 30 Global Hawk unmanned aircraft system.”

All of which sets the stage for the current conflict on the Hill.

Meanwhile, the 2013 Defense Appropriations Act went further. The service had resisted ordering new planes, on the assumption that by the time they were delivered, they’d be going right to the boneyard. Now the Air Force was told to order three of the planes that had previously been budgeted for in 2012: “The Secretary of the Air Force shall obligate and expend funds previously appropriated” for the plane.


But the Air Force has resisted. As another officer said, “Why are they making us spend money on something we don’t want or need?”

That attitude has irked some Northrop Grumman supporters on Capitol Hill.

In May, Rep. James Moran, D-Va., and Rep. Buck McKeon, R-Calif., wrote a stinging letter to Defense Secretary Chuck Hagel demanding that the Air Force do what it was told.

“The Air Force has continued to ignore clear Congressional intent,” they said.

And the House Armed Services Committee in June voted for a new defense authorization bill that would force the Air Force to use the Global Hawks until 2016.



Here’s a side-by-side comparison of the two platforms:


■ Power. The U-2’s engine, with 17,000 pounds of thrust, can push the plane beyond 65,000 feet within a half hour. “It climbs like a homesick angel,” said a U-2 pilot. The Global Hawk, powered by an engine with just 7,500 pounds of thrust, can take four hours to reach its ceiling of 60,000 feet, critics say.


■ Endurance. Global Hawk is the hands-down winner. It can fly up to 32 hours before returning to base. Some say that’s what matters. “This is no time to be getting rid of your long-range, long-endurance assets,” said Rebecca Grant, an analyst who has done work for Northrop Grumman. The U-2 is stretching it to fly 14 hours; more typical flights last 10. But its defenders note that the manned plane can be based closer to the action, say, in South Korea, where flight restrictions bar unmanned aircraft.


■ Altitude. Here, the U-2 is king, with a publicly disclosed ceiling of 70,000 feet and a true ceiling somewhere around 75,000 feet. The Global Hawk tops out at 60,000 feet. For the Air Force, this has become the central issue. First, the U-2 gets above the weather. The worst storm in the world is “just fireworks below,” said a pilot. But the other issue is visibility. Simple geometry allows the U-2 to see farther into enemy territory than the Global Hawk, and that really makes a difference. A ceiling of 60,000 feet versus 70,000 doesn’t sound like much, but look at it this way: the main job of the plane in the near future will be flying along the borders of countries like China and North Korea from international airspace. The Air Force likes to see 80 or 100 miles into adversaries’ territory, and the U-2’s added height lets it do that.
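That “simple geometry” is the standard horizon-distance formula, d ≈ √(2Rh). A rough sketch using the ceilings quoted in the Altitude bullet; it ignores atmospheric refraction and sensor limits, so treat the numbers as geometric upper bounds, not operational ranges.

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in meters
FT_TO_M = 0.3048

def horizon_km(altitude_ft: float) -> float:
    """Geometric distance to the horizon, in km, from a given altitude in feet."""
    h = altitude_ft * FT_TO_M
    return math.sqrt(2 * EARTH_RADIUS_M * h) / 1000

print(round(horizon_km(60_000)))  # Global Hawk ceiling: ~483 km to the horizon
print(round(horizon_km(70_000)))  # U-2 ceiling: ~521 km, roughly 40 km farther
```

The extra 10,000 feet buys the U-2 tens of kilometers of additional sightline from international airspace, which is the whole point of the Air Force's argument.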


■ Sensors. That’s what it’s all about. At first glance, the Global Hawk has the edge. It carries three sensors for its intelligence missions, while the U-2 carries only two. On top of that, the Global Hawk can switch in midflight between electro-optical and synthetic aperture radar. “To have the ability for a single weapons system to carry a SAR radar, electro-optical package, and SIGINT package,” said Tom Vice, Northrop Grumman’s president of Aerospace Systems, “allows you to fuse all three different types of intelligence products together at the same time.”


But the Air Force says the U-2 has a far better electro-optical sensor that gives it a hands-down win in the category. In a report to Congress this spring, the Air Force flatly said that “the current U-2 sensors are superior to those of the GH.” Key to that is a camera called SYERS II (Senior Year Electro-optic Reconnaissance System) manufactured by UTC Aerospace. It’s multispectral, unlike the Global Hawk’s camera, and it sees farther.


■ Price. The U-2s were all built years ago. It’s a bit like owning a 2000 Honda Accord — it’s already paid for, it will keep on going and it drives great. The Global Hawks, on the other hand, are still coming off the production line. But Northrop Grumman argues that most of the development costs have already been spent anyway, and the kinks of building a new system have only recently been ironed out. The Air Force says at this point that it is just spending good money on a system that doesn’t have what it takes.


As for operating costs, they are equivalent — $33,500 per hour. But as Northrop Grumman points out, the Global Hawk doesn’t need training flights and requires fewer takeoffs and landings. Even the Air Force, in a recent report, acknowledged that “the persistence advantage of [Global Hawk] manifests itself in lower execution costs.”

Among its various proposals, Northrop Grumman has made one that stands out. It is offering to provide a 10-year contractor logistics contract for the Global Hawk Block 30 for $250 million, as a fixed price. It made the offer, though, months after the Air Force decided to terminate the program.



There is much disagreement on how much it would cost to upgrade the Global Hawk Block 30s, where there are shortfalls that need addressing. Take the sensors. The Air Force reported to Congress that “Upgrades to the GH Block 30 to achieve parity with the U-2 program require an expenditure of approximately $855 million.”

It might not be able to fly as high, but at least it could photograph as clearly.

Northrop Grumman’s defenders, eager to get the Air Force to change its mind, say the service is way off the mark. The company has offered to put better cameras on the Global Hawk for just $48 million.

“We’ve looked at that and we’ve addressed it,” Vice said. “We looked at how to open up our architecture. We’ve offered a firm fixed-price offer to the U.S. Air Force to integrate the SYERS sensors onto Global Hawk. And that would cost the Air Force only 6 percent of what the Air Force believed it would cost to upgrade the current Block 30 cameras. Guaranteed price; no risk to the government.”

Northrop Grumman’s $48 million versus the Air Force’s $855 million is, for the moment, an unresolved discrepancy. One reason the lower figure can work: the company wants to simply take the cameras from the competition — essentially cannibalizing the U-2.

As for the Global Hawk’s getting grounded in places like Guam, where it can’t be relied on during the rainy season, the plane’s supporters say that’s the Air Force’s fault in the first place because of onerous restrictions. Requiring the plane to fly 10,000 feet above clouds and limiting it to a single route, they argue, is what caused it to be grounded so often.

Now it’s been given alternative routes, which supporters say will cut back on canceled missions.

The difficulty has been that Global Hawk is unmanned, without “sense and avoid” technology to meet air traffic requirements. Normally, a pilot could see the clouds and steer around them, but without a pilot, the Global Hawk can’t do that.

Northrop Grumman has told the Air Force it can put “weather diversion” cameras in the Guam-based Global Hawks. That way, the operators back at base will be able to see the clouds and reroute, just as they could if the pilot was flying.

The company pitched the idea to the Air Force, offering to install the cameras for $7 million.



There are some analysts who believe that in spite of the Global Hawk’s shortfalls, the Air Force is making a mistake.

“However you cut it, I think there is a good case for Global Hawk Block 30,” says Mark Gunzinger of the Center for Strategic and Budgetary Assessments. “The reasons cited for retiring the Block 30s don’t stand up under scrutiny. It’s worth questioning.”

But if the Air Force is really being disingenuous in terminating the Global Hawk, as its critics say, what would be the motive? That’s where the Northrop Grumman defenders are having a difficult time.

Is it, perhaps, a lingering bias against drones, a preference for the swaggering days of the piloted plane? At a House hearing in May where he castigated the Air Force for its decision on Global Hawk, Moran said as much: “The U-2, as you know, has a pilot. And I suspect that’s the real issue — the pilotless versus the piloted craft, even though the U-2 has been around longer than even some of the members of this subcommittee have been alive.”

Air Force Chief of Staff Gen. Mark Welsh protested: “Pilot being in the airplane had absolutely nothing to do with it. I couldn’t care less. We want the platform that will do the best job of accomplishing the mission assigned — manned or unmanned — and we’ve said that all along.”

And after all, the Air Force has hundreds of UAVs and continues to develop new ones. It’s hard to argue that the service simply doesn’t like unmanned aviation anymore.

If not a bias against unmanned planes, others say it is just stubbornness: The Air Force has dug itself into an untenable position and, because of bureaucracy, is unwilling to back down, they say.

Still, that does seem like a stretch, given what’s at stake. If the Air Force still says it doesn’t need to spend the hundreds of millions of dollars on a program it finds inadequate, it will be hard to argue with that in an era when sequestration is cutting everyone’s budget.


The Atlantic

What Your Email Metadata Told the NSA About You

Rebecca Greenfield Jun 27, 2013

President Obama said “nobody is listening to your telephone calls,” even though the National Security Agency could actually track you from cellphone metadata. Well, the latest from the Edward Snowden leaks shows that Obama eventually told the NSA to stop collecting your email communications in 2011, apparently because the so-called StellarWind program “was not yielding much value,” even when collected in bulk. But how much could the NSA learn from all that email metadata, really? And was it more invasive than phone data collection? The agency is well beyond its one trillionth metadata record, after all, so they must have gotten pretty good at this.

To offer a basic sense of how StellarWind collection worked — and how much user names and IP addresses can tell a spy about a person, even if he’s not reading the contents of your email — we took a look at the raw source code of an everyday email header. It’s not the exact kind of information the NSA was pulling, of course, but it shows the type of information attached to every single one of your emails.

Below is what the metadata looks like as it travels around with an email — we’ve annotated the relevant parts, based on what The Guardian reported today as the legally allowed (and apparently expanded) powers of the NSA to read without your permission. After all, it’s right there behind your words.

As you can see, at the bare minimum, your average email metadata offers location (through the IPs), plus names (or at least email addresses), and dates (down to the second). The Guardian‘s Glenn Greenwald and Spencer Ackerman report that Attorney General Michael Mukasey and Defense Secretary Bob Gates signed a document that OK’d the collection and mining of “the information appearing on the ‘to,’ ‘from’ or ‘bcc’ lines of a standard email or other electronic communication” from, well, you and your friends and maybe some terrorists.
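Those “to,” “from” and date fields can be read out of any raw message with Python’s standard library. A generic sketch on a made-up message — this is ordinary header parsing, not the NSA’s tooling, and every name and address below is invented:

```python
from email import message_from_string
from email.utils import parseaddr, parsedate_to_datetime

# A minimal, invented raw message -- only the header block matters here.
raw = """\
From: Alice Example <alice@example.com>
To: Bob Example <bob@example.com>
Date: Thu, 27 Jun 2013 09:15:02 -0400
Subject: lunch?

See you at noon.
"""

msg = message_from_string(raw)
sender = parseaddr(msg["From"])[1]       # bare address: 'alice@example.com'
recipient = parseaddr(msg["To"])[1]      # bare address: 'bob@example.com'
when = parsedate_to_datetime(msg["Date"])  # timezone-aware datetime

print(sender, recipient, when.isoformat())
```

Who wrote to whom, and exactly when, down to the second — all without ever touching the body of the message.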

But email metadata is more revealing than that — even more revealing than what the NSA could do with just the time of your last phone call and the nearest cell tower. For operation StellarWind, it must have been all about that IP, or Internet protocol, address. Hell, it’d be easy enough for your grandma to geolocate both parties from a couple of IPs: there are countless free services on Google that turn those numbers you give to the IT guy into your exact location. For example, using the two IP addresses in the email sent to me above, we can easily determine that it was sent from Victoria, Australia:
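The extraction step is trivially scriptable. As a sketch (the header and the 203.0.113.7 address are invented; the actual city-level lookup relies on one of those free geolocation services, which is not shown here), pulling the sender's IP out of a Received line takes one regular expression:

```python
import re

# An invented Received header; 203.0.113.0/24 is a documentation-only
# range, standing in for a real sending server's address
received = ("from mail.example.org (mail.example.org [203.0.113.7]) "
            "by mx.example.com with ESMTP; Thu, 27 Jun 2013 09:15:02 -0400")

# Find every IPv4 address mentioned in the hop description
ips = re.findall(r"\b(?:\d{1,3}\.){3}\d{1,3}\b", received)
print(ips)
```

Feed the result to any of the free IP-lookup services and you have a city and country for the sender, which is exactly the grandma-grade geolocation described above.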

The IP address is like a homing pigeon, and that’s why the revelations of email metadata collection being authorized under the Bush and Obama administrations amount to a seriously revealing breach of personal security in the name of terror-hunting. “Seeing your IP logs — and especially feeding them through sophisticated analytic tools — is a way of getting inside your head that’s in many ways on par with reading your diary,” Julian Sanchez of the Cato Institute told The Guardian. Of course, the administration has another party line, telling the Los Angeles Times that operation StellarWind was discontinued because it wasn’t adding up to enough good intelligence of “value.” But with one of the many “sophisticated analytic tool” sets developed by the NSA over the last decade or so and leaked during the last month — like, say, EvilOlive, “a near-real-time metadata analyzer” described in yet another Guardian scoop today — America’s intelligence operation certainly can zero in on exactly where Americans are. Even if you’re just emailing your hip grandma.


Too much collaboration is hurting worker productivity



By Vickie Elmer, June 26, 2013

Companies love collaboration—it’s become the go-to approach to solve corporate problems and spur innovation. Yet by emphasizing it at the expense of solitary work, employers choke worker productivity and satisfaction.

A new report by Gensler, the global workplace design and architecture firm, found that workers are spending more time in focus work but feel less effective at it than before.

“Collaboration can be taken too far. It actually has diminishing returns,” said Diane Hoskins, co-CEO of Gensler, in an interview with Quartz. “When everybody’s collaborating around you, you can’t focus.”

In the last few years, increased collaboration has been both intentional, encouraged by managers intent on fostering innovation and shared resources, and unintentional, partly the result of corporate cutbacks in office space during the recession. Much of the reduced space affected collaboration areas, which pushed conversations and collaboration into the general work spaces, said Hoskins. “Everything was squeezed” and so workers felt less able to focus, the Gensler survey of 2,035 knowledge workers shows.

Now they feel even more crowded and unable to focus as corporate payrolls have inched up. Some feel that, with more workers at “kitchen tables” or desks in close proximity, they can never get anything done.

“If you diminish focus, it’s like the house of cards starts to fall apart. It’s almost foundational,” she said.

Certainly, other research has found that open floor plans can make workers less productive and more likely to get sick.

So what are companies doing to create places where workers can concentrate on their tasks? Intel’s Software and Services Group gives workers four hours of “think time” tracked on a group calendar so they can block out distractions and tune in on important problems or work. Office furniture maker Steelcase has created a gathering space equipped with teleconferencing devices, information projections and a round table.

Gensler is encouraging balanced arrangements so workers can have a few different environments to use depending on the mode they’re in. One company put up a C-shaped pod in the middle of the desks with room for four or five people to sit and share. Others have created small meeting rooms “where you can take conversations to” so workers at their desks can focus, Hoskins said. Some even set up outdoor gathering places for informal meetings or break times together.

Or some just may go home to get quiet focused time, and then come into the office for meetings and social connections. Those who can choose where to work still spend about 70% of their time in the office, Gensler reported. Workers who can choose their environment are more satisfied with their job, and rate themselves as more effective, especially in their focus work.

DirecTV has the edge right now among bids to buy Hulu


By Gina Chon and Kevin J. Delaney

June 28, 2013

Satellite television provider DirecTV appears to have the leading bid for video streaming service Hulu, according to sources familiar with the sale process.

Other contenders include Guggenheim Digital Media and a joint bid from the Chernin Group and AT&T, the sources said. Final, binding offers for Hulu had been due today, but the deadline has been extended to Tuesday, in part to give DirecTV more time to assemble its acquisition package.

The competition for Hulu, which wasn’t able to fetch significant offers when it first went up for sale in 2011, demonstrates how various segments of the media industry suddenly see value in owning their own platforms. Hulu has about 4 million subscribers paying about $8 a month for a mix of movies, TV shows, and original content. It also offers some video for free.

Sources say DirecTV has an advantage because, as one of the largest multi-system operators (MSOs) in the United States, it’s already a major customer of Hulu’s owners—Disney, Comcast, and News Corp. DirecTV could use Hulu to diversify its offerings or even to create a cable service delivered entirely over the internet. It currently operates over satellite as well as the web.

But the Chernin Group, led by former News Corp. president Peter Chernin, and Guggenheim, headed up by former Yahoo interim CEO Ross Levinsohn, are seen as more entrepreneurial. That could help ease concerns about Hulu losing its innovative spirit and strong engineering team if it were sold to a corporate buyer like DirecTV.

Hulu’s owners are hoping to fetch a price in the $1 billion range, and if they don’t get such offers, they may rethink the sale, sources said. But at least some of the bids are expected to hit that price range, sources said. Yahoo, Amazon, and private equity firm Silver Lake also considered separate bids for Hulu, but their interest has waned, sources say.

There’s also still a possibility that Hulu’s owners avoid an outright sale by bringing in another media company, such as Time Warner Cable or Time Warner, through an investment. Some analysts have advised against selling. Part of the reason Hulu is on the block, though, is that its owners have disagreed over what strategy to pursue.

Hulu declined to comment.


What They Told Us: Reviewing Last Week’s Key Polls

Rasmussen Reports

Saturday, June 29, 2013

With the nation’s 237th birthday just days away, it’s good to remind ourselves that for most Americans, there’s still no place like home.

Eighty-six percent (86%) are proud to be an American. Seventy-four percent (74%) believe, generally speaking, Americans should be proud of the history of the United States.

Seventy percent (70%) believe that Americans have more freedom than people in other countries. Sixty-nine percent (69%) feel Americans have more economic opportunity than people in other countries.

Not that we wear rose-colored glasses. Just 40% now agree with Abraham Lincoln and Ronald Reagan that the United States is “the last best hope of mankind.” Thirty-six percent (36%) disagree.

Only 45% of voters believe the U.S. justice system is fair to most Americans, and just 34% think the system is fair to the poor.

In his latest weekly newspaper column, Scott Rasmussen notes that in our country, “public opinion leads, and politicians lag a decade or two behind. It’s always been that way.” He adds that “it’s easy to get discouraged about politics in the 21st century. Most voters view our government as a threat to individual rights. But we can take hope from the fact that Americans still embrace the ‘Spirit of ’76.’”

Seventy-two percent (72%) of voters, however, suspect that the National Security Agency may have violated one of the country’s most cherished constitutional standards – the checks and balances between the three branches of government – by spying on the private communications of Congress and judges.

Recognizing that there is a tension between national security and individual rights, just 12% believe the recently disclosed NSA surveillance program is too concerned about individual rights. Forty-three percent (43%) believe the program is tilted too far in favor of national security.

The government may be understandably confused, however, over which domestic terrorists it’s supposed to be tracking. Among voters who approve of the president’s job performance, just 29% see radical Muslims as the bigger terrorist threat to the United States. Twenty-six percent (26%) say it’s the Tea Party that concerns them most. Among those who Strongly Approve of the president, more fear the Tea Party than radical Muslims. As for voters who disapprove of Obama’s performance, 75% consider radical Muslims to be the bigger terrorist threat.

If the government is listening, it doesn’t appear to be hearing what most Americans are saying.

Take illegal immigration, for example. Sixty percent (60%) of voters favor an immigration plan that gives those here illegally legal status to stay, provided the border is really secured. But just 28% are now even somewhat confident that the government would actually secure the border and prevent illegal immigration if the reform plan is passed. That’s down from 45% in January.

In large part because of this voter distrust, just 37% expect the immigration plan passed by the Senate Thursday to make it through the full Congress and become law.

Voters are evenly divided as to whether the determination that the border is secure should be made by border states or by the federal government.

The immigration plan passed by the Senate also includes quite a few “pork barrel” spending projects, even though 65% of voters continue to believe the government should cut spending rather than increase it to help the economy.

Most voters also still view nuclear weapons as critical to the country’s safety, which is why just 27% agree with President Obama’s call on Monday for a reduction in the U.S. nuclear arsenal.

Voters disagree, too, with the president’s decision to involve the United States more deeply in the civil war in Syria. U.S. troops haven’t been sent in there yet, but most voters continue to believe American political leaders put U.S. troops in harm’s way too often.

The president this past week proposed tighter carbon dioxide emissions controls on existing power plants. Nearly half of voters like that idea, but just as many think the proposed regulations will hurt the economy. Most expect those regulations to drive up energy costs. 

The economy and health care again this month top the list of 15 voting issues regularly tracked by Rasmussen Reports, while the environment and the war in Afghanistan are among the issues voters regard as least important.  Interestingly, though, for the first time in nearly two years, over half (51%) of voters say the immigration issue is Very Important in terms of how they will vote in the next congressional election.

Republican Senator Marco Rubio has been the most visible advocate of the Senate immigration proposal. Over the past several months, Rubio’s favorability ratings among Republican voters have fallen dramatically.

Despite continuing high consumer and investor confidence in the economy, the president’s job approval ratings also remain down from the highs he hit just after Election Day.

Confidence in the short-term housing market remains higher than it has been in several years, although just 50% now say their home is worth more than they still owe on their mortgage. That’s down 10 points from 60% in May. 

Most homeowners (56%) still say their home is worth more than when they bought it, but one in four now say it’s worth less.

In other news this week:

– Thirty percent (30%) of Likely U.S. Voters now say the country is heading in the right direction.

– Support remains high for allowing Americans to choose the level of health insurance they want based on how it impacts their pocketbooks.

– Americans still don’t have much confidence in the Federal Reserve Board to keep inflation under control.

– Americans have surprisingly similar overall views of their relationship with their spouse and relations with their co-workers.
