
May 25, 2013

Newswire

Also posted weekly online as a blog at https://newswirefeed.wordpress.com/

 

Chinese hackers resume attacks on U.S. businesses, government

Hackers working for a cyberunit of China’s People’s Liberation Army appear to have resumed their attacks using different techniques, computer-security experts and U.S. officials say.

By DAVID E. SANGER and NICOLE PERLROTH

The New York Times

May 19, 2013


WASHINGTON — Three months after hackers working for a cyberunit of China’s People’s Liberation Army went silent amid evidence that they had stolen data from scores of U.S. companies and government agencies, they appear to have resumed their attacks using different techniques, according to computer-industry security experts and U.S. officials.

The Obama administration had bet that “naming and shaming” the groups, first in industry reports and then in the Pentagon’s own detailed survey of Chinese military capabilities, might prompt China’s new leadership to crack down on the military’s highly organized team of hackers — or at least urge them to become more subtle.

But Unit 61398, whose well-guarded 12-story white headquarters on the edges of Shanghai became the symbol of Chinese cyberpower, is back in business, according to U.S. officials and security companies.

It is not clear precisely who has been affected by the latest attacks. Mandiant, a private security company that helps companies and government agencies defend themselves from hackers, said the attacks had resumed but would not identify the targets, citing agreements with its clients.

It did say the victims were many of the same ones the unit had attacked before.

The hackers were behind scores of thefts of intellectual property and government documents over the past five years, according to a report by Mandiant in February that was confirmed by U.S. officials. They have stolen product blueprints, manufacturing plans, clinical-trial results, pricing documents, negotiation strategies and other proprietary information from more than 100 of Mandiant’s clients, predominantly in the United States.

In interviews, Obama administration officials said the resumption of the hacking activity did not surprise them. One senior official said Friday that “this is something we are going to have to come back at time and again with the Chinese leadership,” who, he said, “have to be convinced there is a real cost to this kind of activity.”

Mandiant said the hackers had stopped their attacks after they were exposed in February and removed their spying tools from the organizations they had infiltrated. But during the past two months, they have gradually begun attacking the same victims from new servers and have reinserted many of the tools that enable them to seek out data without detection.

They are now operating at 60 to 70 percent of the level they were working at before, according to a study by Mandiant requested by The New York Times.

The Times hired Mandiant to investigate an attack that originated in China on its news operations last fall. Mandiant is not currently working for the company.

Mandiant’s findings match those of CrowdStrike, another security company that has also been tracking the group. Adam Meyers, director of intelligence at CrowdStrike, said that apart from a few minor changes in tactics, it was “business as usual” for the Chinese hackers.

The subject of Chinese attacks is expected to be a central issue in an upcoming visit to China by President Obama’s national security adviser, Thomas Donilon, who has said that dealing with China’s actions in cyberspace is moving to the center of the complex security and economic relationship between the two countries.

But hopes for progress on the issue are limited. When the Pentagon released its report this month officially identifying the Chinese military as the source of years of attacks, the Chinese Foreign Ministry denied the accusation, and People’s Daily, which reflects the views of the Communist Party, called the U.S. “the real ‘hacking empire,’” saying it “has continued to strengthen its network tools for political subversion against other countries.”

In a report to be issued Wednesday, a private task force led by Obama’s former director of national intelligence, Dennis Blair, and his former ambassador to China, Jon Huntsman Jr., lays out a series of proposed executive actions and congressional legislation intended to raise the stakes for China.

“Jawboning alone won’t work,” Blair said Saturday. “Something has to change China’s calculus.”

The exposure of Unit 61398’s actions, which have long been well-known to U.S. intelligence agencies, did not accomplish that task.

One day after Mandiant and the U.S. government revealed the PLA unit as the culprit behind hundreds of attacks on agencies and companies, the unit began a haphazard cleanup operation, Mandiant said.

Attack tools were unplugged from victims’ systems. Command and control servers went silent. And of the 3,000 technical indicators Mandiant identified in its initial report, only a sliver kept operating.

Some of the unit’s most visible operatives, hackers with names like “DOTA,” “SuperHard” and “UglyGorilla,” disappeared, as cybersleuths scoured the Internet for clues to their real identities.

In the case of UglyGorilla, Web sleuths found digital evidence that linked him to a Chinese national named Wang Dong, who kept a blog about his experience as a PLA hacker from 2006 to 2009, in which he lamented his low pay, long hours and instant ramen meals.

But in the weeks that followed, the group picked up where it had left off. From its Shanghai headquarters, the unit’s hackers set up new beachheads from compromised computers all over the world, many of them small Internet service providers and mom-and-pop shops whose owners do not realize that by failing to rigorously apply software patches for known threats, they are enabling state-sponsored espionage.

“They dialed it back for a little while, though other groups that also wear uniforms didn’t even bother to do that,” Kevin Mandia, the chief executive of Mandiant, said Friday. “I think you have to view this as the new normal.”

The hackers now use the same malicious software they used to break into the same organizations in the past, only with minor modifications to the code.

While U.S. officials and corporate executives say they are trying to persuade President Xi Jinping’s government that a pattern of theft by the PLA will damage China’s growth prospects — and the willingness of companies to invest in China — their longer-term concern is that China may be trying to establish a new set of rules for Internet commerce, with more censorship and fewer penalties for the theft of intellectual property.

 

 

U.S. approves Apple iOS devices for use on Defense Department networks

Devices from BlackBerry and Samsung Electronics were cleared earlier by the department

By John Ribeiro

May 20, 2013 12:46 AM ET

IDG News Service – Devices built around Apple’s iOS operating system have been approved by the U.S. Department of Defense for use on its networks, as the department moves to support multivendor mobile devices and operating systems.

The Defense Information Systems Agency (DISA), which certifies commercial technology for defense use, said Friday it had approved the Apple iOS 6 Security Technical Implementation Guide (STIG).

“Approval of the STIG means that government-issued iOS 6 mobile devices are approved for use when connecting to DOD networks within current mobility pilots or the future mobile device management framework,” the agency said in a statement.

The department earlier this month cleared BlackBerry 10 smartphones and PlayBook tablets with its enterprise mobility management platform BlackBerry Enterprise Service 10 to be used on its networks. It also approved Samsung Electronics’ Knox, a new Android-based platform designed by the company to enhance security of the current open source Android.

The DOD mobility strategy includes mobile devices configured to the STIG, in combination with an actively managed and defended mobile device management (MDM) system, DISA said. The agency is responsible for establishing the MDM capability, which provides a process for managing and distributing mobile applications and an enhanced cyberdefense infrastructure. DISA is running a pilot program to bring all the pieces together.

A DOD spokesman, Lt. Col. Damien Pickart, said earlier that a decision on Apple’s iOS was expected to be taken last week. Several mobile devices and operating systems are currently going through DISA’s STIG review and approval process, Pickart said via email earlier this month.

“We look forward to additional vendors also participating in this process, further enabling a diversity of mobile devices for use within the department,” Pickart said. The approvals do not result in product orders, he added. Actual orders will be linked to operational requirements and availability of funding with user organizations, DISA said in its statement.

DOD currently has more than 600,000 commercial mobile devices in operational and pilot use, including about 470,000 BlackBerry devices, 41,000 running Apple operating systems and 8,700 on Android. A Commercial Mobile Device Implementation Plan aims to permit use of the latest commercial technologies such as smart phones and tablets, and to develop an enterprise mobile device management capability and application store to support approximately 100,000 multivendor devices by February 2014, DISA said.

 

DoD Examines 3 Budget-Cut Scenarios

Defense News

May 19, 2013

By MARCUS WEISGERBER and VAGO MURADIAN     

WASHINGTON — Senior US Defense Department officials are expected to present three budget-cutting scenarios to the defense secretary when they wrap up a wide-ranging review of military strategy at the end of this month, according to sources.

These officials, part of the Strategic Choices and Management Review (SCMR), are preparing for a range of budget cuts — $100 billion, $300 billion and $500 billion — sources said.

Meanwhile, four Washington think tanks are preparing their own reviews of what they would cut and will present their findings May 29 on Capitol Hill.

Throughout the two-month review, DoD officials have agreed that closing bases and changing military compensation are critical components to achieving these targets, sources said. But these officials are pessimistic that Congress will sign off on such institutional changes, meaning force structure, acquisition programs and mission areas would bear the brunt of spending cuts.

The SCMR will not make any specific decisions, but it will frame the choices DoD must make depending on the level of funding Congress appropriates. It also will present options for how to cut. Sources caution against expecting a fully revised strategy when the review is presented May 31.

“The value of strategy in this budget environment gives you the basis to make tradeoffs or priorities,” said David Berteau of the Center for Strategic and International Studies (CSIS). “You can’t make tradeoffs and adjudicate priorities without working the numbers, so they’re applying the right prism because choices only become real with resources attached to them. The review is designed to set the stage for the next phase of the discussion, which is the [Quadrennial Defense Review] and in parallel the resolution of the ’14 budget and the development of the ’15 budget.”

The first option — the SCMR itself is known as the “skimmer” or “scammer” in defense circles — assumes the White House’s fiscal 2014 budget proposal is adopted. That plan calls for $100 billion in cuts, but those cuts are back-loaded in the final five years of the decade-long plan. Pentagon officials favor this option because it gives them time to plan and gradually implement the cuts.

Sources say the Army would bear the brunt of those cuts, mostly to force structure. Still, Pentagon officials believe a greater spending cut will be levied on DoD.

If the second option — the $300 billion cut — were put in place, the cuts would be levied against all the services.

The third option assumes full sequestration, or $500 billion over the decade. Sources with insight into the SCMR say this option would wreak the most havoc on the military and force the cancellation or scaling back of several major acquisition efforts.

These sources also said the magnitude of the cut could prevent the military from being able to fight a major war against a near-peer competitor.

The Pentagon’s January 2012 military strategy calls for focusing on the Asia-Pacific and Middle East. For more than a year, defense leaders have said that sequestration, which amounts to about a $50 billion cut each year starting this year, would destroy their strategy.

Still, some Pentagon officials said that strategy has some wiggle room to absorb spending reductions.

Deputy Defense Secretary Ashton Carter is running the SCMR project along with Army Gen. Martin Dempsey, chairman of the Joint Chiefs of Staff. Comptroller Robert Hale and Christine Fox, the director of DoD’s Cost Assessment and Program Evaluation office, are also playing key roles in the review. Fox plans to step down in June following the completion of the SCMR.

At the same time, each of the services’ Quadrennial Defense Review offices is contributing to the review.

The goal of the project is to provide President Barack Obama and Defense Secretary Chuck Hagel “decision points,” which will inform DoD’s fiscal 2015 budget submission, and execution of the 2014 budget, Carter said. It will also guide the 2014 Quadrennial Defense Review.

“The review will define the major strategic choices and institutional challenges affecting the defense posture in the decade ahead, but it must be made to preserve and adapt our defense strategy and the department’s management under a wide range of future circumstances that could result from a comprehensive deficit reduction deal or the persistence of the cuts that began with this year’s sequester,” Carter said at the National Press Club on May 7.

“Everything will be on the table during this review: roles and missions, war planning, business practices, force structure, personnel and compensation, acquisition and modernization investments, and how we operate and how we measure and maintain readiness,” he said.

The review will produce “a clear delineation of the choices that we can make and might have to make” in areas such as force structure, investments, compensation, health care or even headquarters support, Carter said.

Regardless of the budget situation, the Pentagon has already launched an effort to reduce headquarters staff sizes.

Hagel has targeted so-called overhead spending as a key area for cutting. Last week, a Government Accountability Office report found headquarters staffs at geographical combatant commands have increased by 50 percent, to about 10,100 people. Mission and headquarters support costs at these commands are about $1.1 billion each year.

“If you’re really serious about the back office and you’re really serious about bringing down the force structure, you don’t need to terminate all the acquisition programs,” said Gordon Adams, an analyst with the Stimson Center who oversaw defense budgets during the Clinton administration.

 

Shadow Reviews Underway

As the SCMR wraps up, four Washington think tanks are conducting their own shadow reviews of areas to trim the Pentagon budget.

The Center for Strategic and Budgetary Assessments (CSBA), CSIS, American Enterprise Institute and Center for New American Security are scheduled to present their findings at a May 29 event on Capitol Hill.

Each think tank will come up with recommendations for how they would modify the military strategies and procurement profiles if sequestration budget cuts stick over the decade, said Todd Harrison, an analyst with CSBA, which organized the project.

“You get four independent solutions from the four think tanks,” he said.

While the four organizations have different political leanings, one goal will be to identify the areas that all four chose to cut.

The SCMR is supposed to wrap up by May 31 and remains on schedule, a defense official said.

 

Exxon: No Plans Yet To Reopen Ruptured Pipeline, and No Answers Why

The company has yet to release results of a sophisticated test of the 65-year-old pipeline’s interior, conducted in February.

By David Hasemyer, InsideClimate News    

May 20, 2013

 

Almost two months after a ruptured pipeline sent at least 210,000 gallons of oil flowing through a neighborhood in Mayflower, Ark., the line’s owner—oil giant ExxonMobil—remains largely silent on the future of its failed pipeline.

Most of the visible oil has been removed from the neighborhood and the ruptured section of pipe has been replaced and reburied. Yet Exxon hasn’t asked the U.S. Pipeline and Hazardous Materials Safety Administration (PHMSA) for permission to restart the 850-mile Pegasus line, which runs across four states from Patoka, Ill., to Nederland, Texas.

A company spokesman said Exxon is simply being thorough and cautious.

“This pipeline will not be restarted until we are convinced it is safe to do so,” said Aaron Stryk. “We need to identify the cause of the incident and the mitigation steps necessary to prevent an incident like this from occurring again.”

Some industry analysts say there could also be other reasons for the delay. The 65-year-old Pegasus line could be riddled with defects and require extensive repairs. Or perhaps Exxon is considering other alternatives, including replacing the Pegasus with a larger line.

The Pegasus pipeline is one of the few pipelines currently supplying a heavy Canadian crude oil known as diluted bitumen, or dilbit, to refineries on the Texas Gulf Coast. But at 20 inches in diameter, it can carry only about 95,000 barrels or 3.9 million gallons per day. That’s roughly a tenth as much as the 36-inch Keystone XL pipeline would carry to the coast if it is approved. The Keystone also would carry dilbit.

In order to reopen the Pegasus pipeline, Exxon must comply with the corrective action order PHMSA issued two days after the March 29 spill. The order lists a dozen conditions the company must meet, including tests on the section of pipe that burst and a comprehensive safety evaluation of the entire pipeline.

On May 2, Exxon appealed four of these conditions, including PHMSA’s stipulation that it operate the line at a lower pressure when it is restarted. The appeal did not challenge PHMSA’s pipeline integrity requirements or mention the company’s plans for the line.

On May 10, PHMSA granted two of those requests but rejected the other two, including the request for a higher pressure if the line is restarted.

Peter Howard, president of the Canadian Energy Research Institute, a nonprofit energy and environmental research organization, said the lengthy shutdown is unusual.

After a 41-year-old pipeline owned by Enbridge Inc. spilled more than a million gallons of dilbit into Michigan’s Kalamazoo River in 2010, Enbridge filed for permission to restart the line 13 days later. That request was denied and the line was restarted almost two months after the accident, which was the largest inland oil spill in U.S. history and is still being cleaned up today.

Two years later, another Enbridge pipeline burst in Wisconsin, spewing 50,000 gallons of dilbit into a pasture. Enbridge got permission to restart that line eight days later.

Howard thinks Exxon’s caution may be part of the company’s business approach.

“To them, the most embarrassing thing would be to restart it and come across another leak,” he said. “I think you’ll find that in all cases, especially with older lines, they’ll take the opportunity to inspect them inside and out.”

Carl Weimer, executive director of the Pipeline Safety Trust, a nonprofit watchdog organization based in Bellingham, Wash., said clues about why Exxon isn’t pushing harder to get the line reopened might be found in tests Exxon conducted on the line in February, two months before the rupture.

Exxon used a sophisticated monitoring device known as a pig to inspect the interior of the pipeline along the section that ruptured. The test is designed to detect cracks or flaws.

“The answers could be in those findings,” Weimer said. “Generally what you would expect to get out of that is whether the rupture was an anomaly that can be fixed or does the test show that the whole pipeline has similar problems that need to be addressed.”

Stryk, the Exxon spokesman, said data from the pigging is still being evaluated and that the results will be shared with PHMSA “in the coming months.”

Because the Pegasus is a small pipeline, the consequences of an extended shutdown aren’t too dramatic, said Martin Tallet, president of EnSys Energy, a business management consulting company that prepared a report on the Keystone XL’s impact on oil markets for the U.S. Department of Energy.

Nevertheless, he said, the oil that normally would flow through the Pegasus pipeline isn’t getting to the refineries.

“That brings up the question of the flexibility for the refinery to find an alternative to the crude or absorb the loss,” Tallet said.

Stryk wouldn’t disclose the names of the companies that receive oil from the Pegasus or discuss the financial impacts of the line’s closure.

John Stoody, director of government and public relations for the Association of Oil Pipe Lines, a Washington D.C.-based industry organization, said the closure of any oil pipeline has some kind of effect.

“Every pipeline, every amount of oil transported is accounted for in terms of meeting the energy needs of the country,” Stoody said. “So no matter the size of the pipeline delivering oil to the refineries, that oil is part of a pretty precise calculation designed to meet those energy needs.”

Bruce Bullock, director of the Maguire Energy Institute at Southern Methodist University Cox School of Business in Dallas, said Exxon is financially stable enough that it doesn’t need to rush the Pegasus pipeline back into operation.

“They have the wherewithal to be able to step back and make informed decisions on how best to move forward,” Bullock said.

That includes everything from considering the possibility of replacing the pipeline to improving the existing line.

As part of that assessment, Bullock expects Exxon executives to evaluate the demand for Canadian tar sands oil on the Gulf Coast and consider how massive projects like TransCanada’s Keystone XL pipeline and Enbridge’s pipeline expansion plans might affect business on the Pegasus.

“I think there’s going to be a lot of wait and see,” he said.

Exxon is a major oil sands producer as well as a pipeline operator. At its Cold Lake oil field in Alberta alone, Exxon extracts 123,000 barrels of bitumen a day from 4,000 wells using steam injection, according to the company’s 2012 Operating and Financial Review. At another of its Alberta sites, Exxon estimates 4 billion barrels of bitumen is available for extraction.

 

The March 29 spill in Arkansas forced the evacuation of 83 people from almost two dozen homes and raised health concerns for those exposed to fumes from the oil. Exxon is maintaining a high profile in the community, replacing damaged landscaping, cleaning pollution from a secluded section of a shoreline and making buy-out offers to affected property owners.

 

Exxon put the cost of the spill at $16.4 million in an April 26 PHMSA accident report.

The northern 648-mile section of the pipeline, which includes the portion that burst, is 65 years old and is buried an average of 24 inches below ground. An examination of the failed section showed a split 22 feet long and 2 inches wide that allowed the oil to spew out under high pressure. The southern section of the line is 59 years old.

Exxon has said it shut down the pipeline within 16 minutes of discovering a pressure drop on the line, but enough oil spilled over the next three hours to affect aquatic animals and wildlife, contaminate the soil, coat vegetation and taint surface water, according to the April 26 accident report.

The report said an estimated 5,000 barrels—210,000 gallons, or enough to fill about a third of an Olympic-sized swimming pool—of heavy crude oil poured from the ruptured pipeline. Of that, 2,000 barrels—84,000 gallons—had been cleaned up by April 26. The report also noted that 2,000 barrels of the oil had fouled drainage ditches and a cove south of Lake Conway, a popular recreation area renowned for its fishing and scenic setting.

 

Extreme global warming seen further away than previously thought

By Environment Correspondent Alister Doyle

OSLO | Sun May 19, 2013 6:06pm EDT

(Reuters) – Extreme global warming is less likely in coming decades after a slowdown in the pace of temperature rises so far this century, an international team of scientists said on Sunday.

Warming is still on track, however, to breach a goal set by governments around the world of limiting the increase in temperatures to below 2 degrees Celsius (3.6 Fahrenheit) above pre-industrial times, unless tough action is taken to limit rising greenhouse gas emissions.

“The most extreme rates of warming simulated by the current generation of climate models over 50- to 100-year timescales are looking less likely,” the University of Oxford said of the findings, which were published in the journal Nature Geoscience.

The rate of global warming has slowed after strong rises in the 1980s and 1990s, even though all the 10 warmest years since reliable records began in the 1850s have been since 1998.

The slowdown has been a puzzle because emissions of heat-trapping greenhouse gases have continued to rise, led by strong industrial growth in China.

Examining recent temperatures, the experts said that a doubling of carbon dioxide concentrations in the atmosphere above pre-industrial times – possible by mid-century on current trends – would push up temperatures by between 0.9 and 2.0 degrees Celsius (1.6 and 3.6F).

That is below estimates made by the U.N. panel of climate scientists in 2007, of a rise of between 1 and 3 degrees Celsius (1.8-5.4F) as the immediate response to a doubling of carbon concentrations, known as the transient climate response.

 

OCEANS

The U.N. panel also estimated that a doubling of carbon dioxide, after accounting for melting of ice and absorption by the oceans that it would cause over hundreds of years, would eventually lead to a temperature rise of between 2 and 4.5 C (3.6-8.1F).

Findings in the new study, by experts in Britain, the United States, Canada, Australia, France, Germany, Switzerland and Norway, broadly matched that range for the long-term response.

But for government policy makers “the transient response over the next 50-100 years is what matters,” lead author Alexander Otto of Oxford University said in a statement.

The oceans appear to be taking up more heat in recent years, masking a build-up of carbon dioxide in the atmosphere that passed 400 parts per million this month for the first time in human history, up 40 percent from pre-industrial levels.

Professor Reto Knutti of ETH Zurich, one of the authors, said that the lower numbers for coming decades were welcome.

But “we are still looking at warming well over the two degree goal that countries have agreed upon if current emission trends continue,” he said.

Temperatures have already risen by about 0.8 Celsius (1.4F) since the Industrial Revolution and two degrees C is widely viewed as a threshold to dangerous changes such as more floods, heatwaves and rising sea levels.

“The oceans are sequestering heat more rapidly than expected over the last decade,” said Professor Steven Sherwood of the University of New South Wales in Australia, who was not involved in the study.

“By assuming that this behavior will continue, (the scientists) calculate that the climate will warm about 20 percent more slowly than previously expected, although over the long term it may be just as bad, since eventually the ocean will stop taking up heat.”

He said the findings “need to be taken with a large grain of salt” because of uncertainties about the oceans.

 

F.A.A.’s Concerns Hold Up Use of Wildfire Drones

NY Times

By FELICITY BARRINGER

Published: May 21, 2013

SAN FRANCISCO — As wildfire season begins in Western landscapes that were covered in smoky haze for weeks at a time last summer, the federal government’s firefighters are exploring the use of small remote-controlled drones with infrared cameras that could map a fire’s size and speed, and identify hot spots, a particular danger.

With a maximum wingspan of about 52 inches, the drones would supplement and perhaps replace manned surveillance aircraft, potentially reducing the risk to both pilots and firefighters.

But the effort is being slowed by Federal Aviation Administration regulations.

The use of drones in open airspace is regulated by the F.A.A., and its safety requirements effectively preclude unmanned aerial systems, or U.A.S.’s, from operating out of sight of a ground-based pilot. If distance or the smoke of a wildfire obscures a drone from observers on the ground, a piloted aircraft must be sent aloft to keep an eye on it.

“In terms of federal regulations right now, we can’t use U.A.S.’s out there except on a very limited basis,” said Ron Hanks, the aviation safety and training officer at the federal Forest Service.

Rusty Warbis, the flight operations manager at the Bureau of Land Management, said the process of approving individual trial flights was “cumbersome,” though improving.

The evaluations by wildfire experts are part of larger questions on how to incorporate these aircraft, originally used for military purposes, into civilian missions. The drones could complicate the main mission of the F.A.A., ensuring the safety of the country’s airspace. And observers in Congress believe that inherent distrust of government and privacy concerns are also slowing the introduction of firefighting drones.

Their potential usefulness, particularly their ability to pinpoint hot spots and fly in thick smoke that would ground other aircraft, was shown in an Alaskan fire nearly four years ago.

The fire, which burned over 447,000 acres — roughly half the size of Rhode Island — northeast of Fairbanks, was generating so much smoke that no planes were permitted to fly overhead. But a drone belonging to the University of Alaska Fairbanks was launched and easily identified the extent of the blaze and its varying levels of heat.

“The smoke was so thick no one was flying — that’s why they came to us,” said Rosanne Bailey, a retired Air Force brigadier general who is the deputy director of the Alaska Center for Unmanned Aircraft Systems Integration at the university. “We could fly and see the borders of the fire using infrared.”

Kent Slaughter, the acting manager of the Bureau of Land Management’s Alaska Fire Service, said it took four days to get the F.A.A.’s approval for that flight in 2009; the process is now down to about 24 hours.

But privacy concerns are slowing the integration of unmanned vehicles into the firefighters’ tool kit, said Senator Mark Begich, a freshman Democrat from Alaska. “Firefighting is a great example of how unmanned aircraft” are able “to determine the range of a fire, the intensity of a fire, without jeopardizing lives,” he said. “That’s a unique application, especially in my state, in Colorado, in California.”

He called the delays in getting approvals for testing the craft “frustrating.” The reason cited most often by firefighting experts is the requirement that the aircraft be followed and monitored by a chase plane if ground observers cannot see them through smoke, or because they are flying into canyons in steep and rugged terrain.

Les Dorr, an F.A.A. spokesman, said that safety in the air and on the ground is paramount and that the issue of line-of-sight requirements for drone use was being carefully studied.

The Army has lent the Interior Department 41 small drone aircraft that have been used for environmental monitoring, including tracking migratory wildfowl.

The Forest Service, part of the Department of Agriculture, has also been studying drone use for years. Mr. Hanks said one question was how much value drones would bring to existing firefighting methods.

“We are still developing policies internally, what the cost benefit would be,” he said. The drones, Mr. Hanks added, “would be competing against what we could do aerially against a helicopter or a light fixed-wing airplane.”

John Gould, the aviation chief at the B.L.M., who along with Mr. Hanks is based at the National Interagency Fire Center in Boise, Idaho, had a similarly cautious perspective. “We’re trying to get them in the mix and put them out in the field to see the potential,” he said.

 

CIO.com

 

U.S. Power Companies Under Frequent Cyberattack

Legislation that would give the federal government power to oversee the protection of utilities has stalled

By Jeremy Kirk

Tue, May 21, 2013

IDG News Service — A survey of U.S. utilities shows many are facing frequent cyberattacks that could threaten a highly interdependent power grid supplying more than 300 million people, according to a congressional report.

More than a dozen utilities said cyberattacks were daily or constant, according to the survey, commissioned by U.S. Democratic Representatives Edward J. Markey and Henry A. Waxman. The 35-page report on the survey, called “Electric Grid Vulnerability,” was released on Tuesday.

The report is in response to widespread concerns that hackers could damage parts of the U.S. power grid, causing widespread outages and prolonged economic effects. Markey and Waxman are members of the U.S. House Energy and Commerce Subcommittee, which held a hearing on cyberthreats and security on Tuesday.

Power outages and quality disturbances cost the U.S. economy upwards of US$188 billion annually, with single outages costing as much as $10 billion, the report said. Replacing large transformers, for example, can take more than 20 months.

The 15-question survey was sent to more than 150 utilities owned by investors, municipalities, rural electric cooperatives and federal government entities. Some 112 responded to the survey, which was sent in January.

Many utilities were coy in their responses. None reported damage as a result of cyberattacks, and many declined to answer the question of how many attempted attacks were detected, the report said.

One utility said it recorded 10,000 cyberattacks per month, while another said it saw daily probes for vulnerabilities in its systems and applications. Cyberattacks are inexpensive to execute and hard to trace, the report said.

“It has been reported that actors based in China, Russia, and Iran have conducted cyber probes of U.S. grid systems, and that cyberattacks have been conducted against critical infrastructure in other countries,” the report said.

The U.S. Congress has not delegated oversight of utilities’ cybersecurity to a federal agency. An industry organization, the North American Electric Reliability Corporation (NERC), publishes both mandatory and voluntary security standards, the report said.

In 2010, the U.S. House of Representatives passed the GRID Act, which would have given the Federal Energy Regulatory Commission the authority to protect the electricity grid. But the legislation did not pass the Senate, and the issue remains inactive in the House, the report said.

 

It’s More About “Wide Data” Than Big Data

By Gavin Kaimowitz, Simon Lyon and Geoff Cole, Sapient Global Markets

http://www.wallstreetandtech.com/data-management/its-more-about-wide-data-than-big-data/240155306

 

Financial services organizations know that they can handle vast amounts of data, since they have been doing it for decades. To succeed with big data initiatives, firms need to think of data’s breadth and reach across an organization.

 

May 21, 2013

Rather than thinking about the physical size of data, consider the importance of being able to relate different sets of data. For financial services firms, the complexity isn’t found in the need to store more data; it is in how best to link, arbitrate and cleanse all of the data required to support regulatory needs. So, it becomes less about how big the data is and more about its breadth and reach across the organization. The new challenge in the next few years will revolve around managing what could be called Wide Data.

Regulations Influencing the Wide Data Trend
In the next two years, regulations will be enforced across a variety of operational and technical areas. The following regulatory requirements highlight the need for investment into strategic data programs:

Data Retention, Search and Retrieval: The data retention business conduct rules as part of the Dodd-Frank Act require registered Swap Dealers (SDs) and Major Swap Participants (MSPs) to record, store and reconstitute all information related to a trade within 48 hours upon request from the Commodity Futures Trading Commission (CFTC). It is expected that firms will be able to provide complete audit information of a trade, related telephone conversations, analyst reports, sales and marketing documentation, emails, instant messages and any other pertinent information that aided in the investment decision — including all pre-trade activity (quotes, bids, offers, solicitations) relating to the inception of a deal.

Impact: Firms need a mechanism to correlate disparate transaction records with records of communications — either through automated or manual processes.
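As a rough illustration of that impact, the sketch below correlates trade records with communication records by participant and time window. It is a minimal sketch in Python; the record layouts, field names and the two-day window are invented assumptions, not anything prescribed by the CFTC rules.

```python
from datetime import datetime, timedelta

# Toy records. Real trade and communication stores would be far richer;
# every field name here is an invented assumption.
trades = [
    {"usi": "USI-1001", "trader": "jdoe",
     "executed": datetime(2013, 5, 1, 14, 30)},
]

communications = [
    {"kind": "email", "participants": {"jdoe", "client-a"},
     "sent": datetime(2013, 5, 1, 13, 55), "ref": "msg-778"},
    {"kind": "call", "participants": {"jdoe", "desk-head"},
     "sent": datetime(2013, 4, 28, 9, 10), "ref": "call-112"},
]

def reconstruct(trade, comms, window=timedelta(days=2)):
    """Collect communications involving the trader within a window around
    execution: the kind of linkage a 48-hour reconstitution request
    would force a firm to automate."""
    lo, hi = trade["executed"] - window, trade["executed"] + window
    return [c for c in comms
            if trade["trader"] in c["participants"] and lo <= c["sent"] <= hi]

for t in trades:
    print(t["usi"], "->", [c["ref"] for c in reconstruct(t, communications)])
    # USI-1001 -> ['msg-778']  (the Apr 28 call falls outside the window)
```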

OTC Swap & Derivative Counterparty Identification: Global trade repository reporting requirements mandate using standard counterparty identifiers in order to provide regulators with the ability to aggregate and identify potential risk areas. In the absence of a global legal entity identifier (LEI) as proposed by the G20 Financial Stability Board (FSB), the CFTC has required the use of interim identifiers. In other regulatory regimes, the lack of a global standard may provide greater tolerance and allow the use of existing standard identifiers, like SWIFT BIC codes, as the best available and easiest options for firms to achieve compliance.

Impact: Firms need to make “best efforts” to clean counterparty reference data to support immediate regulatory requirements while maintaining flexibility to transition to a global standard in the near future.

Data Transparency: The Dodd-Frank Act will be followed by other global regulations that require the disclosure of unprecedented levels of detail for each and every derivative transaction. Public dissemination of trade execution data in near real-time will provide all market participants with indications of liquidity and levels of activity. Additionally, the reporting of full primary economic terms and valuations data will give regulators the ability to calculate risk and police markets with greater levels of scrutiny in attempts to mitigate systemic risk.

Impact: Firms must enhance trade capture from both a timeliness and completeness perspective and continually enrich data as it flows through to confirmation and settlement systems. This increases the reliance on data quality, especially as it pertains to counterparty information.

Data Lineage: Perhaps the most critical data element of trade repository reporting requirements is the use of a Unique Swap Identifier (USI) or Unique Transaction Identifier (UTI). The transaction identifier must be shared with the counterparty at or near the point of execution and is of critical importance for regulators seeking to understand “the story” of a derivative transaction over time. Additional requirements necessitate the linking of transactions by USI in order to indicate how risk may have been transferred through novations and other post-trade life cycle events.

Impact: Booking processes and trade capture systems must evolve to support a range of new transaction identifiers with increased functionality to support transaction linkage and life cycle event traceability.
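To make the linkage requirement concrete, here is a minimal sketch of walking a chain of transaction identifiers through post-trade events. The predecessor-pointer layout and the identifiers are invented for illustration; the actual USI assignment and linkage rules are set by the reporting regimes.

```python
# Each booked event records the USI it superseded (None for the
# original execution). The layout is an invented illustration.
events = {
    "USI-A": {"prior": None,    "event": "execution"},
    "USI-B": {"prior": "USI-A", "event": "novation"},
    "USI-C": {"prior": "USI-B", "event": "amendment"},
}

def lineage(usi, events):
    """Walk predecessor links to reconstruct 'the story' of a
    transaction from its current identifier back to inception."""
    chain = []
    while usi is not None:
        chain.append((usi, events[usi]["event"]))
        usi = events[usi]["prior"]
    return list(reversed(chain))

print(lineage("USI-C", events))
# [('USI-A', 'execution'), ('USI-B', 'novation'), ('USI-C', 'amendment')]
```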

Foreign Account Tax Compliance Act (FATCA): Enacted in 2010 as part of the Hiring Incentives to Restore Employment (HIRE) Act, FATCA was introduced to combat tax evasion by US persons through the use of offshore vehicles; the final regulations were published at the end of January 2013. FATCA requires Foreign Financial Institutions (FFIs) to identify US accounts and report certain information about those accounts to the US Internal Revenue Service (IRS) annually. Non-compliance results in a 30% withholding tax on US clients of FFIs. A number of Inter-Governmental Agreements (IGAs) have been signed with the IRS, including a reciprocal agreement with the UK. IGAs may provide a slightly less onerous approach than full IRS registration.

Impact: Client data needs to be correlated and cleansed to ensure that the holistic view of a client can be analyzed to determine if the client is subject to the FATCA ruling.

Achieving Control through Data Governance
In 2012, the major investment focus within data management was around Legal Entity data. This was primarily due to the regulations concerning the reporting of OTC derivative trades and FATCA. In both of these situations, investment firms are required to provide a single consolidated view of their customers and to disclose their counterparty. While this might seem like a very simple exercise, the complexity of creating a single customer view is both technically and politically challenging.

Within multinational banks that run retail and Investment Banking (IB) divisions, the physical technology infrastructure is commonly split by region and business line. Data is commonly duplicated and is often inconsistent. This is not necessarily an IT infrastructure problem, but is likely the result of the manner in which a business operates or has grown (organically or with acquisitions).

In order to break down these barriers, more firms are changing their organizational structures by introducing a Chief Data Officer (CDO). The role of the CDO is to ensure that data is handled consistently throughout the organization — and with standard policies and measures.

Everything Relates

The Data Retention and Data Lineage regulations are causing widespread debate this year between IT organizations and the business over how to implement solutions that balance the need for an immediate answer to regulatory requirements against building a strategic solution that will sustain the business over the next ten years. The main issue seldom relates to technology, but rather to the ability to correlate data from disparate data sources and to intelligently manage the bi-temporal aspects of data. As a recent example, the Libor probe is forcing all firms under investigation to look through several years’ worth of information — resulting in a massive manual effort to correlate the findings.
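The bi-temporal point deserves a concrete example: each fact carries both the period when it was true in the business world and the moment the firm's systems recorded it, so the firm can answer both "what was true on date X?" and "what did we believe about date X when we reported on date Y?". A minimal sketch, with all names and data invented:

```python
from datetime import date

OPEN = date(9999, 1, 1)  # sentinel for "still valid"

# Bi-temporal rows: valid_from/valid_to bound when the fact held in the
# business world; recorded is when our systems learned it. Invented data.
history = [
    # As first recorded, the A rating was open-ended.
    {"entity": "CPTY-9", "rating": "A", "valid_from": date(2013, 1, 1),
     "valid_to": OPEN, "recorded": date(2013, 1, 2)},
    # On Mar 15 we learned a downgrade had taken effect back on Mar 10:
    # the old fact is superseded by new rows, never overwritten.
    {"entity": "CPTY-9", "rating": "A", "valid_from": date(2013, 1, 1),
     "valid_to": date(2013, 3, 10), "recorded": date(2013, 3, 15)},
    {"entity": "CPTY-9", "rating": "BBB", "valid_from": date(2013, 3, 10),
     "valid_to": OPEN, "recorded": date(2013, 3, 15)},
]

def as_of(rows, valid_on, known_by):
    """What did we believe, as of `known_by`, about the state on `valid_on`?"""
    candidates = [r for r in rows
                  if r["recorded"] <= known_by
                  and r["valid_from"] <= valid_on < r["valid_to"]]
    return max(candidates, key=lambda r: r["recorded"], default=None)

# On Mar 12 we still believed the old rating held on Mar 11 ...
print(as_of(history, date(2013, 3, 11), date(2013, 3, 12))["rating"])  # A
# ... but with later knowledge the same business date shows the downgrade.
print(as_of(history, date(2013, 3, 11), date(2013, 3, 20))["rating"])  # BBB
```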

All processes within financial institutions track different data points. These data points will all eventually relate to one another if all of the correlation data points are captured. However, it is not possible to know now what the required correlation data points will eventually be, as new data types are constantly being introduced.

So when firms begin to map out the ontology to represent Wide Data within financial services, the data facts can be seen to have the potential to map to a variety of different data types. This includes newer data types that aren’t typically linked, like internet content and unstructured data.

Next Generation Data

As the breadth of related data relationships increases and new types of datasets are made available, new correlations will be possible. The trend of new data in capital markets includes the recently mandated requirements to provide unique Legal Entity Identifiers (LEIs), Unique Transaction Identifiers (UTIs), Unique Product Identifiers (UPIs) and other similar identifiers. As new symbologies are implemented, more accurate correlations can be formed. These new correlation points yield a new dimension of possibilities.

Structured and Unstructured

Banks also typically face two types of data that must be correlated and kept under data governance oversight throughout the record life cycle:

Structured: Data that is physically placed in a format that is easy to access through technology and where the relationships between the data are defined. Technologies that are used to manage structured data include relational databases, XML and other structured flat file formats.

Unstructured: Typically in the form of free text-based artifacts, such as documents (e.g., investment research, ISDA Master/CSA agreements, prospectuses, legal contracts), telephone conversations, email chains, logs, video, pictures and other similar items.

For each of these types of data, and based on the desired usage, there are appropriate technologies that can aid in the correlation of data. Regardless of the technology chosen, the ability to make correlations between these two data types is crucially important to meeting regulatory demands. Historically, the management of data that is not stored in a traditional relational data store has been complicated. However, recent advances in technology allow an array of products to provide scalable and reliable solutions. These products are typically implemented as ontology, semantic and natural language processing (NLP) solutions. This allows the data to remain close to the native format in which it is found and lets these products be used as an interpretation and rich data-mining framework.
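As a toy illustration of bridging the two, the sketch below pulls a transaction identifier out of free email text and joins it to a structured trade record. The identifier format and the single regex are stand-in assumptions; as noted above, production systems lean on ontology, semantic and NLP tooling rather than hand-written patterns.

```python
import re

# Structured side: trades keyed by transaction identifier (invented format).
trades = {"USI-20130521-0042": {"product": "IR swap", "notional": 25_000_000}}

# Unstructured side: a free-text email fragment (invented).
email_body = """Per our call, confirming we amended USI-20130521-0042
this morning; revised confirm to follow."""

# Naive pattern for the invented identifier format.
USI_PATTERN = re.compile(r"\bUSI-\d{8}-\d{4}\b")

for usi in USI_PATTERN.findall(email_body):
    if usi in trades:
        print(f"email mentions {usi} -> trade {trades[usi]}")
```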

Competitive Advantage of Wide Data

The Battle of Agincourt is renowned for the triumph of 6,000 English over 36,000 French. With just one-sixth the number of troops, the English won the battle because they had access to information that the French did not. For example, the English knew that the ground was saturated with water and that heavily armored troops would get stuck marching through it. This historical event shows that it can take only one small piece of information to gain a significant competitive advantage.

In more recent times, Michael Burry proved that he was able to predict the systemic crash of the mortgage market and to pick the best Mortgage Backed Securities (MBS) to hedge against. While the information he was using was freely available, his competitors were not paying attention to the correlation.

For centuries, information has been leveraged to yield an advantage. This still holds true today. In today’s economy, investment organizations are trying to gain an edge by knowing more than their competitors. Quants, for example, have been gaining a competitive advantage by seeking alpha based on highly confidential methodologies.

Wide Data offers a new way for firms to establish a competitive advantage, and the market has already begun to take notice. This can be seen in the algorithmic trading industry with sentiment feeds, data providers offering supply chain information and data utilities providing transparency into loan-level characteristics. These new offerings are enabling firms to begin performing new types of analyses which can put them well ahead of the competition.

Planning for Constant Change

Regulations will continue to change, analytic requirements will evolve and business activities will advance to handle more complex products. The only way firms can prepare for this constant change is to strategically design and plan for a consistent way to manage Wide Data, enforce standard data governance techniques and ensure that the technical implementation of the data fabric in the organization is built intelligently to scale both wide and deep. After all, intelligence is more about synapses than neurons. If two organisms have the same number of neurons, the one with more synapses will always be more intelligent.

About The Authors:

Gavin Kaimowitz is a Director leading Sapient Global Markets’ Data Management Practice across the capital and commodity sectors in Europe. Gavin is responsible for the collation and generation of best practices, thought leadership and strategy. He has a proven track record in the reference and market data domain of solution design, business case definition, strategic roadmap design and in building data management products. gkaimowitz@sapient.com

 

Simon Lyon has worked in investment banking for 25 years and has driven the standard for client document management and the rules for account opening, KYC and AML. Simon specializes in structured and unstructured data and developed a research aggregation platform with Smartlogic (ontology and NLP) and MarkLogic (search and XML database). slyon@sapient.com

Geoff Cole is a Director within Sapient Global Markets’ Market Initiatives practice focusing on helping asset managers and investment banks shape operations and technology strategy, evolve product offerings, and respond to regulatory change. Geoff has recently supported several projects focused on developing strategies for implementing flexible enterprise data management solutions. gcole@sapient.com

Salon.com

Cyber attacks could cause the next world war

Officials say a “cyber-Pearl Harbor” would devastate our infrastructure, potentially resulting in mass casualties

 

By Jeb Boone

Wednesday, May 22, 2013 12:33 PM EDT

Global Post

 

ATLANTA, Georgia — It’s a new breed of warfare, unlike anything you’ve ever seen.

It can threaten a nation’s core security, cause mass casualties and weaken the economy, according to the Government Accountability Office, the US Congress’ research arm.

Assailants “could gain control of critical switches and derail passenger trains, or trains loaded with lethal chemicals. … They could contaminate the water supply in major cities,” then-US Defense Secretary Leon Panetta said last October. Foes could take down electric or water systems, fomenting public panic, reaping high death tolls and causing high physical and economic costs.

It might not even be immediately clear who was behind the attack, or where it was coming from. This devastating power could be wielded with comparatively few operatives, or without the support of a national government. And the massive kinetic strength of the US military would be essentially helpless in thwarting it.

For more than two decades, internet-based attacks have been relatively infrequent and mostly low level. Now, many experts caution that the specter of cataclysmic cyber war is upon us.

Not everyone agrees with the perilous scenarios; prominent dissenters contend that governments are peddling a trumped-up digital disaster threat to justify privacy intrusions.

But US officials use phrases like “cyber-Pearl Harbor” to describe the threat hackers pose to the critical infrastructure — electricity, water, trains, oil and gas pipelines — and the information networks that run the economy.

World governments have begun taking the threat of cyber war seriously. New specialized military units like the US Cyber Command, South Korea’s Cyber Warfare Command and NATO’s Computer Incident Response Capability have all begun preparing cyber soldiers.

Responding to the founding of US Cyber Command, China established the now infamous division of the People’s Liberation Army dedicated to “defense” against cyber threats.

 

Cyber war 101

One factor that sets cyber assault apart from other forms of warfare is the relative ease in launching it. Inducing a catastrophic infrastructure failure may only demand one small change in a line of code.

“We don’t even know if you have to have really good network intelligence, be sustainable in your attacks or have persistent access,” said Timothy Junio, a research fellow at Stanford University’s Center for International Security and Cooperation.

In a nightmare scenario cited by US President Barack Obama, trains carrying hazardous chemicals could derail, contaminating water supplies. Obama last year wrote an op-ed for the Wall Street Journal cautioning, “The lack of clean water or functioning hospitals could spark a public health emergency. And as we’ve seen in past blackouts, the loss of electricity can bring businesses, cities and entire regions to a standstill.”

Despite the magnitude of the threat, experts contend that the US is woefully under-protected.

They say computer systems that manage critical infrastructure are plagued by security vulnerabilities that would shock anyone with a rudimentary understanding of how to secure a personal computer, let alone a power grid.

A surprising number of systems use passwords hardcoded by the manufacturer — available to hackers via Google search. Other systems use unchanged default usernames and passwords like “admin/admin.”

If passwords aren’t publicly available, other glaring vulnerabilities often remain, such as systems “connected to the internet that shouldn’t be; people using a workstation that handles physical control at a plant to access their [email],” said Junio.

Users casually browsing the internet on infrastructure workstations need only download a malicious email attachment or click a single malicious link to compromise the security of an entire infrastructural system.

“All technical experts agree that critical infrastructure in the US is highly vulnerable,” said Junio. “I can’t think of any technical study where someone has done penetration testing against a critical infrastructure site and came back saying ‘yes this is fine.'”

Part of the reason systems are so vulnerable is that they were created before widespread use of the internet, and were never designed to be secure in the first place.

“You’re taking a system that wasn’t meant to be available and now you’re making it available, everywhere,” said Kevin Albano, a manager of security firm Mandiant’s Threat Intelligence division.

Given that infrastructure systems are remarkably unguarded, the other major hurdle for cyber warriors to surmount is finding the right networks.

There are plenty of readily available tools that can help.

One of the most effective is called SHODAN. It’s available to anyone with web access. SHODAN is used by information security experts to assess whether networks are secure. But like any penetration-testing tool, it can also be used by hackers in security breaches.

“SHODAN is a search engine for machines connected to the internet. It could be anything from a webcam to a photocopier. It scours the internet looking for IP addresses associated with machines,” Junio said. “SHODAN enables hackers to look for targets worldwide, in an automated way, and it’s perfectly legal.”

Junio noted that during a trip to Taiwan, he found more than 6,000 Taiwanese infrastructure control systems listed in SHODAN — without the government knowing this was a security problem.
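The same searches can be scripted. The sketch below uses the official shodan Python client (pip install shodan), assuming you have an account and API key; the Modbus query is just one example of the device-level searches the article describes, and such queries should only be run to assess networks you are authorized to test.

```python
import shodan

API_KEY = "YOUR_API_KEY"  # assumption: a key issued with your Shodan account
api = shodan.Shodan(API_KEY)

try:
    # Example query: devices answering on TCP port 502, the default for
    # Modbus, a protocol common in industrial control systems.
    results = api.search("port:502")
    print("devices found:", results["total"])
    for match in results["matches"][:5]:
        print(match["ip_str"], match.get("org", "n/a"))
except shodan.APIError as exc:
    print("query failed:", exc)
```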

In addition to SHODAN, one of the simplest, most common methods used by hackers to gain access to critical infrastructure is a spear phishing attack.

Spear phishing is often successful because it needs to fool only one employee to grant hackers access to an entire system. It works like this: Posing as a colleague, hackers send emails to employees of a utility, asking them to log in to a linked site, using their company username and password. When the unwitting employee logs in, the hacker harvests their password.

Because most people use only one or two passwords for all of their online or company accounts, a single password could give the hacker a way into a system controlling the utility.

Spear phishing attacks are incredibly difficult to defend against because they exploit the likelihood that at least one employee will be fooled.

 

Once inside, a small change in a system can potentially cause cascading failures, just as a bird can disrupt electricity for thousands by flying into a transformer.

That’s because infrastructure systems “are extremely intertwined,” said Robert Bea, a risk-assessment expert and professor at the University of California, Berkeley. “Should one piece of a system fail, you end up with these cascades, sort of like a game of dominos.”

“It doesn’t take anything horribly catastrophic to initiate an infrastructure disaster. Using cyber attack methods, individuals with malicious intent could determine the most efficient way to trigger multiple infrastructure failures,” Bea added.

 

What would a cascading failure brought on by a cyber attack look like?

According to Bea, it would be very similar to those brought on by natural disasters. To estimate the damage that could be inflicted by a cyber attack triggering a cascading failure, look no further than New Orleans.

“The best reference for me will be Hurricane Katrina and the flood protection system for the Greater New Orleans Area … Katrina caused a cascade of infrastructure failures that affected the city for months, years. Some are still not working properly,” Bea said.

The catastrophic failure of the New Orleans flood-levee system led to the deaths of more than 1,500 people, in addition to untold billions in economic and environmental damage.

If the catastrophic failure had stemmed from a cyber attack on the systems that controlled the flood levees, the aftermath in New Orleans might have been similar to the failure caused by the hurricane.

Cyber security vulnerabilities in infrastructure systems may be only one problem among many facing aging infrastructure, but the power to unleash another Katrina may rest with hackers, state-sponsored or independent, wielding powerful pieces of malware.

That might sound abstract, at least until a major assault occurs. But in 2012, US computers were the target of nearly 9 million malware attacks. And more recently, an attack in South Korea took banks down for days.


North Korea: How the least-wired country became a hacking superpower

North Korea’s cyber-war strength says a lot about the future of global hostilities.

http://www.globalpost.com/dispatch/news/regions/asia-pacific/north-korea/130517/north-korea-least-wired-hackers

Geoffrey Cain

May 22, 2013 06:01

 

SEOUL, South Korea — This year, North Korea has been flaunting its nuclear hardware in an effort to extort concessions from the United States and South Korea.

But the tactic has failed to provoke panic for one key reason: Officials doubt that Pyongyang would be stupid enough to start a nuclear war.

While nukes are better seen than used, and thus of limited blackmail value, dictator Kim Jong Un possesses a quieter weapon that’s more readily unleashed — and has already become a serious nuisance: cyber war.

Experts say Pyongyang typically deploys it about once a year, although it’s not always clear that North Korea is behind the attacks.

The most recent offensive hit Seoul in March 2013. The strike disabled anti-virus software, brought down ATMs across the country and froze online banking systems for days. About 30,000 computers had their hard drives wiped and went dead.


In an Austin Powers-style twist, the malicious software displayed pixelated skulls on the monitors of infected machines.

After initially saying the strike originated in China, officials tracked it to a specific Pyongyang neighborhood. A month before the assault erupted, they said, hackers had quietly planted a simple but devastating software program on computers at three South Korean television broadcasters and three banks. Authorities identified the code as a hard-drive wiper called “DarkSeoul,” first spotted a year ago.

Although this type of virus is relatively simple and has been around since the early 1980s, experts acknowledged that its impact was devastating. A computer security expert from Cisco, Seth Hanford, wrote that the “highly targeted” attack led to significant downtime and a “severe” loss of data.

On April 12, North Korea denied it was the culprit, but the South has maintained the accusation.

Although North Korea is among the poorest and most isolated countries, it is surprisingly adept at hacking — a testament to how dangerously accessible cyber warfare is to anyone who wants to pursue it.

Training a cyber brigade, it turns out, does not demand high levels of tech sophistication, and is a handy way to pester a far stronger foe.

A convenient arsenal

On the Korean battlefield — which remains manned 60 years after the end of the shooting war that divided the Koreas — the North is indisputably outgunned and outmaneuvered. That fact has led Pyongyang to adopt a modified guerrilla warfare strategy. As the Pentagon described it in a May report to Congress: “North Korea uses small-scale attacks to gain psychological advantage in diplomacy and win limited political and economic concessions.”

In the 1970s and 1980s, Pyongyang sent agents on risky operations to sabotage South Korean targets and hijack a South Korean civilian airliner. In March 2010, the North sank a South Korean naval corvette, leaving 46 sailors dead, and in November 2010 it launched an artillery barrage at an island near the DMZ.

Strikes like these, however, can provoke dangerous retaliation. In contrast, cyber warfare supports the nation’s military strategy, and carries less risk.

A digital offensive requires a “very low developmental cost and can bring catastrophic results,” said Hyeong-wook Boo, an analyst at the Korea Institute for Defense Analyses, a think tank in Seoul. “The North Korean espionage team sees cyberspace as a very favorable place for its activities.”

The threat has been looming since the late 1990s, when North Korea unleashed its first basic distributed denial-of-service (DDoS) attacks on its neighbor. Since then, the computer plots have become somewhat more sophisticated, targeting South Korean banks and businesses with malware and throwing the occasional wrench in the markets.

According to the National Intelligence Service, South Korea’s spy agency, the North was probably behind six cyber attacks from 2008 to 2012. Two of the largest came in 2009 and 2011, when Seoul accused the North of sneaking malware into its biggest banks and attacking government websites.

In the first of these, the US was also a key target.

Starting on July 4, 2009, hackers activated a “botnet” of 50,000 hijacked zombie computers to coordinate three waves of assaults targeting the public websites of the Pentagon and White House. The denial of service attacks also disrupted the websites of the South Korean intelligence agency and a major South Korean newspaper, but did not bring them down completely.

Two years later, Seoul accused North Korea of unleashing a far stronger salvo of denial-of-service attacks on government and banking cyber-networks. The South Korean government said that North Korean hackers had gained control of the laptop of an IBM employee, who was a cyber security contractor for the large Korean bank, Nonghyup. (IBM did not respond to calls seeking comment.)

The intruders managed to access the bank’s entire system. The attack was contained by government-backed antivirus programs, but authorities admitted they were worried by the magnitude of the onslaught.

The March 2013 attack turned out to be the most devastating so far. The episode, if committed by the North Korean military, demonstrated that while North Korea’s hacking brigade has yet to achieve great sophistication, it has the potential to wreak havoc with a well-placed and well-timed assault.

“They targeted the spots they’ve always wanted to target,” said Andrei Lankov, a North Korea expert at Kookmin University in Seoul. “It is highly likely that this was committed by North Korea. They hit the banks, because they’ve always wanted to inflict damage on the South Korean economy. Their reason for attacking the media was to show contempt for them as mouthpieces.”

Un-wired

Remarkably, Pyongyang has emerged as a major force in digital war despite being a cyber exile.

According to World Bank statistics, internet use in North Korea is the lowest in the world, at 0 percent of the population. Rounding error renders that figure slightly misleading: The regime does in fact dole out global internet access to trusted cadres. Experts estimate that the number of users is in the hundreds.

Owning a private computer is banned, but the regime has distributed about 4 million computers to its 24.4 million citizens, who are allowed to access a handful of closed-off intranets closely vetted by the government, using its own operating system, called “Red Star.”

There’s only one internet café, in Pyongyang.

How North Korea conjures a skilled cyber brigade from its decidedly unwired ranks is impossible to know for certain, but experts have pieced together a picture based on reports from refugees and on the methods used to infiltrate southern computer systems.

The North Korean military runs a cyber warfare command officially known as “Unit 121,” reporting to the all-powerful General Bureau of Reconnaissance, alleges the North Korea Intellectuals’ Solidarity, a group of professors and intelligentsia who have escaped to the South.

The reconnaissance body is the North’s top spy agency, thought to have masterminded a line-up of conventional and cyber strikes on the South. Its chief, General Kim Yong-chol, is believed to have devised the sinking of a South Korean corvette.

The size of the cyber brigade and the nature of their work remain a matter of debate. Kim Heung Gwang, a former computer science professor in Pyongyang and head of the defectors’ group, told GlobalPost that Unit 121 consists of two otherwise-nondescript buildings in a suburb of Pyongyang. Other defectors have said the North Korean military harbors between 500 and 3,000 battle-hardened techies in the unit.

The regime takes notice of children who show mathematical talent, and gives them rigorous training at elite elementary and middle schools, defectors say. The brainy sprats later make their way to elite North Korean universities – such as Kim Il Sung and Kim Chaek University of Technology — from which they are formally recruited into elite cyber circles.

Not all of the computer students go on to the military. Many prodigies end up serving their country in less sinister ways. Some eventually join a handful of semi-public firms, such as Nosotek, a German-founded company that programs mobile phone applications and pays excellent wages by North Korean standards.

But a lot of them prefer crafting viruses, hoping that it will land them lucrative jobs among the power elite, reports the North Korean news blog New Voices International, a website that interviews defectors.

At least one hacker defected to the South by way of Southeast Asia in late March 2013, according to the South Korean government. (GlobalPost’s efforts to contact the hacker through two defector organizations were unsuccessful.)

Most refugees are poor, or come from backgrounds that are politically disfavored in the far north, where a nationwide caste system, known as songbun, prevails. The hackers, on the other hand, live well, in communal homes that are luxurious by North Korean standards, according to the Intellectuals’ Solidarity. Because they live comfortably, they have less reason to flee.

But others contend the North Korean cyber threat is actually a paper tiger. Joo Seong-ha, a North Korean refugee and journalist, wrote in the conservative newspaper, Dong-a Ilbo, that the country is home to 10 teams of five or fewer cyber-warriors each. He also said that, according to his own interview with a hacker who defected, aging conservative leaders hardly offer them support because they don’t grasp the concept of cyber-warfare.

The source claimed that the regime sends 10 computer engineers each year to study in India, a coveted destination for training. While many see North Korea as an isolated state, it is common for the government, and organizations affiliated with top leaders, to send citizens overseas for all sorts of goodwill exchanges and even art projects. 

Seoul’s cyber defense

For years, experts have urged the South Korean government and private companies to step up the protection of their IT systems.

By some measures, South Korea tops Asian countries in prevalence of computer viruses and malware. In April 2013, Microsoft reported that South Korea had the highest number of computers reporting the detection and removal of malware using its antivirus software.

To hold off incursions from the North, Seoul recently announced it was improving security at its 23 nuclear power plants by separating their networks from the internet — a key step in preventing such utilities from being brought down by hackers.

The US and South Korea have also announced a new cyber-defense counterstrategy, but have refrained from disclosing the details. “The US and South Korean militaries will cooperate to develop diverse deterrence scenarios against hacking attacks and increase anti-cyber warfare forces to over 1,000 to better deal with emerging threats from countries like North Korea,” a defense ministry spokesman told IDG News Service in Seoul.

 

New Global Navy Challenges U.S. Dominance on the Seas

GovExec

By Steve LeVine

May 17, 2013

 

Offshore from Syria, Russia’s navy is conducting probably its largest naval deployment outside its own waters since the Soviet breakup. The Chinese navy is in another potential confrontation today with Japan in the East China Sea, raising questions about where it is headed next.

But the BRIC nations as a whole—a force in the global economic conversation since the acronym was coined by Goldman Sachs to refer to the high-growth economies of Brazil, Russia, India and China—are becoming an increasing naval presence on the high seas. One reason is simple nature—when nations become wealthier, they tend to build up their fighting capabilities. But another is natural resources—all four nations either want to buy or sell oil and natural gas, and they are venturing further and further to do so.

A paradox is that while the shift challenges US primacy on the high seas, the US itself—because of its oil and gas boom—is driving part of the BRIC naval expansion.

Because it is providing for more and more of its own energy requirements, the US is importing much less African and Middle East crude, and the chief new buyers replacing it are BRIC nations—the US is about to be displaced by India as the largest buyer of Nigerian crude oil, for example. “It is only a matter of time before we see Indian ships in the South Atlantic [to patrol the coast of West Africa],” Brahma Chellaney, of the Center for Policy Research in New Delhi, told the Financial Times.

http://www.govexec.com/technology/2013/05/new-global-navy-challenges-us-dominance-seas/63240/

 

DISA Cancels $45 Million Award to Store Exabytes of Drone and Satellite Images

NextGov

By Bob Brewin

May 21, 2013

 

The Defense Information Systems Agency has abruptly canceled a $45 million sole source contract awarded in April to Alliance Technology Group to store hundreds of billions of satellite and drone imagery files, each packing terabytes of data.

DISA said in documents posted Monday that after it announced its decision on April 4 to award the Large Data Object Storage contract to Alliance, a small, disadvantaged business located in Hanover, Md., other vendors responded. Based upon the capability statements and responses it received, the agency said, it planned “to pursue competitive means through the National Security Agency Acquisition Resource Center to satisfy the requirement.”

DISA said previously it would use the commercial cloud provided by Alliance to store imagery from intelligence, surveillance and reconnaissance, or ISR, platforms, including from drones and satellites that transmit, among other things, full motion video files that contain megabytes or more of data, depending on the length of the transmission.

 

Alliance partners with Hewlett-Packard and IO Data LLC to provide cloud storage services for petabytes of data. Competitors include EMC Corp. of Hopkinton, Mass.; Caringo Inc. of Austin, Texas; DataDirect, Chatsworth, Calif.; and San Francisco start-up Scality.

DISA expects the ISR cloud to exceed an exabyte, or one million terabytes, within the next few years and says it “may grow to 3-4 exabytes in the out years,” though the agency did not specify how many years.
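Back-of-the-envelope arithmetic makes the scale easier to grasp. The file count and average size below are illustrative assumptions chosen to match the orders of magnitude DISA describes, not agency figures:

# rough sanity check on the storage scale described above
files = 200e9                 # "hundreds of billions" of imagery files (assume 200 billion)
avg_bytes = 5e6               # assume an average of 5 megabytes per file
total_bytes = files * avg_bytes

EXABYTE = 1e18                # one exabyte = one million terabytes = 10**18 bytes
print(total_bytes / EXABYTE)  # -> 1.0, i.e. roughly one exabyte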

http://www.nextgov.com/cloud-computing/2013/05/disa-cancels-45-million-contract-store-exabytes-drone-and-satellite-images/63401/

 

Massive Catalog of Streaming Government Data Coming Thursday

Nextgov

By Joseph Marks

May 22, 2013

 

Government data officials have nearly completed an exhaustive list of nearly 300 application programming interfaces that will allow outsiders to stream up-to-date information from government agencies straight to their computers, websites and mobile apps.

The final version of the federal API catalog will be released Thursday on the government dataset trove Data.gov to mark the one-year anniversary of the White House’s federal digital strategy, the site’s administrator Jeanne Holm told Nextgov by email Wednesday.

http://www.data.gov/home

 

A nearly complete version of the API catalog includes hyperlinks to about 280 government APIs, listed individually and broken down by federal department and agency. Holm called the current site a “transparent work in progress.” Officials will continue to add more APIs to the list after Thursday as agencies launch them, she said.

An API is essentially computer code that allows one machine to automatically gather updated information from another. A community organization could use the API for a national farmers’ market database recently launched by the Agriculture Department, for instance, to stream information about local farmers’ markets on its website.
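In practice, “streaming” data through an API usually means nothing more exotic than a program requesting a machine-readable feed over HTTP. Here is a minimal sketch in Python; the endpoint URL, parameters and response fields are hypothetical stand-ins, not the actual USDA interface:

import requests  # widely used third-party HTTP library (pip install requests)

# hypothetical endpoint standing in for a government API such as the
# farmers' market database mentioned above
URL = "https://api.example.gov/farmersmarkets/v1/search"

def nearby_markets(zip_code):
    """Request farmers' markets near a ZIP code and return them as a list."""
    response = requests.get(URL, params={"zip": zip_code}, timeout=10)
    response.raise_for_status()        # fail loudly on HTTP errors
    return response.json()["results"]  # assumes a JSON body with a 'results' key

for market in nearby_markets("87544"):
    print(market)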

APIs were a key component of the digital strategy, which required agencies to have at least two of them up and running by the strategy’s one-year anniversary. (The official deadline arguably won’t come for several months because it was also tied to the six-month anniversary of a government open data policy, due in November 2012, that wasn’t published until earlier this month).

A major goal for the API program is that private sector and non-profit developers will build mobile apps and other products off of streaming government data about home prices, health outcomes and other topics, either to serve the public, to turn a profit or both. One model for the initiative is the multi-billion-dollar industry built off government-gathered Global Positioning System data, which is used by industries ranging from airlines to mobile app developers.

As of Wednesday, the Data.gov API list included the makings of at least two APIs for every Cabinet-level agency and more than a dozen from some agencies such as the Interior Department. Some of the links led to works in progress, though, rather than completed APIs. Several led to static spreadsheets, for instance, and lacked the key API ingredient — detailed instructions for how to write a script that will continuously pull new information from a particular data source.
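Those missing instructions usually boil down to a polling loop that re-fetches the source on a schedule and keeps track of what it has already seen. A sketch, reusing the hypothetical endpoint above:

import time
import requests

URL = "https://api.example.gov/farmersmarkets/v1/search"  # hypothetical, as above

def poll(interval_seconds=3600):
    """Re-fetch the data source every hour and report entries not seen before."""
    seen = set()
    while True:
        payload = requests.get(URL, params={"zip": "87544"}, timeout=10).json()
        for market in payload["results"]:
            key = str(market.get("id", market))
            if key not in seen:
                seen.add(key)
                print("New entry:", market)
        time.sleep(interval_seconds)

poll()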

 

Air Force Museum to get fourth building

Published: Wednesday, 22 May 2013, 5:09 PM EDT

Alexandra Newman

 

DAYTON, Ohio (WDTN) – Officials at the National Museum of the U.S. Air Force announced on Wednesday the museum is getting a new fourth building.

The building will include aircraft from the Research and Development Gallery, along with a new Presidential Gallery, an expanded Space Gallery and select Global Reach planes. The addition of many of the R&D aircraft will allow visitors to see the popular XB-70, X-1B and the “flying saucer-like” Avrocar, among others.

Previously, museum visitors without military base access were required to ride a shuttle bus to the Presidential and R&D Galleries, which are currently located on a controlled-access portion of Wright-Patterson Air Force Base. That part of the base was closed on May 1 until further notice as part of budget reduction requirements due to sequestration.

The new fourth building will give the museum more educational outreach opportunities and make visits more convenient for the public. Construction will begin in 2014, and the building will open to the public in 2015. The Air Force Museum Foundation continues to fundraise for the project, with $38 million in cash and pledges secured toward its $46 million goal.

Museum officials say that the fourth building will help tell the Air Force story with the addition of many rare, one-of-a-kind aircraft.

 

 

Will Vertical Turbines Make More of the Wind?

A Caltech researcher thinks arrays of tiny wind turbines could produce cheaper power than big ones.

From MIT Technology Review

By Kevin Bullis on April 8, 2013

http://www.technologyreview.com/news/513266/will-vertical-turbines-make-more-of-the-wind/#ixzz2UDitpFUa

 

Why It Matters

Though competitive in some cases, wind power needs to get cheaper still to displace large amounts of fossil fuels.

The remote Alaskan village of Igiugig—home to about 50 people—will be the first to demonstrate a new approach to wind power that could boost power output and, its inventors say, just might make it more affordable.

For decades, the trend across the wind industry has been to make wind turbines larger and larger—because it has improved efficiency and helped lower costs.

John Dabiri, a professor of aeronautics and bioengineering at Caltech, has a heretical idea. He thinks the way to lower the cost of wind power is to use small vertical-axis wind turbines, while using computer models to optimize their arrangement in a wind farm so that each turbine boosts the power output of its neighbors.

Dabiri has demonstrated the basic idea at a 24-turbine test plot in southern California. Grants totaling $6 million from the Gordon and Betty Moore Foundation and the U.S. Department of Defense will allow him to see if the approach can lower wind power costs in Igiugig. The first 10 turbines will be installed this year, and the goal is to eventually install 50 to 70 turbines, which would produce roughly as much power as the diesel generators the village uses now. Dabiri is also installing turbines at an existing wind farm in Palm Springs, California, using his models to generate power by putting up new turbines between existing ones.

Ordinarily, as wind passes around and through a wind turbine, it produces turbulence that buffets downstream turbines, reducing their power output and increasing wear and tear. Dabiri says that vertical-axis turbines produce a wake that can be beneficial to other turbines, if they’re positioned correctly.

The blades of this type of wind turbine are arranged vertically—like poles on a carousel rather than spokes on a wheel, as with conventional wind turbines. Wind moving around the vertical-axis turbines speeds up, and the vertical arrangement of the blades on downstream wind turbines allows them to effectively catch that wind, speed up, and generate more power. (The spinning blades of a conventional wind turbine would only catch some of this faster wind as they pass through it—this actually hurts the turbine’s performance because it increases stress on the blades.) The arrangement makes it possible to pack more turbines onto a piece of land.

Dabiri’s wind turbines are 10 meters tall and generate three to five kilowatts, unlike the 100-meter-tall, multi-megawatt machines in conventional wind farms. He says the smaller ones are easier to manufacture and could cost less than conventional ones if produced on a large scale. He also says maintenance costs could be less because the generator sits on the ground, rather than at the top of a 100-meter tower, and thus is easier to access. The performance of the wind farm at Igiugig will help determine whether his estimates of maintenance costs are correct.
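Those output figures square with the standard formula for power in the wind: one-half times air density times swept area times wind speed cubed, times an efficiency factor. A worked example in Python; the rotor diameter, wind speed and efficiency are assumed values for illustration, not numbers from the article:

# P = 0.5 * rho * A * v**3 * Cp
rho = 1.225               # air density at sea level, kg/m^3
height = 10.0             # rotor height in meters (from the article)
diameter = 1.2            # rotor diameter in meters (assumed)
area = height * diameter  # a vertical-axis rotor sweeps a roughly rectangular area
v = 11.0                  # wind speed in m/s (assumed rated speed)
cp = 0.3                  # power coefficient (assumed; the theoretical Betz limit is ~0.59)

power_watts = 0.5 * rho * area * v**3 * cp
print(round(power_watts))  # ~2935 W, consistent with the 3-to-5-kilowatt rating quoted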

Dabiri says small, vertical wind turbines have other advantages. While the noise of conventional wind turbines has led some communities to campaign to tear them down, his turbines are “almost inaudible,” he says. They’re also less likely to kill birds. And their short profile has attracted a $1 million grant from the Department of Defense to study their use on military bases. Because they’re shorter, they interfere less with helicopter operations and with radar than conventional wind turbines.

The approach, however, faces some challenges. Vertical-axis wind turbines aren’t as efficient as conventional ones — half of the time the blades are actually moving against the wind, rather than generating the lift needed to spin a generator. As the blades alternately catch the wind and then move against it, they create wear and tear on the structure, says Fort Felker, director of the National Wind Technology Center at the National Renewable Energy Laboratory. Dabiri, and researchers such as Alexander Smits at Princeton University, say they are working on improved turbine designs to address some of these issues.

Felker notes that Dabiri’s approach would also require installing a thousand times more wind turbines, meaning potentially millions of turbines rather than thousands to generate significant fractions of the U.S. power supply. And he notes that, over the last several decades, the wind industry has demonstrated that making ever-larger wind turbines lowers costs (“Novel Designs are Taking Wind Power to the Next Level,” “Supersized Wind Turbines Head Out to Sea,” and “The Quest for the Monster Wind Turbine Blade”). “Going in the other direction, I believe, will not be successful,” he says. “I don’t think the math works out.”

Felker thinks that Dabiri’s approach might prove fitting for small, isolated places like Igiugig, where simpler construction and maintenance might be important. “But if you’re trying to transform the overall energy economy,” he says, “you’ve got to go big.”

 

 

New Mexico drone testing center could soon face competition

Dan Mayfield, Reporter, Albuquerque Business Journal

May 22, 2013

http://www.bizjournals.com/albuquerque/news/2013/05/22/drone-test-center-could-face-competition.html?page=all

 

New Mexico could lose its lead on unmanned aerial vehicle development when the Federal Aviation Administration lays out a new set of rules that allow more airfields to test UAVs, a state leader in the industry said.

The Unmanned Aircraft Systems Flight Test Center, an airfield run by New Mexico State University and the FAA, is the only FAA-approved UAS, or drone, flight test center in the nation. The center logs data from about two flights daily of commercial drones, which weigh anywhere from 15 ounces to 600 pounds.

But last year the FAA said it would authorize six more sites for drone testing.

“As soon as the FAA opens the door, there will be an explosion,” said Dennis “Zak” Zaklan, the director of the UAS FTC. “At this point, nobody knows. It depends on the FAA. It creates competition.”

There are six sites under consideration for drone testing across the country, from South Dakota to Florida and Hawaii.

“Unmanned aircraft can help us meet a number of challenges, from spotting wildfires to assessing natural disasters,” said U.S. Transportation Secretary Ray LaHood in a press release about the sites. “But these test sites will help us ensure that our high safety standards are maintained as the use of these aircraft becomes more widespread.”

The UAS FTC, based in Las Cruces, can fly unmanned vehicles in a 15,000-square-mile airspace stretching from the border north to Socorro and west to Lordsburg.

“Nobody will have the airspace like we have,” Zaklan said.

The FAA rules have not been released yet, and the administration’s goal is to designate the six new sites by 2015.

The UAS FTC has garnered attention, including from Fortune magazine, which named New Mexico as one of the nine places where the drone industry is expected to grow.

The New Mexico Economic Development Department also touts the site in its marketing materials.

So far, Zaklan said, the site has tested drones for several entities, including AAI, Aerovironment, Aurora Flight Sciences, the Department of Defense and NMSU’s own Aerostar drones.

The center has five full-time employees and seven part-time employees, who have flown more than 400 missions for the FAA.

“The civil uses [for UAVs] are growing tremendously,” Zaklan said.

But, he added, “There’s a lot of manufacturers sitting on the fence waiting on the FAA.”

 

 

9 states poised to dominate the drone economy

These areas are likely to reap the benefits of the boom in unmanned aircraft.

By Clay Dillow, @FortuneMagazine – Last updated May 13, 2013, 03:27 PM ET

 

The West Coast

More specifically, California and Washington State. The AUVSI report lists them as the No. 1 and No. 2 states to benefit from job creation and increased economic activity resulting from the UAS industry, which should come as no surprise considering that the duo are also the top two states for the larger aerospace industry. Though Boeing (BA, Fortune 500) is headquartered in Chicago, its major manufacturing operation is based in Washington, along with subsidiary Insitu, which specializes in unmanned aircraft. Washington is also home to Pacific Northwest National Lab, a federally funded lab under the Department of Energy whose partial mission is to develop counterterrorism technologies through enhanced information gathering and information analysis — activities for which UAS are well-suited. A consortium including PNNL is currently lobbying the FAA to place a test site in central Washington, which would solidify its place as a UAS industry center of gravity.

 

But the beating heart of the UAS industry on the West Coast and in the nation at large is farther down the coast in California, where powerhouse defense research labs belonging to the likes of Northrop Grumman (NOC, Fortune 500) and Lockheed Martin (LMT, Fortune 500) are scattered among storied military flight-test facilities like those at Edwards Air Force Base. Where defense is concerned, California already has the U.S. drone industry cornered. General Atomics’ Predator and Reaper drones, now famous/infamous for their roles in the CIA’s targeted strike programs in places like Yemen and Pakistan, roll off the assembly line in California, and small UAS maker Aerovironment — based in Monrovia, Calif. — supplies the Department of Defense with the vast majority of its unmanned aerial systems, mostly small surveillance drones like the man-portable Raven UAS that infantry can carry in a backpack and launch by tossing into the air like a football (pictured above). California is also aggressively pursuing an FAA test site designation, and with its gentle climate and varied geography (including plenty of maritime environment) it seems a strong candidate to receive it. The state is the center of the UAS industry as it stands right now, and that’s unlikely to change in the foreseeable future.

 

Texas

Texas is also riding high on AUVSI’s state-by-state UAS industry rankings at No. 3. That’s largely because, like California, Washington, and Florida, its existing aerospace and defense industries are strong — Lockheed Martin has a large presence there, as does Bell Helicopter, General Dynamics (GD, Fortune 500), Boeing, Raytheon (RTN, Fortune 500), Rockwell Collins (COL, Fortune 500), and many others. The state of its engineering education and workforce is likewise strong, with universities like Texas A&M taking an active interest in UAS development and traditional aerospace anchors like Johnson Space Center drawing the right kind of educated workforce to the state.

However, civilian skies are not poised to be filled with militarized MQ-9 Reaper drones — rather, small UAS (under 55 pounds) will drive the coming drone industry boom, and Texas has its fair share of small UAS makers emerging as well. Despite its name, Conroe’s Vanguard Defense Industries also makes small unmanned helicopters suited for non-military applications like facility security, surveying and cartography, infrastructure inspection, and other domestic/civilian uses. Likewise, Austin’s DJI Inc. produces small UAS suitable for similar aerial photography roles as well as onboard autopilot modules and other flight components for small UAS.

 

But that’s still not the most notable thing about Texas. On top of its generally business-friendly climate, deep aerospace roots, and a varied geography and climate (for flight test purposes), Texas as a state is fairly friendly to drones. Sure, there are a couple of voices in Austin calling for UAS restrictions arising from privacy concerns, but Texas has been quite progressive where public safety UAS are concerned. A Houston-area Sheriff’s department has operated an unmanned helicopter for more than a year (though it recently ran into a bit of trouble with the FAA), and Arlington police have secured certification from the FAA to operate two unmanned helicopters during police operations there. That may not sound like much, but in terms of drone adoption it’s pioneering. While some states — Idaho and Virginia come to mind — are already passing laws to restrict drone usage before they are even being used, from a policy standpoint Texas is thus far keeping a very open mind. Ultimately that’s going to be very important to companies looking for states in which to set up shop.

 

Ohio & Indiana

California, Washington, Texas, and Florida might seem like safe bets for states that stand to prosper from an aerospace-related boom, and they are, for various reasons. They’re the top-ranking states in the AUVSI analysis (followed by Arizona, Connecticut, Kansas, Virginia, New York, and Pennsylvania). But as mentioned previously, there are still plenty of variables that could shift the national composition of UAS-related jobs, revenues, federal and state investment, and overall economic impacts. And if you’re not at the top of the pack where existing aerospace infrastructure is concerned, you can always up your drone economy cred by joining up with the state next door.

That’s exactly what Ohio and Indiana have done. Long economically linked by their shared Rust Belt heritages, Ohio and Indiana have combined their bids for an FAA test site. And there’s no reason to think this dark horse candidate might not win it. Both are middle-tier aerospace states by most metrics, but together they make a more attractive package. The test site would likely be just outside of Dayton, Ohio, near Wright-Patterson Air Force Base, a hub for UAS sensor payload research and development and home to the Air Force Research Laboratory, which is exactly what it sounds like. Given that America’s armed forces are now training more drone pilots than fighter pilots, one can easily surmise what kind of research is going on there.

Moreover, if California and Florida are attractive for their year-round mild weather, Ohio and Indiana are attractive for the opposite reason: The weather there can be pretty rough. UAS need to be able to operate reliably in all weather conditions, and an Ohio/Indiana test site could subject them to all four seasons, including that brutal Great Lakes winter sleet/snow/wind/slush that challenges any aircraft, manned or unmanned. This partnership has a solid shot at winning an FAA test site designation, and if it does it could pull a lot of research and development resources into its orbit.

 

Florida

Florida possesses a lot of the same advantages as California — a robust existing aerospace and military presence, a vast maritime environment for UAS testing, easygoing year-round weather for flight testing. But there’s another reason to like Florida: Education. Between NASA’s Space Coast and a sizeable military aerospace presence, Florida is already home to a workforce of highly-skilled aerospace personnel — many of whom are currently looking for something new to occupy them as the Space Shuttle program has wound down — and it’s generating more all the time. Embry-Riddle Aeronautical University in Daytona Beach is one of the few American universities offering a specialized program in UAS design and operations — something UAS designers and manufacturers desperately need as their industry grows (UAS engineering is more like a blend of the traditional disciplines of aeronautical, electrical, and software engineering, plus a minor in robotics). AUVSI lists Florida as No. 4 in its ranking of states poised to benefit from the integration of UAS into the national airspace, but with such a solid aerospace engineering foundation, it wouldn’t be surprising if it were to challenge Texas for its number three slot.

 

New Mexico

New Mexico is an interesting case because of its overall averageness, save a few key aspects. In the larger national aerospace picture, New Mexico sits right in the middle of the pack (a 2012 Deloitte evaluation of the U.S. aerospace and defense industry by state ranked New Mexico 29th in aerospace/defense employment, 25th in average wages in the sector, and 34th in total revenues derived from the industry). But with the FAA still evaluating bids for its six UAS-dedicated test sites, New Mexico’s Physical Science Laboratory (administered by New Mexico State University) is home to the only FAA-approved UAS test facility in the U.S. — the Unmanned Aircraft Systems Flight Test Center — where companies from out of state like Aerovironment (AVAV), Boeing, and AAI Corporation have for years trucked their UAS for flight tests. With an operational framework and solid reputation already in place, the Flight Test Center is years ahead of any new test facility that might be designated in the coming months.

Moreover, the U.S. government — particularly its defense infrastructure — has already deemed New Mexico an important state for technology development and has cultivated an educated native workforce there. New Mexico is home to massive test ranges at White Sands (both for military missiles and NASA’s space rockets), Spaceport America, the Air Force Research Laboratory’s Space Vehicles Directorate at Kirtland Air Force Base, Sandia National Laboratories (a government-owned research center operated by Lockheed Martin), and Los Alamos National Laboratory. As a state, it doesn’t have to worry about importing engineering talent or where the FAA chooses to place its new test facilities. It also supports various state tax deductions for aerospace companies as well as incentives for capital investment and workforce development in the field. As such, many aerospace companies large and small maintain outposts in New Mexico, and the climate is favorable to those wishing to launch a UAS concern.

In other words, though the UAS industry isn’t necessarily booming there at present, all the ingredients are there for New Mexico’s drone industry to continue gathering momentum.

 

Oklahoma

Oklahoma is another state that, where the larger aerospace picture is concerned, sits squarely in the middle of the rankings by just about all important metrics. But when one ticks through the list of things that will be important to UAS designers and manufacturers going forward, the Sooner State distinguishes itself in just about every way.

Tinker Air Force Base near Oklahoma City (pictured above) is a major Department of Defense air outpost (the Navy is also resident there) that is home to one of the Air Force’s major aircraft overhaul and repair centers, requiring a local workforce that is aerospace-inclined. The state’s private aerospace sector has long serviced just about every segment of the larger aerospace industry, and about 15 existing companies already work directly in the UAS space on everything from sensor payloads to components for low-cost hobby UAS.

Oklahoma State University is the first four-year university in the nation to offer graduate degrees (both a master’s and a Ph.D.) specific to unmanned aerial systems engineering, and through the school’s University Multispectral Lab the state has long served as a test bed for advanced military technologies, including unmanned systems.

Perhaps most importantly, Oklahoma is aggressively pursuing UAS-oriented aerospace businesses like almost no one else. The state is offering copious incentives for companies in the UAS space to settle there, and it could benefit somewhat from its close proximity to aerospace/defense powerhouse Texas by swaying some companies with roots south of the Red River to open up facilities on the Oklahoma side, especially if Oklahoma lands an FAA test site certification. It’s not in any position to unseat California, Washington, Florida, or Texas at the very top, but among mid-sized, middle-ranking aerospace states with huge potential for growth in the UAS sector, Oklahoma is perhaps the most likeable.

 

Alabama

Where wooing new manufacturing business from out of town is concerned, there’s a lot to be said about Alabama. When European aerospace giant EADS was looking for a location to build a U.S. manufacturing facility for its Airbus A320 airliners, Mobile got the nod. The Mercedes-Benz manufacturing plant in Tuscaloosa has continued to steadily add jobs and facilities since the state successfully convinced the German manufacturer (part of Daimler AG) to locate there back in the mid-1990s. Alabama is aggressive when courting new manufacturing jobs, and it has a proven track record of landing big deals with big companies.

That on its own might not say much for its place in the UAS industry where much of the coming innovation and growth is just as likely to come from smaller aerospace concerns. But Alabama is also notable among U.S. states for ranking fifth in percentage of state GDP derived from the aerospace sector. Going back to the early stages of the space race, Huntsville has been and remains an anchor for U.S. aerospace innovation.

The Army’s Unmanned Aerial Systems Project Management Office is located outside of Huntsville at the Redstone Arsenal and often loops University of Alabama Huntsville students into its work, helping to educate a native local work force with experience in the field (its proximity to Florida’s aerospace-rich employment pool can’t hurt either). Whether or not all of that adds up to Alabama winning an FAA test site designation or whether it will break out to become a major UAS industry hub is still pretty unclear, but as a dark horse candidate there’s plenty to like in Alabama. And if it does win a test site certification the calculus for the state could change dramatically. Right now it’s a sleeper within the industry, but it might just be a sleeping giant.

 

Budget Wars Coming to Early Showdown — and Stalemate

By David Hawkings    Posted at 11:51 a.m. on May 20

http://blogs.rollcall.com/hawkings/budget-wars-coming-to-early-showdown-and-stalemate/

 

A routine committee meeting May 21 will formally lock down this reality about the congressional budget engine: it has totally seized up, and as early as ever — fully 20 weeks before it’s supposed to finish spitting out thousands of line-item decisions about discretionary government spending for next year.

The majority Republicans on House Appropriations will push through the spending caps they will use in drafting the dozen bills expected of them for fiscal 2014. All the Democrats will oppose the numbers, because they completely disregard one of the central tenets of the too-tough-to-swallow sequester that Congress swallowed anyway this year: The spending cuts are supposed to be as severe for defense programs as they are for domestic operations.

Instead, the House will set about drafting three measures — for the departments of Defense, Homeland Security and Veterans Affairs (which also includes military construction) — that in the aggregate would cut spending by less than 1 percent from current levels.

But to hold those national security efforts harmless, while still keeping all their decisions under an overall cap of $967 billion, the GOP appropriators are assigning shriveled-up bottom lines to the other nine bills devoted to domestic spending and foreign aid. They would face cuts of $72 billion, or 17 percent, from current levels. Almost half of the reduction, or $35 billion, would come from imposing a 22 percent cut on the social programs that Democrats are most interested in preserving at the departments of Labor, Education and Health and Human Services.

Appropriations Chairman Harold Rogers, R-Ky., has absolutely no expectation that cuts of that depth to domestic programs will be realized. Instead, his proposed grand totals for the spending bills — known in Hill shorthand as the 302(b) allocations — are obviously his opening bargaining position for the inevitable negotiations at the end of the year with Senate Democrats and the Obama administration.

The numbers are also a signal of how House GOP leaders will seek to leverage that bargaining position this summer: They will push the three national security bills through the House along with Rogers’ favorite domestic measure, for the Agriculture Department. And then their side will stand pat, knowing the rest of the bills have little chance of passing the House.

Rogers’ hope is that, by fall, some sort of deficit-reduction deal might be engineered that does away with the sequester, allows the level of national security spending the GOP insists on and also allows him to ameliorate the non-starter cuts he’s now got penciled in for domestic programs.

At the head of the other side of the bargaining table is Senate Appropriations Chairwoman Barbara A. Mikulski, D-Md., who is preparing to unveil her initial bargaining posture in two weeks, right after the Memorial Day recess. It’s a very safe bet that she will do the opposite of what Rogers did, holding the social programs harmless and calling for defense budgets to take the most significant hits.

The one top-line fact she’s made clear is that her 302(b) allocations will add up to $1.06 trillion. That’s $91 billion, or 9 percent, more than the House bottom line, because Mikulski and the Senate Democratic leadership’s bargaining starts with the assumption that the sequester will be canceled by the fall, with or without a taxes and entitlements budget deal that promises an equivalent amount of deficit reduction.

And, with that as her opening posture, it’s very tough to see how she’ll get any of her bills passed by the Senate, or even through the theoretical filibusters that will surely be mounted by GOP budget hawks.

In other words, a flurry of activity on the spending front in the next month seems guaranteed to produce a long summer and early fall with no overt progress at all. And it’s inevitable that a continuing resolution, or a series of them, will be needed to keep the government open and running in place long past the new fiscal year’s start on Oct. 1.

Another patchwork resolution between Thanksgiving and Christmas is by far the best bet.

 

 

Is This Google X’s Plan to Wire the World?

By Brad Stone

May 23, 2013

http://www.businessweek.com/articles/2013-05-23/is-this-google-xs-plan-to-wire-the-world#r=rss

 

Google (GOOG) Chairman Eric Schmidt’s April 13 tweet was bold, ambitious, and a bit inexplicable. “For every person online, there are two that are not,” wrote the co-author of the book The New Digital Age. “By the end of the decade, everyone on Earth will be connected.”

Commenters were flummoxed by Schmidt’s prediction. There are many parts of the world without reliable telecommunications infrastructure. How do you wire parts of Africa—or the Indonesian archipelago?

In my conversations with Astro Teller, Google X’s excellently named director of moonshots, for this week’s cover story on Google X, I asked whether extending broadband Internet access throughout the world would be a problem deserving of attention from the top-secret lab. Teller gave nothing away, but it was clear from his answer that there’s plenty of passion for that particular goal inside his organization. “Having everyone connected is literally in the same category as making clean water available,” he said. “That sounds like a radical statement but I don’t think that it is. There is now a ton of evidence that connectivity drives freedom, democracy, economic development, health, and those things then turn into lower mortality and all of the things that we are trying to get at here.” Extending connectivity, he added, “is the most direct way, probably on an order of magnitude, to address the world’s biggest problems.”

Researchers who have examined the challenge of spreading Internet access throughout the world usually focus on one of three general solutions. There’s satellite access, which tends to be slow and expensive and doesn’t function well in high-density urban areas. There’s ground-based wireless broadband, the most conventional solution, but in some parts of the world the towers where you would mount broadband transmitters would be quickly scavenged and sold as scrap metal.

And then there’s the unlikeliest but perhaps most promising approach: sending balloons mounted with broadband antennas into the stratosphere, where they can rain connectivity down from only 20 kilometers away. The Europeans tried this a decade ago. Lockheed Martin (LMT) and SoftBank in Japan have experimented with it more recently, with varying levels of success. No one has tried to push balloon-based broadband transmitters into wide production, though. Google X representatives declined to comment on this particular approach, but it fits well with the lab’s philosophy of moonshot thinking and its orientation toward practical yet science-fiction-sounding solutions.
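A rough line-of-sight calculation shows why 20 kilometers is an attractive altitude. From that height a transmitter can in principle see to a horizon roughly 500 kilometers away, a geometric upper bound on coverage (real service areas would be far smaller):

import math

EARTH_RADIUS_KM = 6371.0
altitude_km = 20.0  # stratospheric balloon altitude cited above

# distance to the horizon from altitude h (for h much smaller than R): d = sqrt(2*R*h)
horizon_km = math.sqrt(2 * EARTH_RADIUS_KM * altitude_km)
print(round(horizon_km))  # ~505 km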

David Grace, a senior research fellow at the University of York, spearheaded the European project, part of a multi-country initiative backed by the European Commission. He said that he has indeed heard Google is working on such a project. “They are highly innovative and very obviously have the financial resources and are always keen to take risks,” he said. “It does need the Googles of the world to push this forward.”

Per Lindstrand, the Swedish balloonist, is probably the world’s top authority on this topic. He says free-flying high-altitude balloons are impractical because of high winds; release a balloon on the equator and in a few weeks it will end up at the North or South Pole. But solar-powered balloons packed with a fuel cell and an onboard motor can remain stationary for up to five years and are “perfectly feasible.” Lindstrand has been urging companies to more aggressively pursue balloon-based wireless networks but says that no one has stepped up yet. “Everybody keeps telling me, ‘Show me an airship and I will buy it.’ No one wants to risk the money. The key is to find somebody who is brave enough. I believe the stratospheric airship is a viable project, but it needs to be created by a small elite team with past airship experience and not by a conventional bloated aerospace contractor. If Google is really on the scene, they would be the perfect sponsor.”

 

SecAF: Sequestration hits AF readiness, modernization

Posted 5/24/2013

by Jim Garamone

American Forces Press Service

 

5/24/2013 – WASHINGTON (AFPS) — Sequestration has hit the Air Force particularly hard, impacting its force structure, readiness and modernization, senior Air Force leaders said here today.

Air Force Secretary Michael B. Donley and Gen. Mark A. Welsh III, the chief of staff, said Congress must provide a solid budget number so the Air Force can ground its planning in reality.

The Air Force understands it must do its part to work through the debt and deficit reduction problem, Welsh said.

“We just want to get to the bottom line or the new top-line budget … and get on with preparing our Air Force to remain the best in the world,” he said.

Sequestration has hit the Air Force hard and the effects are felt throughout the full range of accounts from force structure to readiness to modernization, Donley said during his last scheduled news conference as secretary.

On April 26, Donley announced plans to step down June 21 as the Air Force’s top civilian after serving as secretary for nearly five years.

“Twelve combat-coded squadrons have stopped flying, and important training has been canceled,” Donley said. “Weapon system sustainment reductions will delay maintenance, increase costs and create backlogs. The impending civilian furlough will hamper us further and will impact morale and reduce productivity across the Air Force.”

Even before sequestration there was a readiness crisis in the Air Force, the secretary said. “The readiness hole that we have been trying to dig out of just got deeper, and we are facing a readiness crisis from which it will take many months to recover,” he said.

And it is not just operations and readiness accounts that are at risk, said Donley, noting the Air Force needs modernization — in aircraft, missiles, and capabilities.

“As advanced technologies proliferate around the globe, these cutbacks in modernization would put at risk the Air Force capabilities this nation will need in the decades ahead,” Donley said. “Despite our near-term and long-term concerns, we are working to ensure that our most significant Air Force priorities remain on track, including the fifth-generation F-35 Joint Strike Fighter, the KC-46 tanker, and the long-range strike bomber.”

Aircraft must support the warfighters, but budget cuts mean that airmen cannot train for full spectrum operations, Welsh said.

“And our readiness continues to decline, even while calls for potential no-fly zone or air policing operations in response to Syrian violence are reaching a new crescendo,” he said.

“We’re still the best Air Force in the world,” Welsh said. “And our great airmen will rely on experience and their unmatched dedication to succeed in any operation that we’re asked to execute. But atrophied skills elevate risk, and stagnant proficiency will only grow over time if we can’t restore some sense of budget normalcy. And so that’s what we’re hoping for.”

 

 

Private Firm Sets Sights on First Moon Base

Discovery.com

May 24, 2013 11:52 AM ET // by Irene Klotz

 

NASA may not be going to the moon anytime soon, but private companies plan to do so, a study by space habitat developer Bigelow Aerospace shows.

The study, commissioned by NASA, is intended as a supplemental roadmap for the U.S. government as it charts human space initiatives beyond the International Space Station, a permanently staffed research complex that orbits about 250 miles above Earth.

“Instead of being the typical approach where we put together all the plans and we ask for participation, we wanted to look at it the other way and see what’s available,” NASA’s head of space operations, Bill Gerstenmaier, told reporters during a press conference on Thursday.

“This is a holistic kind of effort,” added Robert Bigelow, president and founder of the Nevada-based firm that bears his name. “It’s intended to encompass as much (information) as possible and it’s intended to evolve and to grow.”

The first part of the study surveyed about two dozen companies and research organizations about their ideas, plans, capabilities, schedules and costs for upcoming space initiatives. A draft report was submitted to NASA on Thursday — 40 days ahead of schedule — and has not yet been publicly released.

NASA intends to use the information to figure out where it can collaborate with private space initiatives and where it might, for example, entirely skip an expensive research and development program and just buy services or hardware commercially.

For example, after the International Space Station is removed from orbit, NASA could be a tenant aboard a Bigelow Aerospace-owned habitat for any microgravity research or technology development it wants to do.

“We think station can fly to 2028,” Gerstenmaier said.

After that, “we won’t be in the business of maintaining and operating a facility in low-Earth orbit. We believe that there will be a service available for us and the private sector,” he said.

NASA plans to follow the space station program with human missions to an asteroid by 2025 and to Mars about a decade later. The most recent version of its exploration blueprint calls for a robotic mission to retrieve a small asteroid and relocate it into orbit around the moon. Astronauts then would launch for a scientific sortie.

By that time, NASA may find it has neighbors on the lunar surface.

“The brass ring for us is having a lunar base,” Bigelow said. “That is a desire we’ve had for a long, long time.”

“I think that’s perfectly acceptable,” added Gerstenmaier. “NASA and the government focus on maybe deep space, we focus on asteroids. The private sector picks up the lunar activity and then we’ll combine and share with them to see what makes sense.”

“This gives us a chance to step back and do a bigger view of our planning and not doing it in our own little stovepipes. We’re actually reaching out and starting to look right at the beginning as we start to formulate our thinking,” he added.

NASA expects to release the first part of Bigelow’s study within a few weeks. The second section is expected to be finished this fall.

 

IEEE-USA Opposes H-1B Visa Increases, Agrees with Companies on Green Cards

http://www.todaysengineer.org/2013/May/immigration.asp

By Chris McManes

A leading expert on high-skill immigration and a Microsoft executive testified before the Senate Judiciary Committee on 22 April in favor of high-skill immigration reform legislation IEEE-USA supports.

Dr. Ron Hira, an associate professor of public policy at the Rochester (N.Y.) Institute of Technology, said provisions to raise the H-1B temporary visa cap in “The Border Security, Economic Opportunity, and Immigration Modernization Act” (S. 744) would hurt U.S. engineers and scientists, as well as students.

“Employers will continue to bring in cheaper foreign workers, with ordinary skills, to directly substitute for, rather than complement, workers already in America,” Hira said in his written testimony. “Under this bill the H-1B program would continue displacing American workers and deny them both current and future opportunities. It also discourages American students from pursuing these professions.

“These problems will expand since the bill proposes to roughly double the number of H-1Bs, which potentially could grow to a tripling.”

IEEE-USA President Marc Apter, in a 6 May letter to Judiciary Committee Chairman Sen. Patrick Leahy (D-Vt.) and Ranking Member Sen. Chuck Grassley (R-Iowa), agreed that the cap should not be increased.

“Recognizing the damage that outsourcing does to the U.S. economy, we do not see a justification for any H-1B increase,” Apter said. “An increase in H-1Bs beyond what is provided in S. 744 is unacceptable to us. We strongly support the worker protections that are in S. 744 and oppose any effort to weaken them. The fees imposed on the H-1B program are helpful, and we applaud their use to attract and retain Americans in STEM fields.

“But we cannot say that we support any H-1B increases.”

Brad Smith, general counsel and executive vice president for Microsoft Legal and Corporate Affairs, in his written testimony, also favors provisions in the bill that IEEE-USA supports: increasing employment-based green cards for high-skill employees, exempting dependents from the cap and removing per-country limits.

“Perhaps most importantly on the topic of green cards, the legislation before this Committee today recognizes the value to the U.S. economy of graduates from U.S. universities with an advanced STEM degree,” Smith said. “By exempting them from the overall green card quota, the bill provides a clear path to a green card for these highly sought-after individuals. Under current law, many face a wait of more than 10 years to obtain a green card, and they may decide that a career in the United States simply isn’t worth that kind of instability.

“As a result, we risk losing these experts to other countries, where they will compete against us. This bill goes a long way toward keeping their talents in the United States and helping to grow our economy.”

 

Putting America First

Sen. Lindsey Graham (R-S.C.), while questioning Smith, spoke of the legislation’s desire to help the American economy.

Graham: “So here’s the goal of the legislation – to incorporate those talented [foreign-born] people into the American business economy, is that correct?”

Smith: “Yes.”

Graham: “We don’t want to educate them in our finest universities [and] they go back to the country of origin and open up businesses against us. We would like for them to … use their talents as part of the American enterprise system, is that correct?”

Smith: “I wholeheartedly agree.”

Graham: “Well that’s what we’re trying to do, and do you believe this bill accomplishes that?”

Smith: “I am a strong supporter; we are a strong supporter of this piece of legislation.”

IEEE-USA has advocated for more employment-based green cards for engineers, technologists and scientists for years. The organization supports the legislation’s call for unlimited green cards for STEM Ph.D.’s and the requirement that employers applying for H-1B visas prove they have sought American workers.

“Replacing the H-1B model with a green-cards model for new hires in the technology sector is by far the best protection for America’s highly skilled workers, creating and keeping jobs in the United States,” Apter said.

 

H-1B Visas Promote Loss of American Jobs

Hira, a former IEEE-USA vice president, supports IEEE-USA’s contention that the H-1B program is harmful to U.S. and international employees. He cited a recent analysis of government data by Computerworld magazine, which concluded that “the major beneficiaries of the proposed increase in the [H-1B] cap would be pure offshore outsourcing firms.”

The Department of Labor Office of Foreign Labor Certification found that the top 10 users of H-1B visas in FY 2013 continued to be offshore outsourcing companies. These 10 organizations collectively had 112,739 positions – representing 73.4 percent more than the base annual H-1B cap of 65,000 – certified to be filled by an H-1B worker.

In addition, 64.1 percent of the 175,806 certified H-1B applications from just 1 October to 31 December 2012 went to these 10 companies. And they are just 10 of the many offshoring specialists. The Labor Department began applying these applications to the FY 2014 visa cap in early April.
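
For readers checking the figures, both percentages follow directly from the counts reported above (a quick back-of-the-envelope verification, not a separate Labor Department calculation):

112,739 ÷ 65,000 ≈ 1.734, i.e., about 73.4 percent more than the 65,000 base annual cap

112,739 ÷ 175,806 ≈ 0.641, i.e., the 64.1 percent share of all certified applications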

“While proponents of an H-1B visa increase are bemoaning the fact that the H-1B cap is already used up,” Apter said, “it was businesses that use the visas to take American jobs who used nearly two-thirds of them.”

“The majority of the H-1B program is now being used to hire cheap indentured servants,” Hira said. “The bulk of demand for H-1B visas is driven by the desire for lower cost workers, not by a race for specialized talent or a shortage of American talent.

“All of the top 10 H-1B employers used the program principally to outsource American jobs to overseas locations.”

Microsoft and other tech companies would like to see a huge increase in the H-1B visa cap, perhaps to as high as 325,000 a year.

Sen. Dick Durbin (D-Ill.), who has sponsored legislation with Sen. Grassley to curb abuses in the H-1B and L-1 visa programs, said to Smith, “… Americans would be shocked to know that the H-1B visas are not going to Microsoft; they’re going to these [outsourcing firms], who are finding workers – engineers – who will work at low wages in the United States for three years and pay a fee to … these companies.

“I think that is an abuse of what we’re trying to achieve here. Most people would think, well Microsoft needs these folks, and they’d be shocked to know that most of the H-1B visas are not going to companies like yours. They’re going to these outsourcing companies.

“I sat at the [negotiating] table and said I’m for increasing H-1Bs only if we offer the job to an American first at a reasonable wage so that they have a chance to fill that position. If they can’t, then we’ll bring in the [foreign] talent.”

 

What’s on Deck?

Before the bill comes to the full Senate, it will be marked up (amended) by the Senate Judiciary Committee. IEEE-USA would like to see the legislation maintain its original high-tech provisions.

IEEE-USA has no position on the bill’s non-high-tech sections.

An archived Webcast of the hearing, as well as all of the witnesses’ written testimony, is available at http://www.judiciary.senate.gov/hearings

 

What They Told Us: Reviewing Last Week’s Key Polls

Rasmussen Reports

Saturday, May 25, 2013

Call it the law of unintended consequences.

Favorables for the Tea Party have jumped since news broke that the Internal Revenue Service was targeting the grassroots movement and other conservative groups. Opinions of the IRS have gone down.

Only 20% of voters believe the IRS’ explanation that low-level employees at its Cincinnati office made the decision to target the conservative groups. Sixty-five percent (65%) think the orders came from Washington, including 39% who believe someone at the White House made the call.

Many expect more bad news to come. Sixty percent (60%) think it’s likely that other government agencies also targeted conservative groups. Perhaps most stunning is the fact that even 37% of Democrats think it’s likely other agencies may have been used to target conservatives.

While the president’s Job Approval rating has held fairly steady despite two weeks of controversies, Scott Rasmussen’s weekly newspaper column shows how “the political ground is shifting under the president’s feet.” He adds, “If you dig just a bit beneath the surface, it becomes clear that the controversies dogging the White House have had an impact.”

Just over half of voters consider all three controversies surrounding the White House to be scandals. Forty-one percent (41%) think the IRS story will still be around a year from now. Slightly more (43%) think there’s a good chance the Obama administration’s handling of the death of the U.S. ambassador in Benghazi, Libya will still be making headlines, too.

Despite increased media attention, voter concern about the events in Benghazi remains little changed over the past three weeks. Eighty-one percent (81%) still think it’s important to find out what happened. Thirty-two percent (32%) rate the administration’s explanation as good or excellent. Forty-five percent (45%) now view that explanation as poor.

The U.S. Justice Department’s secret seizure of telephone records from the Associated Press has been drawing a lot of news attention, although voters have mixed feelings about the story. But 52% think the media reacts more quickly to things that affect news organizations, reporters and their friends.

That could spell more bad news for the White House following the disclosure that the Justice Department also secretly obtained the telephone records of a Fox News reporter and his parents. “While the public is not up in arms over this issue, journalists are,” Scott Rasmussen noted in a recent column. “So there is likely to be more aggressive reporting on some of the other challenges facing the White House.” 

With growing questions about Benghazi, the IRS and the Justice Department, Democrats’ eight-point lead over Republicans in voter trust in the area of government ethics and corruption has disappeared. Now the president’s party trails the GOP by two. The newest finding is the highest level of confidence in Republicans and the lowest level for Democrats since October.

Republicans also have edged ahead of Democrats on the Generic Congressional Ballot for only the second week since Election Day.

Yet for the president and his party, the elephant lurking in the room continues to be the national health care law as it nears full implementation next year. Just over half of voters still view the health care law unfavorably, and most remain adamant that consumers should have choices when it comes to how much health care coverage they want to pay for.

Voters continue to believe raising taxes and increasing government spending are bad for the economy, but 56% expect spending to go up under Obama. Forty-seven percent (47%) think their taxes will go up, too.

Very few Americans think the federal government gives too much financial help to victims of disasters like this week’s Oklahoma tornado, but they’re evenly divided when asked if the government should make cuts elsewhere in the federal budget to offset this aid.

Consumer confidence remains a rare bit of good news for the president: It’s still tracking just below the highest levels in recent years.

Working Americans are pretty satisfied with their jobs, too, from what we can tell. It’s true that 33% work more than 40 hours a week, and only 48% are now happy with the hours they work.

But 64% are generally pleased to tell people where they work.

Sixty-nine percent (69%) of Working Adults give their boss or supervisor good or excellent marks. Eighty-seven percent (87%) rate their relationships with coworkers just as positively.

Fifty-eight percent (58%) would continue to work even if they were left a small fortune that let them make the same amount of money without working. Ninety-one percent (91%) say that, if given the choice, they would stay with a job they love over another job they’d hate that pays 10% more money.

In other surveys last week:

— Thirty percent (30%) of likely voters think the United States is headed in the right direction, virtually unchanged from a year ago at this time.

— Forty-eight percent (48%) of Americans are confident in the stability of the U.S. banking industry. By comparison, in July 2008, shortly before the Wall Street meltdown, 68% were confident in U.S. banks.

— The Federal Reserve Board continues to work hard at keeping the U.S. inflation rate down and interest rates under control, but most Americans expect their grocery prices to keep going up.  Forty-four percent (44%) believe interest rates will be higher a year from today.

— Both current Democratic frontrunners, Christine Quinn and Anthony Weiner, outpace Republican favorite Joseph J. Lhota in Rasmussen Reports’ first look at New York City’s 2013 mayoral race.

— Among registered New York City Democrats, Quinn leads Weiner 24% to 18%. Democrats pick their mayoral nominee in a September 10 primary.
