
Pentagon Seeks Sensors That Last for Years

April 13, 2015 By Patrick Tucker

Reducing battery drain by a factor of 1,000 will enable true “persistent surveillance.”



For decades, the military has been looking for ways to collect data and information in places that it can’t put people. One method for doing that is cheap sensors that can be left on the battlefield for extended periods of time, long after soldiers have returned home. Case in point: the 1,500 unattended ground sensors that the Pentagon has requested for the hills and battlefields of Afghanistan.

Getting the most out of a network of sensors means extending battery life as long as possible. That, in turn, means redesigning both the way sensors use power and the way they do their work. Smarter sensors, ones that actually know what to pay attention to instead of just sitting and collecting data, could allow for what the military calls “persistent surveillance” on the order of years.

The Defense Advanced Research Projects Agency, or DARPA, today announced a program called Near Zero Power RF and Sensor Operations, or N-ZERO, which will build a network of sensors that can determine what to pay attention to and when. They will have an innate ability to detect specific frequency signatures “such as the presence of a particular vehicle type or radio communications protocol,” according to DARPA.

“Our goal is to use the right signal itself to wake up the sensor, which would improve sensors’ effectiveness and warfighters’ situational awareness by drastically reducing false alarms,” program manager Troy Olsson said in a press release. N-ZERO will build sensors that draw just 10 nanowatts of power during the device’s “asleep yet aware” phase, as much power as a typical watch battery in storage and about 1,000 times less power than is consumed by typical sensors.
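For a sense of scale, a back-of-envelope calculation shows what a 1,000-fold reduction in standby draw buys. The coin-cell capacity and the 10-microwatt “typical” draw below are illustrative assumptions, not figures from DARPA:

```python
# Back-of-envelope battery-life comparison. The coin-cell capacity and
# the "typical" 10 uW standby draw are illustrative assumptions.
capacity_j = 0.225 * 3.0 * 3600   # ~225 mAh at 3 V, converted to joules

def lifetime_years(draw_watts):
    """Ideal lifetime at a constant draw, ignoring self-discharge."""
    return capacity_j / draw_watts / (3600 * 24 * 365)

typical_uw = 10e-6   # assumed conventional sensor standby draw
sleep_nw = 10e-9     # N-ZERO "asleep yet aware" target

print(f"at 10 uW: {lifetime_years(typical_uw):.1f} years")
print(f"at 10 nW: {lifetime_years(sleep_nw):,.0f} years")
```

In practice, self-discharge caps real lifetimes at the battery’s shelf life, which is the point of the comparison: at 10 nanowatts the battery chemistry, not the electronics, becomes the limiting factor.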

It’s a technology that will play a role in some current and future DARPA programs, such as the Upward Falling Payload program, which seeks to put a network of pods on the ocean floor, sleeping quietly until they sense a particular threat and release their payloads.

Undersea pods that lie in wait and spring into action only when they detect, say, a submarine would cut down on the need for other submarines to collect data on undersea threats. Land-based sensors that can detect military vehicles moving across a border in places like Iraq (or elsewhere) could be connected to armed drones, themselves stationed to lie in wait until needed, potentially cutting down on the need for drone orbits (patrols) or satellite surveillance.


Persistent ground sensing could radically reduce the costs of gathering intelligence. Sensors that last for years on end would enable far greater capability for the Internet of Things. It could also turn the entire world into a vast, surveillable frontier.


Kendall: DoD Tech Needs Private Sector

By Andrew Clevenger 5:33 p.m. EDT April 13, 2015


WASHINGTON — The US Department of Defense needs to do a better job of engaging Silicon Valley in its acquisition process if the US is to maintain its military technology superiority, Pentagon acquisition head Frank Kendall said Monday.

Speaking at an event at The Brookings Institution, Kendall touted the potential of the Pentagon’s new Better Buying Power 3.0 program, which officials unveiled last week.

“3.0 is a focus on a culture of technical excellence,” he said, adding that the reform effort would not radically reshape the acquisition process. “This is more about continuity than it is about change. The idea here is a shift in emphasis, not a fundamental break with what we’ve done in the past.”

Technology poses a particular challenge, because the DoD’s structure does not make it attractive to many American technology companies, he said. For Silicon Valley giants with global reach, working with the DoD may not be worth the trouble.

“We don’t have enough business to make them excited,” Kendall said. Smaller start-ups, where lots of innovation happens, are often put off by the red tape that comes with working with the Pentagon, he said.

“At the end of the day, we have to provide them with an incentive to do business with us,” he said.

Kendall pointed to the F-35 joint strike fighter as an example of where the DoD’s slow-moving development process was outstripped by progress in the private sector. Three times during the fighter’s decades-long development, the project needed technology refreshes to keep it up to date with commercial developments.

Bill Lynn, the CEO of Finmeccanica North America and DRS Technologies, said that in the first half of the post-Cold War period, the DoD was a “net exporter” to the private sector of technologies it had developed in-house, like GPS and early Internet breakthroughs. In recent years, the Pentagon has become a net importer of technology that it makes operational for defense applications, with the private sector taking the lead in 3-D printing, nanotechnology and IT, the former deputy secretary of defense said.

“We’re just not agile enough, we’re not fast enough, we’re not modular enough” when it comes to military information technology, Lynn said.

Jason Tama, a US Coast Guard commander who is also a Brookings fellow, said part of the issue is the cultural divide between Silicon Valley and the Pentagon. In the private sector, entrepreneurs foster “a culture of trying lots of things and failing fast and failing often,” he said, which allows them to quickly get past ideas that don’t work to ones that do. Too often on large military contracts, developers invest up front in meeting the program’s requirements and then simply press forward, with little room to fail fast and iterate.

There are 150,000 people working in defense acquisition, 90 percent of whom are civilians. Roughly three-fifths of the DoD’s acquisition employees are between the ages of 40 and 60, putting its workforce out of step with the private sector’s workforce, Tama said.


“If you want top talent on a technical level, you’re not going to build that on the inside of the US government,” he said.


How the NSA Is Using the Cloud To Thwart the Next Snowden

April 13, 2015 By Frank Konkel Nextgov


In a post-Snowden world, is it really a good idea to have analysts swimming around in one vast ocean of NSA secrets and data?

Almost two years ago, the National Security Agency forever lost its “No Such Agency” nickname at the hands of one of its contractors—a once-trusted insider by the name of Edward Snowden.

Snowden’s stream of leaked NSA secrets about classified surveillance programs shined the public spotlight on the clandestine government organization. Though the stream has now dissipated to a trickle, the impact on the intelligence community continues.

To privacy activists, Snowden’s leaks were a godsend. They forced a national discussion on government surveillance and even coaxed the likes of Director of National Intelligence James Clapper into admitting that the intelligence community needs to be more transparent.

Yet, the leaks have “had a material impact” on NSA’s ability to generate intelligence around the world, NSA Director Michael Rogers said back in February.

Within NSA’s Fort Meade, Maryland, headquarters, no one wants to face another Snowden. With NSA’s widespread adoption of cloud computing, the spy agency may not have to.


Could the Cloud Have Stopped Snowden?

NSA bet big on cloud computing as the solution to its data problem several years ago.

Following expanded legal authorities enacted after the Sept. 11, 2001, terrorist attacks, NSA and the other 16 agencies within the intelligence community began to collect a gargantuan amount of intelligence data: Internet traffic and emails that traverse fiber optic cables; telephone call metadata; and satellite reconnaissance.

Much of that intelligence piled up in scattered repositories, each of which had to keep adding servers to keep pace with demand.

NSA’s GovCloud—open-source software stacked on commodity hardware—creates a scalable environment for all NSA data. Soon, most everything NSA collects will end up in this ocean of information.

At first blush, that approach seems counterintuitive. In a post-Snowden world, is it really a good idea to put everything in one place—to have analysts swimming around in an ocean of NSA secrets and data?

It is, if the ocean itself controls which information analysts in the NSA GovCloud can access. That, in essence, is how NSA handles security in its cloud.

NSA built the architecture of its cloud environment from scratch, allowing security to be baked in and automated rather than bolted on and carried out by manual processes. In preparation for the agency’s cloud transition, every piece of data ingested by NSA systems over the last two years has been meta-tagged with bits of information, including where it came from and who is authorized to see it.

As NSA further transitions to using cloud, analysts will make better use of their time, making queries against one database instead of repeated ones against dozens of relational databases.

Data in the GovCloud doesn’t show up to analysts if they aren’t authorized, trained or cleared to see it, according to NSA Chief Information Officer Lonny Anderson.

“While putting data to the cloud environment potentially gives insiders the opportunity to steal more, by focusing on securing data down at cell level and tagging all the data and the individual, we can actually see what data an individual accesses, what they do with it, and we can see that in real time,” Anderson told Nextgov. “So we think this actually dramatically enhances our capability.”

NSA cloud strategist Dave Hurry further clarified NSA’s approach to securing data within GovCloud.

“We don’t let people just see everything; they’re only seeing the data they are authorized to see,” Hurry told Nextgov.
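The cell-level tagging Anderson and Hurry describe can be sketched roughly as follows. This is a toy model under assumed names and fields, not NSA’s actual design:

```python
# Toy sketch of cell-level, tag-based access control with automatic
# auditing. All identifiers and fields here are illustrative assumptions.
from dataclasses import dataclass

@dataclass(frozen=True)
class Record:
    payload: str              # the data itself
    source: str               # meta-tag: where the data came from
    required: frozenset       # meta-tag: authorities needed to view it

audit_log = []                # every access attempt is recorded, in real time

def query(analyst, auths, records):
    """Return only records the analyst is authorized to see; log everything."""
    visible = []
    for rec in records:
        allowed = rec.required <= auths   # subset check against the data's tags
        audit_log.append((analyst, rec.source, "viewed" if allowed else "denied"))
        if allowed:
            visible.append(rec)
    return visible

records = [
    Record("intercept A", "sigint-feed-1", frozenset({"SI"})),
    Record("intercept B", "sigint-feed-2", frozenset({"SI", "TK"})),
]
hits = query("alice", frozenset({"SI"}), records)
# alice sees only "intercept A"; the denied attempt is still logged
```

Because every attempt is recorded at query time rather than in per-repository log files, spotting a user who “exceeds limits of authority” becomes a matter of watching the audit stream instead of manually reviewing logs.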

What about adventurous, negligent or potentially nefarious insiders? How exactly Snowden ferreted out NSA’s secrets for months across numerous databases and evaded detection remains uncertain. But what is clear is that his actions should have thrown up some Utah Data Center-sized red flags. They didn’t.

GovCloud’s other baked-in security features are likely to deter all but the boldest of would-be rogue insiders. In the past, Anderson said, disparate data repositories contained log files to track user behavior, but those logs “had to be manually reviewed.”

That’s not a good recipe to catch malicious behavior. GovCloud automates those monitoring processes and flags network security personnel, Anderson said, when a user attempts to “exceed limits of authority.”

In other words, if NSA had had this cloud-based system in place two years ago, Snowden wouldn’t have made off with what NSA Deputy Director Richard Ledgett, in a 2013 interview, called the agency’s “keys to the kingdom.” According to NSA officials, if GovCloud works as they believe it will, Snowden might never have left Hawaii, where he lived and worked, without his actions raising alarm bells.

“The [GovCloud] system could prevent it,” Anderson said. “But what it would have immediately done is highlighted and told our network security heads that someone is pulling a lot of data.”

That information would “allow us to visit the individual,” or “we could shut it down at the point we saw it,” Anderson said.

“It would have prevented what Mr. Snowden did,” Anderson added.


More Than Just Security

Beyond simply Snowden-proofing its data, GovCloud offers other features that make it attractive to analysts and to the top agency officials charged with protecting national security interests.

GovCloud’s architecture has a “fact-of” function that alerts analysts that additional data on a query may be available but inaccessible based on the analyst’s access controls.


“That’s what will tell you there is other data available; you just aren’t authorized to see it,” Anderson said. “And if you need to see it, here is who to contact.”
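A hedged sketch of what such a “fact-of” response might look like (the function and message text are assumptions, not NSA’s actual interface): restricted matches are counted and acknowledged, but never shown.

```python
# Illustrative "fact-of" query: matches the analyst cannot see are
# counted and surfaced as a notice, without exposing their content.
def fact_of_query(auths, records):
    visible, hidden = [], 0
    for required, payload in records:    # (required authorities, data)
        if required <= auths:
            visible.append(payload)
        else:
            hidden += 1                  # acknowledge existence only
    notice = None
    if hidden:
        notice = f"{hidden} additional result(s) exist; contact the data owner for access"
    return visible, notice

rows = [
    (frozenset({"SI"}), "report 1"),
    (frozenset({"SI", "HCS"}), "report 2"),
]
visible, notice = fact_of_query(frozenset({"SI"}), rows)
```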


NSA’s cloud migration will also significantly beef up the agency’s ability to comply with a plethora of legal rules, mandates and executive orders. Just as security is automated in NSA’s cloud, so too are compliance measures such as data preservation orders and data retention rules.

“We think from a compliance standpoint, moving from a whole mess of stovepipes into a central cloud that has a lot more functionality gives us more capability,” said Tom Ardisana, technology directorate compliance officer at NSA.

Old repositories operating on legacy architecture predate many more recent laws and policy changes. The USA Patriot Act of 2001, for example, was signed into law after some of NSA’s legacy repositories were built, so the agency’s only option for adhering to evolving policies was to “bolt on compliance,” Anderson said.

“Whenever you bolt on compliance to address a particular issue, there is always a second- and third-order effect for doing that,” Anderson said. “It’s an extremely manual process. There is risk built in all over that we try to address. The cloud architecture allows us to build those issues in right from the start and in automated fashion address them.”


NSA’s centralized cloud will also alleviate uncertainty regularly faced by analysts when they query databases.

After running a query, analysts sometimes wonder whether they are actually authorized to view certain data. That kind of doubt influences what and how analysts generate intelligence reports. With the GovCloud, on the other hand, analysts will have near certainty that they’re only seeing information they are supposed to see.


Moving More than Data to the Cloud

NSA has been slowly migrating users to its new cloud architecture to ease the transition, but the pace has begun to pick up. Three weeks ago, Anderson said, NSA transitioned users off three of the biggest legacy repositories into its cloud environment. Those users include NSA, Defense Department and other intelligence community personnel.

“It’s a huge step forward,” Anderson said.

It’s important to note it’s not just data moving to the cloud.

A big part of the process is also transitioning NSA’s applications and tools to the new environment. Even the way analysts interact with data is changing. The agency, Hurry said, has gone to great lengths to ensure a “minimum viable functionality” for its cloud architecture, orchestrating trial periods in which analysts conduct all their work in the cloud.

“Advantages we’re seeing include a faster time to market and improved analysis and analytics,” Hurry said.

The move has not come without obstacles. The cloud organizes data differently than old repositories, and some analyst methods do not translate to NSA’s cloud model. However, the agency is training analysts on new methodologies.

“One of the challenges we have had to work through is mixing new features and making sure there is no break in service for existing features analysts depend on,” Hurry said.

Closing down repositories filled with untold racks of servers the way NSA did three weeks ago will also save the agency money in operations and maintenance. Some of those systems are decades old. Servers housed in closed repositories will be destroyed and their data deleted, Anderson said.

In the coming years, closed repositories will come to signal the success of NSA’s bet on cloud computing. Will it prevent the next Snowden-style breach? NSA officials are counting on it, but they’re counting on the cloud for a lot more than that.


A Global Consensus on Cyber Security Is Gaining Momentum

April 13, 2015 By Camino Kavanagh Council on Foreign Relations


Cybersecurity developments grab headlines. Everyone wants to know the tales of treachery and intrigue, who hacked whom, and what was stolen or broken. Interest wanes, however, when the conversation switches to the drudgery of what is to be done, especially capacity building, which generally involves transferring knowledge and good practices to countries in the developing world so that they can improve their cybersecurity and participate on a more equitable basis in the digital economy. While it may be tedious work, it is critically important because the next billion Internet users will be from the developing world.

The topic of capacity building has become so important that the rather awkwardly named Group of Governmental Experts (GGE) on Developments in the Field of Information and Communications Technology in the Context of International Security dedicated an entire section to the issue in its 2013 report. While for some the section might have been a mere “throw-away” to reach consensus on other parts of the report, it identifies what state and non-state actors alike can do to overcome some of the issues undermining trust and driving insecurities with regard to cyberspace.

The 2013 GGE Report, combined with a range of capacity building programming by individual states (e.g. Australia, the United States, the United Kingdom, Russia, Nigeria, Korea, Kenya) and multilateral institutions (e.g. the EU, the UN, the Organization of American States), is driving deeper engagement with, and oversight by, civil society, academia, think tanks and the private sector. Individually, these actors have niche expertise. Collectively, they have the ability to build the needed capacity in areas ranging from international security and diplomacy to basic IT hygiene and incident response.

While the number of actors currently engaged in cybersecurity capacity building might appear confusing and chaotic, confusion and chaos are not necessarily negative outcomes, especially at this early stage of our understanding of cyberspace and cybersecurity. The chaos and potential duplication represent the presence of an important and vigorous debate, which is producing a broad marketplace of ideas. Hence, suggestions and initiatives emerging from this wild and fertile terrain should not be disregarded.

At the same time, it is increasingly clear that we need to start separating the wheat from the chaff and develop some form of coordination and coherence of effort. Some of these efforts clearly push certain interests and values, but just as many do not. In the end, the recipients of capacity building efforts will weigh the pros and cons of the support provided.


One particular initiative is worth watching.

Coinciding with this week’s GGE meeting, the Dutch government is hosting the Global Conference on Cyberspace, an annual meeting launched by the U.K. government in 2011. The Dutch will use the opportunity to launch a Global Forum on Cyber Expertise for capacity building in the fields of cybersecurity, cybercrime, data regulation and e-development. There is also a historical echo: some 185 years ago, The Hague commissioned its first high-speed communication network, a version of the shutter telegraph connecting the country’s capital to Breda and then to the navy base at Vlissingen, as the Belgian War of Independence broke out.

The general idea behind the initiative is that states with limited capacity (policy or technical) will be able to draw from a pool of expertise (public and private) in other more technologically-sophisticated countries to improve their ability to address a cyber-related challenge.


As with many capacity building initiatives, the devil will be in the details. For example:

How will the Global Forum be sustained in the long term?

How will the Global Forum capture efforts that often fall below the radar yet are important to some actors and less so for others, such as regional initiatives?

Should the funding for these activities be channeled through traditional official development assistance, underpinned by existing principles and integrated into national development plans?

Will it include a focus on monitoring and assessing the effects of existing capacity building activities, an issue that receives scant attention?

How will it ensure capacity building efforts are not implemented in a governance vacuum?


Answering these questions and tailoring capacity building efforts accordingly will determine a country’s ability to thrive in the digital age. This is where the drudge work comes in, and again, much of it has to do with good governance.

Stealing from Dutch Foreign Minister Albert Koenders’ words in a recent address, it is perhaps time to delve into the annals of history and dust off the writings of Machiavelli. Not, as highlighted by Koenders, the Machiavelli who said “the end justifies the means,” but rather the one who philosophized about the meaning of good governance:

Good governance as it has been taught for centuries, but which now has to be applied in an age of globalization, revolutionary technology and, fortunately, more vocal people. Good governance requires constant renewal of the social and political contract, combined with reform.

Unless we ensure that efforts to build capacity in this area go hand-in-hand with that “constant renewal of the social and political contract,” we may very well end up hoisted by our own political and social petard.

As the diplomats on the GGE draft their next report, they should pay even closer attention to cybersecurity capacity building. They should signal that capacity building, dull as it may appear, is not an end in itself. Rather, it is an important part of the broader processes of political and social change going on around us.


Congress wants the Pentagon to develop a long-range, deadly superdrone

By Christian Davenport April 10 


The revolution in unmanned aerial flight has advanced quickly and in dramatic leaps, perhaps none more momentous than the historic landing of an autonomous Navy drone on an aircraft carrier.

That happened for the first time two years ago, when Northrop Grumman’s X-47B, which looks more like a UFO than a military aircraft, took off and landed on the USS George H.W. Bush.

It made history again last year, when it flew alongside F/A-18 fighter jets, the first time a drone flew in concert with piloted planes off a carrier. And later this month, the X-47B is expected to attempt another aerial triumph: midair refueling from a tanker plane.

But despite those breakthroughs, some powerful members of Congress and leading military think tanks say the Pentagon is being too cautious in its development of a technology that they think could push the boundaries of unmanned flight—and the future of warfare.


In what has become a made-for-Washington drama, a group of Congress’ most influential members is pushing the Pentagon to develop what to some sounds like sci-fi fantasy: drones that could not just take off from carriers, but fly for days at a time, covering hundreds, if not thousands, of miles, and perhaps most importantly, haul a hefty arsenal of bombs deep into enemy territory.

That might sound like a tall, and expensive, order, especially at a time when congressionally mandated budget caps are forcing the Pentagon to balance wish lists against necessities.

But proponents of a more ambitious approach, including Sen. John McCain (R-Ariz.), fear that the Navy’s plan to use the drones primarily as eyes in the sky, flying around the carriers to detect incoming threats, is misguided. Potential adversaries have developed advanced anti-ship missiles, pushing aircraft carriers well off shore, beyond the striking distance of piloted fighter jets such as the F/A-18 and the F-35 Joint Strike Fighter.

To counter that, they say, the carriers need stealthy unmanned aircraft that can fly long distances in a combat environment. Without that ability, the U.S.’s long-held air dominance is lost, and its fleet of aircraft carriers, long a symbol of the U.S.’s ability to project power from wherever, whenever, is at risk of obsolescence.


“If you take the most elementary missiles the Chinese have developed they have already pushed our carriers to a standoff of about 1,000 miles,” said Rep. J. Randy Forbes (R-Va.). “If our carriers get set back that far the F/A-18 and the F-35 are not going to get us where we need to go.”




In a letter last month, McCain, chair of the Senate Armed Services Committee, urged Defense Secretary Ashton B. Carter “to ensure that the Navy’s first unmanned combat aircraft is capable of both providing persistent ISR [intelligence, surveillance and reconnaissance] and conducting strike missions from the carrier at standoff distances in contested environments.”


But defense consultant Loren Thompson said that if the program becomes too expensive, the Navy might decide it can’t afford to pursue it.

“If you ask for too much, it either won’t work or it will be the size of a B-52 Bomber,” he said. “Every time Congress tries to add another requirement to this program, it undercuts support in the Navy for what was supposed to be a relatively modest effort.”

He also said that the Pentagon has plenty of aircraft that can attack from long distances, and noted that it is close to awarding a contract for what’s known as the Long Range Strike Bomber, which would have nuclear capabilities.

Last year, Congress held up funding for the X-47B program until the Pentagon figured out exactly what it wants. The Defense Department is now conducting a review to determine whether the aircraft should be able to strike at long distances or serve primarily as a surveillance platform.

Bryan Clark, a senior fellow at the Center for Strategic and Budgetary Assessments, predicted that the review would find that there are “a lot of other aircraft that can do surveillance around the carrier.”

While creating a drone that can fly long distances, while also carrying a heavy weapons payload, sounds like a massively complicated endeavor, he said that the X-47B “has been so successful it seems like building an aircraft like that is not as risky a proposition as we had thought.”

Forbes said he was glad the Pentagon was taking a deliberate approach. The decision it makes “will lock us in to the direction we will go for the next 20, 30 years,” he said. “That’s why it’s so important … we make the right decision now.”


America’s Military Is Dangerously Obsolete

“The Pentagon’s unchanging force structure is simultaneously wasteful and miserly.”

Robert Haddick

April 10, 2015

In a speech on February 18 to the Chicago Council on Global Affairs, likely presidential candidate Jeb Bush declared “the next president should launch a strategic review of our military strategy so that we meet 21st century needs with a 21st century defense capability. Then we should fund that strategy.”

We should expect virtually all of the forthcoming U.S. presidential candidates to make a similar declaration. Come January 2017 there will almost certainly be a new leadership team at the Pentagon that will want to implement its views on defense strategy.

A new start after the election will also raise hopes for a resolution to Washington’s budget war, which Bush also called for in his speech.

Bush’s call for “a 21st century defense capability” implies he doesn’t believe the United States currently has such a force. Supporting Bush’s assertion is a force structure that, in spite of dramatic changes in technology and the geostrategic situation, has been largely frozen for nearly two decades.

The 2014 Quadrennial Defense Review lists the “objective force,” the planned combat force for 2019. It is remarkable how similar this force is to the force structure from 2001 (found in the 2001 QDR). Active duty ground combat forces are virtually identical: ten U.S. Army combat divisions, plus three Marine Corps divisions and aircraft wings, in both 2001 and 2019. There are also few changes for the Navy over this long interval: one fewer aircraft carrier in 2019; 51 attack submarines in 2019, down from 56 in 2001; and 92 cruisers and destroyers, an increase from 81 in 2001. The most notable change for the Navy is the steep reduction in amphibious ships, from 41 in 2001 to 31 in 2019. Active duty Air Force fighter squadrons were also cut, from 46 in 2001 to 26 in 2019, while combat-coded bombers were trimmed from 112 to 96.

During this 18-year span the Pentagon will increase missile defense and special operations capacity and stand up a significant number of cyber warfare units. These are important adaptations but constitute only a small fraction of the Pentagon’s total budget. Beyond these adjustments, and the notable drawdowns in the Air Force’s tactical fighter squadrons and the Navy’s amphibious shipping for the Marine Corps, it is remarkable how little else will change between the 2019 force and the post-Cold War, pre-9/11 force of eighteen years earlier.

One could scan the history of the U.S. military over the past century and struggle to find other cases of virtually matching force structures over two-decade spans. And for good reasons. New military technologies, along with rising and vanquished challengers, alter the strategic landscape, creating the need for innovation, adaptation, and a reallocation of defense resources.

The span between 2001 and 2019 will not lack for profound strategic changes. The most important challenge to the design of U.S. military forces and their long-standing operational concepts is the rapid proliferation of long-range precision anti-ship and land-attack missiles, along with the sensors and command networks directing them.

This development, most notably exploited by China but also by other potential adversaries, is undermining the central assumptions behind the fundamental design of U.S. expeditionary forces and their plans for employment. For decades, U.S. policymakers and commanders have assumed that they would have access to bases on the periphery of Eurasia and that those bases and naval task forces would be largely free to project power at will. The curtain is falling on that era, a watershed development that has yet to register with the Pentagon’s force structure or plans.

The period has also witnessed the rapid rise of well-armed and organized non-state actors, both those with political objectives (such as al Qaeda and ISIS) and strictly commercial yet violent transnational criminal organizations whose capacity for subversion can in many ways be even more threatening to state authority. We have learned from recent experience that U.S. conventional military forces are often poorly suited, and difficult to sustain, in matchups against these adversaries.

Russia’s employment of hybrid warfare in Eastern Europe is similarly flummoxing policymakers and military planners, another example of a smart adversary exploiting the design problems and operational limitations of legacy conventional forces and concepts.

It is increasingly difficult to argue that the unchanging U.S. force structure over this two-decade period of rapid change is a smart “all-weather” portfolio of capabilities, well-suited for the challenges of both 2001 and 2019. Instead, adversary adaptations have undercut the value of significant portions of the U.S. force structure.

The unchanging force structure has resulted in expensive overstocking in many assets that potential adversaries, using tools such as irregular warfare or precision missiles, will be able to counter, reducing or negating their battlefield utility and deterrence value. Meanwhile, the quantities of other force structure elements, such as submarines, long-range airpower, special operations forces, and Marine Corps crisis response units, will fall well short of the demands made by regional commanders, especially if multiple contingencies break out. The Pentagon’s unchanging force structure is simultaneously wasteful and miserly, undercutting deterrence and increasing strategic risk.


The Office of the Secretary of Defense and the Joint Staff at the Pentagon have elaborate force structure planning processes. But in spite of this army of smart and experienced strategists, numerous institutional barriers to reform have frozen resource adaptation.

Some defense analysts have hoped that the Pentagon’s budget crunch would force creativity and innovation, as it did during the two decades before World War II. That austere period saw the U.S. Army divert substantial resources from its traditional ground troops to airpower and especially strategic bombers, then a controversial concept. The Navy took funds that could have gone to the battleship “gun club” and spent them instead on aircraft carriers and aviation, submarines, and Marine Corps amphibious experimentation. When the U.S. entered World War II, it had the doctrine, training, and operational concepts in place to dominate each of these newly emerging fields.

The lack of a vigorous adaptive response today was enough to cause former Defense Secretary Chuck Hagel and his deputy Robert Work to launch the Defense Innovation Initiative, which aims to energize the Pentagon’s adaptive technology, wargaming, and personnel policies.

Still, we have yet to see the major adaptive reforms and resource reallocations that were more common in the past. Breaking through institutional barriers to reform will require the next president to personally attend to the problem and to work with Congressional leaders and senior Pentagon officials to achieve reform. The president will also need to find military leaders who are committed to providing the most well-adapted forces to joint force commanders, regardless of which service, platform, or concept supplies the forces and needed capabilities. Effective adaptation will require a rigorous defense strategy process that questions previously unexamined assumptions and insists on operational concepts suited for the future, not the past. Only then will policymakers truly understand the 21st century defense capability they need to acquire. The Pentagon has adapted before and can do so again.

Robert Haddick is a former U.S. Marine Corps officer and an independent contractor at U.S. Special Operations Command. He writes here in a personal capacity. He wrote “Fire on the Water: China, America, and the Future of the Pacific,” published by Naval Institute Press.

Image: U.S. Air Force Flickr. 



The consortium led by ONERA has been selected by the ANR for the fight against drones

by Press • 15 April 2015


The project known as ANGELAS, which stands for ANalyse Globale et Evaluation des technologies et méthodes pour la Lutte Anti-uaS (French for Global Analysis and Evaluation of Technologies and Methods for Combatting UAS*), coordinated by ONERA with six industrial and academic partners, has been selected by the French National Research Agency (Agence Nationale de la Recherche, ANR) for the programme “Protection of Sensitive Areas from Aerial Drones”.

This experimental development project, which will last 18 months, falls within the framework of civilian applications for countering non-cooperative UAVs. It consists of assessing the performance of various sensor technologies (radar, optronics, acoustics, etc.), maturing some of the more recent technologies, and combining these detection, identification and neutralisation systems to respond to specific scenarios: overflight of sensitive facilities under all conditions, overflight in urban settings, or surveillance of events (gatherings, sports events, VIP visits, etc.).


The ANGELAS project faces many major challenges:

– Achieving greater expertise in handling the electromagnetic, optronic and acoustic signatures of drones;

– Improving these technologies, by adapting existing systems to the constraints of this new type of threat or by proposing promising new ones in optronics (2D/3D laser imaging, etc.), electromagnetism (passive and active radar) and acoustics;

– Studying how the various systems and their associated processing complement one another, in order to propose a relevant operational solution;

– Enhancing the range of avoidance and neutralisation solutions (jamming, tracking, etc.);

– Testing the research against on-site experiments throughout the project.


The ANGELAS project is supported by a multidisciplinary public-private team, balanced between industrial partners (SMEs and large groups) and laboratories:

ONERA is coordinating the project and is contributing its expertise in “drone systems,” which includes sensors (optronic, electromagnetic and acoustic), detection and tracking approaches, classification and identification, and multi-sensor data fusion. It is also making its test site available so the drones can be operated and evaluated in a controlled environment.
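Multi-sensor data fusion of the kind mentioned above can be illustrated with a minimal sketch. This is not the project’s actual method, and the sensor confidences below are hypothetical; it simply shows the naive case where each sensor independently reports a detection probability and the fused result is the chance that at least one sensor saw the drone:

```python
def fuse_detections(sensor_probs):
    """Naive independent-sensor fusion: probability that at least one
    sensor's report is a true detection, assuming the sensors err
    independently of one another."""
    miss_all = 1.0
    for p in sensor_probs:
        miss_all *= (1.0 - p)  # probability that every sensor misses
    return 1.0 - miss_all

# Hypothetical per-sensor confidences for radar, optronics, acoustics:
fused = fuse_detections([0.6, 0.5, 0.3])
print(f"{fused:.2f}")  # 0.86
```

Real systems go well beyond this, weighting sensors by reliability and correlating tracks over time, which is part of why combining radar, optronics and acoustics is one of the project’s stated challenges.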

The consortium also brings together:

– Thales, Telecom SudParis, the CEA Leti and Exavision, which contribute sensing, geolocation and neutralisation resources to the project;

– EDF, which contributes its knowledge of the risks and operational needs, as well as initial feedback on the proposed solutions. As both a partner and end user, EDF will also coordinate the operational demonstrations at the end of the project by making its representative infrastructure available;

– The Paris Institute of Criminology, which ensures the compliance of the new system with the applicable agreements, laws and regulations;

– A committee of end users, including the Air Force, the Navy, the Directorate General of the National Gendarmerie and the Paris Police Headquarters.


* UAS = Unmanned Aerial System


Pentagon: We Can’t Afford to Replace Aging ICBMs, Bombers, Subs

April 14, 2015 By Marcus Weisgerber


Even if sequestration goes away, the Defense Department will need at least $10 billion more per year to modernize its nuclear arsenal, procurement chief says.

The Defense Department cannot afford to replace its aging nuclear-missile submarines, ICBMs, and long-range strategic bombers unless it gets a funding boost or radical policy changes are made, according to a top Pentagon official.

Even if Congress approves the White House’s 2016 budget for the Pentagon, the department will find itself $10 billion to $12 billion short beginning in 2021, Frank Kendall, undersecretary for acquisition, technology and logistics, said Tuesday at the Navy League’s Sea Air Space conference.

“We don’t have a solution to that problem right now,” Kendall said, asked by a reporter about the plan for funding a replacement for the Navy’s Ohio-class submarines. All more than 30 years old, these “boomers” slip silently around the world’s oceans, waiting for the command to launch their nuclear-tipped ballistic missiles.

The entire U.S. nuclear inventory needs to be upgraded by the 2030s, Pentagon officials say. That includes the submarines, land-based Minuteman intercontinental ballistic missiles and new stealth bomber. Together, the three make up the “nuclear triad,” the cornerstone of America’s deterrent force. The Pentagon also says it needs a new nuclear cruise missile.

Right now, more money toward the end of this decade appears to be the only way to fix the problem, Kendall said.

“I frankly think the only way we can address it and keep the force structure in the Department of Defense that we would like to have is higher budgets,” he said. “I really don’t right now see any management solutions that will help us.”

Defense Secretary Ashton Carter has put together a Pentagon group, the Strategic Portfolio Review, to look at ways to address the funding shortfall and nuclear modernization.

Pentagon officials say federal budget caps, commonly called sequestration, already financially hamstring them.

The Air Force intends in coming months to pick a builder for its new bomber, of which it plans to buy 80 to 100 for a planned $550 million apiece. However, the aircraft is not expected to carry nuclear weapons when it is declared battle-ready sometime in the next decade.

Cost estimates for the Navy’s new Ohio-Class replacement submarine range from $5.5 billion to $8 billion each. The Navy plans to buy 12 new subs.

When the new ICBMs and new cruise missiles are included, the total price tag for the nuclear arms upgrade ranges from hundreds of billions of dollars to $1 trillion over the next two decades.
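A rough, unofficial back-of-envelope using only the per-unit figures reported in this article shows how quickly the totals climb, before the ICBMs and cruise missiles are even counted:

```python
# Figures from the article; all amounts in billions of dollars.
BOMBER_UNIT_COST = 0.550                 # planned price per new bomber
SUB_UNIT_LOW, SUB_UNIT_HIGH = 5.5, 8.0   # Ohio-replacement estimates per boat

# The Air Force plans 80 to 100 bombers; the Navy plans 12 submarines.
bombers_low, bombers_high = 80 * BOMBER_UNIT_COST, 100 * BOMBER_UNIT_COST
subs_low, subs_high = 12 * SUB_UNIT_LOW, 12 * SUB_UNIT_HIGH

print(f"Bombers:    ${bombers_low:.0f}B - ${bombers_high:.0f}B")
print(f"Submarines: ${subs_low:.0f}B - ${subs_high:.0f}B")
```

Bombers and submarines alone land in the $110 billion to $151 billion range, consistent with total estimates that reach into the hundreds of billions once the rest of the triad is included.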

That leaves the Pentagon with an affordability problem that could force policy changes.

“There are radical policy changes you can talk about, like changing the nature of the triad, that would help significantly,” Kendall said.

While academics and military strategists often debate removing one of the legs to the triad, the Pentagon has stuck with the three in all of its strategy reviews in recent years.


Chinese ASAT Test Was ‘Successful:’ Lt. Gen. Raymond

By Colin Clark

on April 14, 2015 at 6:34 PM


COLORADO SPRINGS: We’ve known for some time that China conducted an anti-satellite test on July 23 of last year, but we learned today that the test was “successful” even though it didn’t destroy anything.

China has successfully placed low earth orbit satellites at risk, Air Force Lt. Gen. Jay Raymond told an overflow audience at the annual Warfighters Lunch at the Space Symposium here. “Soon every satellite in every orbit will be able to be held at risk,” the head of the 14th Air Force said.

China has claimed the test was for missile defense and noted that nothing was destroyed in the test. Raymond clearly wanted to dispel that impression and make certain everyone in the space community knew that China had executed another ASAT test and that it had worked.

In other news, Raymond told the lunch that, for the first time, a foreign military officer has been placed in command of US troops at the highly classified Joint Space Operations Center, the nerve center of America’s military and spy satellites. (The NRO operates America’s spy satellites, but the JSPOC watches all.)

Until the last decade, foreign officers were not incorporated into most of America’s space facilities. Then we began bringing in Canadian, British and Australian officers to Strategic Command and Air Force Space Command, culminating in the appointment of a Canadian officer as head of a shift at the JSPOC.

Raymond told me after the lunch that he didn’t want to oversell the importance of the move, but he did call it “significant.” This is part of a longer-term commitment by the United States to work with allies on space issues. Last year, Australia, Britain, Canada and the United States signed a symbolically important Memorandum of Understanding committing them to “a partnership” on combined space operations. “We have got to develop the partnership with our allies. We can’t do this alone,” Raymond said.


Investigators warn airplane computers could be hacked

Wed, 04/15/2015 – 9:15am

by ANNE FLAHERTY, Associated Press


WASHINGTON (AP) — The same Internet access now available on most commercial flights makes it possible for hackers to bring down a plane, a government watchdog warned Tuesday.

The finding by the Government Accountability Office presents chilling new scenarios for passengers. The report doesn’t suggest it would be easy to do, or very likely. But it points out that as airlines and the Federal Aviation Administration attempt to modernize planes and flight tracking with Internet-based technology, attackers have a new vulnerability they could exploit.

A worst-case scenario is that a terrorist with a laptop would sit among the passengers and take control of the airplane using its passenger Wi-Fi, said Rep. Peter DeFazio, an Oregon Democrat on the House Transportation and Infrastructure Committee who requested the investigation.

“That’s a serious vulnerability, and FAA should work quickly” to fix the problem, DeFazio said.

The avionics in a cockpit operate as a self-contained unit and aren’t connected to the same system used by passengers to watch movies or work on their laptops. But as airlines update their systems with Internet-based networks, it’s not uncommon for Wi-Fi systems to share routers or internal wiring.

According to the report, FAA and cybersecurity experts told investigators that airlines are relying on “firewalls” to create barriers. But because firewalls are software, they could be hacked.

“According to cybersecurity experts we interviewed, Internet connectivity in the cabin should be considered a direct link between the aircraft and the outside world, which includes potential malicious actors,” the report states.

The GAO released a separate report last March that determined the FAA’s system for guiding planes and other aircraft also was at “increased and unnecessary risk” of being hacked.

One area of weakness is the ability to prevent and detect unauthorized access to the vast network of computer and communications systems the FAA uses to process and track flights around the world, the report said. The FAA relies on more than 100 of these air traffic systems to direct planes.



California’s Drought Grabs Headlines, But Other States Face Water Woes Too

April 16, 2015

By Elaine S. Povich


Skiers last month cross a creek in a part of Sequoia National Park that is usually covered by feet of snow through March. California’s drought has attracted the most attention, but many other states are facing similar problems. (AP)

With all the attention focused on California’s water woes, an observer might conclude that the Golden State’s drought is the exception. It isn’t. Forty states expect to see water shortages in at least some areas in the next decade, according to a government watchdog agency.

In a 2013 survey by the Government Accountability Office (GAO), state water managers from around the country said they expect freshwater shortages to continue into the next decade, even under what they described as “average” conditions. If those conditions change—whether because of rapid population growth, unusually low snowfall or rainfall, or accelerated economic growth—the situation could worsen.

“As far as other states, if they haven’t seen it in the past, it’s something they will see in the future,” said Ben Chou, a water policy analyst in the Los Angeles office of the Natural Resources Defense Council, an environmental group.

Late last month, Democratic California Gov. Jerry Brown implemented mandatory water conservation rules. The rules ban all restaurants, bars and hotels from serving water unless customers ask for it, ban the watering of lawns and landscaping within 48 hours of measurable rain, and require municipalities and private companies to limit lawn watering to two days a week.

The lawn-watering limits are likely to have the most impact, since outdoor irrigation makes up 44 percent of water use in the state’s urban and suburban communities, according to the state water board. The past three years have been the driest in California history dating to the 1849 Gold Rush, the board said. Low snowpack, combined with 2014 being the hottest year in the state’s recorded history, exacerbated the situation.

Residents of other states should take a lesson from California, Chou said. “Attention on California is due to the size of the state and the fact that we grow about half of the nation’s produce,” he said. “Other states have started looking at it, especially where water supplies are at a premium.”

Montana, for example, was listed in the GAO report as the state most likely to have a statewide water shortage in the next decade. Many other western states are in a similar predicament.

Tim Davis, Montana Water Resources Division administrator, said his department told the GAO that in any given year, any part of the state could have a water shortage. Last year, most of the state was fine, but the southwest corner was dry. Entering this irrigation season, Davis said, the southwest region remains dry, as is “much more of the state.”

Under direction from the Montana Legislature, Davis said his department has begun to plan for water shortages.

“Drought is one of those disasters that you have to plan for,” he said. “You can’t just immediately go out there and change how you’re using water on the ground or invest in efficiencies unless you have been doing it all along.” He said the state is making plans to share water between communities during times of drought, along with changing field irrigation methods to save water.

About 36 percent of Texas is experiencing moderate to exceptional drought, according to Robert Mace, deputy executive administrator of the Texas Water Development Board. That actually represents an improvement. In 2011, 100 percent of the state was experiencing drought, the first time that had happened since the state started keeping records in the 1880s.


The crisis grabbed the attention of Texas lawmakers, and in 2013 they sent a referendum to voters calling for $2 billion of the state’s rainy day fund to be used to leverage $27 billion in bonds to implement a state water conservation plan. The referendum passed with 74 percent of the vote.

Mace said the plan is being implemented as water demands in the state continue to grow. He said the population of Texas is expected to increase by 80 percent in the next 50 years. “We are working with water providers, water managers, to try to plan for enough water in case of a repeat of the ‘drought of record,'” he said. “We are taking a cold, sober look at what the needs (for) water are going to be.”

The plan includes allowing bordering communities to tap water from each other, and state money for emergency well drilling. “The communities that really struggle with drought are the smaller communities,” he said. “(State officials) will go out and make suggestions on how to run their systems more efficiently.”


Money Troubles

In Kansas, drought conditions also are a little better this year. Last year, 93 percent of Kansas was facing severe drought, according to the U.S. Drought Monitor, but this year things have improved slightly, according to Tracy Streeter, director of the Kansas Water Office. The improvement is mostly due to more rainfall, which varies so much from year to year that the state is forced to plan for the worst. The worst recorded drought was in 1952-57. “We try to benchmark to that point,” Streeter said.

In a report to Republican Gov. Sam Brownback completed in January, the water office, along with other agencies and citizen groups, painted a bleak picture. The report notes that the Ogallala Aquifer is declining faster than it is recharging, and that reservoirs are filling with sediment. If nothing changes in the next 50 years, the report predicted, the Ogallala will be 70 percent depleted and the state’s reservoirs will be 40 percent filled with sediment.

The group called for the creation of a water resources subcabinet to advise the governor and a task force to develop financing for water resource management, including alternatives that utilize public-private partnerships.

Streeter said the subcabinet has been created and will have its first meeting soon. But Kansas’ struggle to fill an estimated $340 million budget gap has stalled progress on the task force recommendation that the state find money to implement water conservation plans.


East Not Immune

While water problems are worst in the West, the East is not immune. According to the GAO report, North Carolina and Delaware are the eastern states most likely to experience regional water shortages in the next decade.

Kansas’ Streeter says more states in the East are asking for his advice as they prepare for a future with less water. “Drought is insidious, it doesn’t land on you overnight, it creeps up on you,” Streeter said. “Agriculture feels it first—you can start having crop failures within weeks of having no rain. Everybody else takes months.”

According to the U.S. Drought Monitor, the western part of North Carolina is already “abnormally dry,” though it doesn’t yet meet the “drought” definition. Sarah Young, a spokeswoman for the North Carolina Department of Environment and Natural Resources, said state officials talk with the federal drought monitoring agency every week to assess the situation in the state.


It has been several years since North Carolina experienced drought conditions, but in 2007 and 2008, it suffered the worst drought in its history. That prompted a state law expanding the powers of the governor and local officials to respond. The law also encourages conservation and cooperation among communities. North Carolina is experiencing population growth and changing weather patterns that “leave us vulnerable to more serious droughts in the future,” according to the state water plan.


AT&T sues the FCC over net neutrality rules

Apr 15, 2015, 11:55am CDT Updated: Apr 15, 2015, 1:46pm CDT

Danielle Abril

Staff Writer-

Dallas Business Journal


AT&T is one of the first companies to sue the Federal Communications Commission over its recent net neutrality rules.

The Dallas-based company (NYSE: T) filed a lawsuit Tuesday, adding its name to a list of lawsuits from associations including the National Cable and Telecommunications Association and CTIA – The Wireless Association.

AT&T, like the trade groups, specifically takes issue with the FCC’s reclassification of the Internet as a telecommunications service, which would subject it to the same rules and oversight as the phone system.

AT&T had previously supported the FCC’s 2010 net neutrality rules.

The lawsuits have been surfacing since the FCC’s net neutrality rules hit the Federal Register on Monday.



Why do the Chinese Hack? Fear

Enrique Oti    

April 16, 2015


The Chinese government is scared of the Internet. It is scared of the foreign ideas the Internet brings into China; it is scared of how the Internet enables the Chinese people to spread knowledge about government corruption; and it is especially scared of how the Internet was used during the “color revolutions” and the “Arab Spring.” For the Chinese Communist Party (CCP), the Internet is a clear threat to its legitimacy, its monopoly on power, and its survival.

The leadership fears that the United States and its allies use the Internet as a strategic weapon. This fear is compounded by other perceived existential threats, such as a faltering economy and foreign intervention in Chinese sovereignty and territorial disputes. Paradoxically, the solution to these other threats is the rapid development of the Internet and Internet technologies. How the Chinese leadership addresses these competing imperatives — controlling the Internet, while also expanding its development and use — is having a major impact on the United States.

To ensure its survival, the CCP has decided that it must control the Internet it has, while building the Internet that it wants, and cyber espionage (hacking) is the most effective tool for achieving both. Chinese President Xi Jinping laid out the importance of the Internet to China’s survival during an Internet Security and Informatization Leading Group meeting when he stated, “No Internet safety means no national security. No informatization means no modernization.” Through this Leading Group, President Xi is personally directing the future of the Internet in China, creating comprehensive cyber strategies that address internal and external threats.

The strategy for controlling the Internet within China is best described using the words of the China Internet Conference: “mutual building of a favorable environment.” While China’s ability to filter content using the Great Firewall is the most overt aspect of this strategy, the Chinese also employ a wide range of influence measures to coerce their population to act in favorable ways. Proper online behavior is encouraged with education and propaganda, although some efforts, like this song, may not exactly hit the mark. This soft approach is surprisingly effective at encouraging the Chinese people to self-censor, but is also reinforced by harassment and arrests for activities contrary to the government and its policies, supportive of mass organization, or promoting the spread of rumors. The search for Chinese dissenters does not just affect Chinese citizens. To track down and arrest dissidents, Chinese hacking targets include U.S. email providers, journalists, and NGOs. As the use of their “Great Cannon” is showing, they have no reservations about launching prolonged denial of service attacks on major U.S. websites to limit their citizens’ access to unacceptable content. The Chinese government fears dissent, and will hack any entity they think supports or enables the undermining of the Communist Party.

The “favorable environment” strategy enforces internal security, but the CCP must also ensure development and prosperity to maintain its legitimacy. The United States experiences recessions and depressions without the risk of political collapse, but for China an economic slowdown would be disastrous to the ruling elites. To help keep the economy moving forward, the CCP released an Internet (informatization) development strategy in 2006 that emphasizes “indigenous development” and looks to replace foreign technology with Chinese products. The move to indigenous Internet technologies ensures that foreign technologies cannot be used to threaten Chinese security, but more importantly, it generates a new industry to provide employment and generate revenue. On the surface, this is entirely consistent with legitimate capitalism and competition, but as U.S. industries have learned, “indigenous development” is not truly indigenous (see this book for details). The hackers are from China, but the intellectual property, trade secrets, and research clearly originate from the United States and foreign companies that have had their computer networks exploited by the Chinese government, despite countless U.S. protestations.

The extensive economic espionage used to drive internal development also serves a secondary purpose of modernizing the Chinese military. The Chinese fear a repeat of their “Century of Humiliation,” when Chinese territory and sovereignty was torn apart by foreign intervention. In the modern era, China fears U.S. intervention in its “internal” affairs such as Taiwan, the Senkaku islands, and the various islands and rocks in the South China Sea claimed by Vietnam, the Philippines, and others. The People’s Liberation Army (PLA) is addressing this fear through a modernization strategy that directly counters the technological advantage of the U.S. military. The PLA is modernizing its weapon systems to compete with the United States, while at the same time developing cyber, electronic warfare, and space weapons to counter the U.S. military’s asymmetric advantages in those areas. The persistent hacking of the U.S. defense industry and Defense Department gives the PLA the critical data it needs to achieve its goals, and puts the ability of the United States to respond to a regional crisis at risk.

What does this mean for the United States? The widespread hacking that concerns the United States is not merely an attempt by China to gain economic advantage; to the Chinese government, it is essential to survival. Fear is a powerful motivator, and the CCP has determined that cyber security and development strategies, enabled by hacking, are the only way forward. The United States may protest China’s cyber activities and encourage them to take another path, but unless Chinese fears can be alleviated, or be replaced by an even greater fear, the Chinese have no reason to change a strategy that seems to be working in their favor. The Chinese hack because they need to, and with no room for compromise, the United States can expect extensive hacking to continue, and probably increase, in the foreseeable future.

Enrique Oti is an officer in the United States Air Force and a National Security Affairs Fellow at the Hoover Institution. He can be followed on Twitter at @Enrique_Oti. The views expressed in this paper are those of the author, and do not reflect the views of the United States Air Force or the Department of Defense.


Rasmussen Reports

What They Told Us: Reviewing Last Week’s Key Polls

Bottom of Form

Saturday, April 18, 2015


For the majority of Americans, the five-letter word “taxes” is dirtier than most words of the four-letter variety.

After all, most voters in surveys for years have called for a smaller government with fewer services and lower taxes over a more active one with more services and higher taxes.

Voters strongly distrust the way the federal government spends tax dollars and think there’s not enough oversight of how that money is spent.

Forty-seven percent (47%) of Americans believe filling out their income tax paperwork is worse than taking a trip to the dentist’s office.

Still, most were on schedule to file their income taxes by last Wednesday’s deadline, with half expecting a refund. But over half think they are paying more than their fair share.

Americans feel more strongly than ever that the middle class pays a larger share of their income in taxes than the wealthy do.

That’s why political candidates routinely promise to help the middle class with tax cuts. But most Americans think those candidates don’t even agree on who makes up the middle class, and they don’t believe their promises of cutting taxes. 

Just the opposite, in fact. President Obama in this year’s State of the Union address proposed $320 billion in tax increases on the wealthiest Americans, but most voters suspect this just means more taxes on the middle class as well.

A lot can happen in the nearly 19 months between now and Election Day, but right now voters agree the president and the Republican-led Congress are a problem for the presidential candidates of their respective political parties.

The president’s daily job approval ratings remain little changed.

But then looking ahead to next year’s presidential contest, voters expect more of the same: two candidates with whom they have very little in common.

Most voters think Hillary Clinton will be the next president of the United States, even though they have very mixed feelings about her.

Clinton leads the first two announced candidates for the Republican nomination, Ted Cruz and Rand Paul, but earns less than 50% support at this early stage.

Voters are less convinced that Bill Clinton will be a plus for his wife’s bid for the White House.

Senator Marco Rubio is the third Republican to officially announce his candidacy for president, and GOP voters think he has about the same shot as his announced rivals to win the party’s nomination.

Republicans and Democrats are tied on the latest Generic Congressional Ballot.

Voters are increasingly critical of the health care they get and predict it will get even worse under the new national health care law.

In other surveys last week:

— Twenty-nine percent (29%) of Likely U.S. Voters think the country is heading in the right direction.

— Nearly three-quarters of voters rate the quality of drinking water in their area as good or excellent.

— While California is making headlines with its mandated reduction in water usage, Americans are overwhelmingly confident in their own water supply. But many question whether their local governments are doing enough to protect it. 

— Voters remain critical of the nation’s public schools and still strongly favor giving parents choices when it comes to their children’s education.

— Americans still have a very high opinion of teaching, but more than ever say it’s not a job most people consider pursuing.

— Looks like the North and South still don’t see eye-to-eye on the Civil War 150 years after it ended, but one out of three Americans don’t even know when that cataclysmic conflict took place.

