DoD Reshapes R&D, Betting on Future Technology

Apr. 20, 2014 – 05:03PM | By ZACHARY FRYER-BIGGS


WASHINGTON — Defense budgets had been in decline for a decade when soon-to-be-president George W. Bush laid out his vision for the US military. In a 1999 speech, Bush argued that it was time for military research and development efforts to pursue big leaps, not incremental improvements.

“We will modernize some existing weapons and equipment, necessary for the current task,” he said in a speech at The Citadel. “But our relative peace allows us to do this selectively. The real goal is to move beyond marginal improvements — to replace existing programs with new technologies and strategies. To use this window of opportunity to skip a generation of technology.”

That era of relative austerity ended two years later, when the Bush administration launched two wars and shifted spending to more immediate battlefield needs. But with defense budgets once again in decline, there are remarkable parallels between Bush’s 1999 vision, outlined at the military college in Charleston, S.C., and Pentagon leaders’ R&D plan for the next few years.

Overall, DoD wants to keep spending on RDT&E — research, development, test and evaluation — relatively close to the $63 billion the department will spend in 2014. That’s about $36 billion less than the amount that will be spent on procurement in 2014. But under the president’s 2015 budget proposal, that gap would close to about $26 billion next year. As pressure increases on defense spending, leaders are trying to protect research and development funding.

But look closer. Within that flat RDT&E budget, a radical shift is underway. Under the 2015 Future Years Defense Plan, DoD would halve spending on System Development and Demonstration, taking it from about $20 billion in 2009 to below $10 billion by 2018.

SDD is one of the seven categories of RDT&E spending, which move in rough order from basic scientific research to operational testing. SDD is Category Five, which funds efforts to turn ideas and prototypes into producible, deployable weapons and gear.

The goal is to ensure that funding keeps flowing to basic research, which yields technology breakthroughs, and to early-stage development, which proves concepts with prototypes, the Pentagon’s R&D chief explained in an interview last fall.

The trade-off is that DoD will likely mothball many promising new technologies.

“We’re going to be asked to create more prototypes, but then not field them, to put them on a shelf,” said Al Shaffer, acting assistant secretary of defense for research and engineering.

The 2015 budget also keeps money in Categories Six and Seven, allowing the military to finalize the near-term advances that will tide it over until a new wave of funding allows DoD to field technologies now in embryo.


New Lines

To see how new money is flowing to these last-stage categories, look at the new lines for RDT&E in the 2015 budget request. Much of the $3.2 billion in these new lines would go to help test major ongoing programs: $874 million for the US Navy’s replacement for its Ohio-class ballistic-missile submarines, $197 million for Littoral Combat Ship mission modules, $214 million for the US Air Force’s Space Fence, $113 million for the US Army’s WIN-T battlefield network, $145 million for the Air and Missile Defense Radar.

The new-line total is double the $1.6 billion approved by Congress for 2014, which added more than $300 million to DoD’s request, largely in ballistic missile defense.

“That sounds like an attempt to finish off some programs to get the right marginal return out of the additional dollars you’ve got to invest,” said James Hasik, a senior fellow with the Atlantic Council. “It looks like you think you’re not going to undo sequestration, and you’re preparing for that enforced peace dividend. You’re going to cap a couple of things off, and then you’re going to wait to see when you have money.”

Once the current batch of systems has emerged from development, there aren’t a whole lot of big programs on the horizon. The Long Range Bomber program is being protected, largely as a boon to aircraft design teams who might otherwise atrophy, but much of the next era of R&D will likely take the shape Shaffer described: development of technologies that will be shelved.

But embedded in the Pentagon’s model for future spending is an effort to pursue leap-ahead technology, the type that Bush envisioned in 1999. That type of technology requires taking a bit more risk, something to which acquisition officers are notoriously averse.

DoD leaders are trying to change that, especially in the service branches.

“I think we have to be a little bit more risk-tolerant,” Shaffer said in a March 4 speech, the day the White House sent its 2015 budget request to Congress. “How we get there, I don’t know. One of the ways that we can get there a little bit is by, and we’ve seen a trend in this direction, protecting the investment in the places that tolerate more risk and failure, DARPA [Defense Advanced Research Projects Agency], at the expense of the service programs.”

The 2015 budget request for DARPA was roughly 5 percent over 2014 enacted levels.

“If the services don’t begin to do more high risk, then I think we will continue to see greater investment percentage in DARPA,” he said.

That emphasis on taking more risk isn’t limited to DoD funds. Pentagon leaders have been pressing the defense industry to invest more of its own dollars for a couple of years now.

That’s a message that hasn’t always been well received.

“If you’re only thinking about your quarterly report, which unfortunately our system tends to encourage people to do, you’re not going to think about this,” acquisition chief Frank Kendall said in a January interview. “I was at a company [in the 1990s] that understood the importance of R&D for its long-term health, and despite the drawdown — and we did take some cuts — there was still an emphasis on internal research and development and preparing for the next generation. I would hope that there’ll be a number of CEOs who feel that way.”

So far, there’s not much evidence of that. From 1999 to 2012, top defense contractors cut their company R&D spending by roughly one-third, according to an analysis by Defense News last year. While there have been a couple of signs that companies are starting to reconsider, broad reversal of the trend has yet to surface.


Making Testing Easier

So DoD has been trying to make it easier for companies to test technologies in realistic settings. For example, the Navy has turned the Special Operations Command’s Stiletto craft into a testbed. Run by Naval Sea Systems Command, Stiletto offers a very fast-moving (nearly 50 knots) floating box, a platform on which companies can plug in different systems and test them out at sea.

The ship, which its program manager described as the largest composite-material vessel ever built at the time of its construction in 2005, has a UAV flight deck, a small boat launch ramp, and an array of ports and connections for testing different sensors and other gear.

In 2013, 56 different systems were tested on the ship, some of them brought on board as little as two weeks after a company contacted the program office. Roughly two dozen ended up advancing to the point of procurement. It’s a platform that DoD hopes can offer a lower-risk test bed for companies.

“This is a chance to get their things in the field, and if they fail, the companies can go back to the drawing board and improve them,” said Glen Fogg, director of the Rapid Technology Office in Shaffer’s office. “If you put something on a regular Navy ship and it fails, that’s the end of it. Here, they can use that information to improve the product.”

Several products that failed have returned to the ship improved, and have gone on to further success, Fogg said.

But riding with reporters during a recent outing on the Potomac River here, Fogg offered a reminder of the endeavor’s fragility.

Not long after a standard safety briefing before departure, he said that because there aren’t enormous funds for the program, if any significant damage occurred, even on a calm day, they likely didn’t have the money to fix the boat.

“This is a one-off; there is no Stiletto 2,” he said. “If there’s a major incident or funding cuts, we’re going to have to figure out what to do.”

That logic could be applied to the larger R&D plan, as it — like the total DoD budget — rests on targets that may not be attainable. The Pentagon is looking to save nearly $100 billion over the next five years by trimming various costs. In addition, its vision of flat R&D spending over the coming years requires that Congress raise the hard-fought spending caps by $115 billion.

Kendall and Shaffer have repeatedly argued that R&D needs to be treated as a fixed cost, a steady investment that can be leveraged when greater funds are available to field equipment during buildups, but can’t be abandoned in lean times. During the sequester, when accounts were uniformly cut by a certain percentage, it wasn’t possible to protect R&D at the expense of other areas. In future years, their ability to make trade-offs may be tested by worsening budget math.

The notion of keeping R&D flat isn’t new; the 1990s defense downturn saw a similar effort. In 1992, procurement outpaced R&D by $41 billion, but the following year, that difference dropped to $22 billion.

Just protecting R&D funding isn’t enough, Hasik said.

“There’s a difference between spending money and spending money smartly,” he said. “There are folks out in the world who make the argument that you have to spread money around the world wildly, because money spent on research is just good because it just leads to development. This is not a compelling argument because there are dead ends against which you can continue to apply money and not get very far.”

Making the right bets will be critical if the Pentagon is looking for leap-ahead technology.

There’s one major difference between what Bush wanted for defense and what the Pentagon is facing in its restricted fiscal future: Bush wanted to skip a generation of defense technology, but he was willing to increase spending to do it. Current leaders are unlikely, given the financial pressures in Congress, to have that luxury.




Study: U.S. Combat Aviation Stuck in the Industrial Age

By Sandra I. Erwin


U.S. combat air forces are ill equipped to fight a technologically empowered enemy, and it could be years or decades before the Pentagon deploys more advanced weapons. Such is the grim picture painted in a new study by the Center for Strategic and Budgetary Assessments.

The authors, retired Air Force Lt. Gen. David Deptula and CSBA analyst Mark Gunzinger, make the case that aviation forces are not up to the challenges of 21st century warfare and the Pentagon has only itself to blame.

“Fourteen years into the 21st century, the U.S. military is still living off investments in combat aircraft that were made prior to or during the Reagan administration,” Gunzinger told an Air Force Association forum in Arlington, Va.

For instance, the Air Force’s combat force primarily consists of aging A-10s, F-15s, F-16s, B-1s, B-52s, B-2s, and a handful of new F-22s. “Overall, the Air Force’s combat force is the smallest and oldest that it has ever fielded,” he said.

Shortsighted Pentagon budget decisions have weakened the aviation fleet, the authors contend. The United States Air Force only has a small number of its most advanced aerial weapons — the B-2 bomber and the F-22 fighter jet — and the next generation of systems is still years away.

The Pentagon terminated production of the B-2 bomber in 2000 at 20 aircraft and the F-22 stealth fighter in 2010 at 187 airplanes. The thinking was that these aircraft were too expensive and soon would be replaced with more affordable alternatives. “Apparently this saved money,” Deptula said with sarcasm. In hindsight, the military is paying a big price for these decisions, he said, because new systems are far more expensive and nowhere close to being ready. “Numbers matter,” he said.

The Air Force is buying new aircraft today, but most are cargo planes or unmanned surveillance drones. The military has more than 11,000 unmanned aircraft, but most are not equipped to survive enemy air-defense missiles.

Although no enemy air force has yet challenged the United States, the study predicts it is only a matter of time before the U.S. military is put to the test.

The risk posed by enemy technologies also applies to the Navy and Marine Corps, the study noted. The Corps continues to rely on non-stealthy AV-8B vertical/short takeoff and landing ground attack aircraft that were designed in the 1970s. The replacement F-35B Joint Strike Fighter is still in development.

“The Navy’s fixed-wing combat aircraft force is not as old as the Air Force’s because it is just completing its F/A-18 fighter program,” Gunzinger said. “However, the F/A-18 is non-stealthy, and the wisdom of deploying carriers within range of anti-ship ballistic and cruise missiles so their short-range fighters can reach their objective areas is doubtful at best.”

With the exception of the F-22s and B-2, the Pentagon’s fighters and bombers have “lost their ability to operate in high-threat areas without the risk of significant losses or the need for very large supporting force packages to suppress enemy air defenses,” the CSBA study said. “America’s recent focus on counterinsurgency operations has given China, Iran, North Korea, and other competitors breathing room to develop anti-access, area-denial (A2/AD) capabilities that could threaten U.S. access to areas of vital interest,” the report said. “The proliferation of guided ballistic and cruise missiles, anti-satellite weapons, cyber threats, integrated air defense systems and other asymmetric threats are intended to erode the U.S. military’s ability to effectively intervene in crisis situations.”

The term A2/AD is Pentagon-speak for an enemy’s ability to neutralize the traditional advantages of U.S. weaponry. Command and control networks may be susceptible to electronic jamming. Air bases may be vulnerable to precision-guided missiles, and U.S. non-stealthy aircraft — manned and unmanned — may not be able to enter hostile airspace without risking major losses. “Enemy antiship ballistic and cruise missiles that are supported by space-based sensors and long-range surveillance aircraft may force U.S. aircraft carriers to operate a thousand miles or more offshore,” the study said.

To overcome enemy technologies, the U.S. military needs more than just new hardware, Deptula said. It needs its weapon systems to operate like a network, where information is shared across all services. The Pentagon has championed for decades the idea of “network centric” warfare, but in reality each service and each program operate independently, he said. “We’re in the era of information-age warfare, and we are having a bit of a challenge managing that transition,” said Deptula. “We need to think how we can better share information that turns into relevant knowledge and we need to do it automatically.” The Pentagon functions in budget-line items, not as an integrated enterprise, he said. “You need to get beyond the traditional labeling of systems, which is last century’s perspective. We need to think about how all systems in space, land and sea and air can operate in an integrated fashion.”

The need for information-focused weapon systems will be the subject of an 18-month study by the Air Force Association’s Mitchell Institute, which Deptula leads. The military should aim for a “combat cloud” where information can be shared regardless of which weapon system is used, he said. “It is difficult to explain, and that is one of our challenges, as it is not a ‘thing’ or even a collection of ‘things,’ but rather an approach,” Deptula told National Defense. Aircraft today are connected as sensors and shooters. “While this limited collaboration is positive, future developments in data sharing promise to dramatically enhance the way in which combat effects are attained as individual airborne assets are fully integrated with sea, land, space and cyber systems,” he explained. “Individual systems connected to the broader ‘combat cloud’ are able to leverage their respective strengths.”

To move its weapons into the 21st century, the Pentagon also needs help from Congress, Deptula added.

If the recent round of military oversight hearings on Capitol Hill is any guide, Congress is less worried about the modernization of the U.S. fleet than it is about protecting favored projects and jobs in members’ home districts.

Air Force leaders have argued that, in times of declining budgets, they cannot afford to continue to sink money into aging airplanes and should redirect funding to new systems such as the F-35, a refueling tanker and a long-range bomber. While the plan sounds reasonable in theory, it has turned into a political football. Air Force officials have been hammered by lawmakers for their proposal to retire the entire A-10 fighter force, 46 older C-130 aircraft and the entire U-2 reconnaissance aircraft fleet.

These budget quagmires only keep the military saddled with older technology, the CSBA study said, and contribute to the erosion of the nation’s manufacturing base. “Fifty years ago, the Defense Department was in the process of building six fighters, three bombers, and two antisubmarine warfare aircraft,” said the report. Today, there is one new American fighter in production — the F-35 — and three that are about to end their production runs. “With the exception of the Air Force long-range bomber, the Navy’s P-8 maritime aircraft, and possibly a carrier unmanned combat aircraft, there are no other major new combat aircraft in the Defense Department’s program of record.”


Report urges building resilience to future cyber shocks

Monday, April 21, 2014


Institutions worldwide must confront the high risk that stems from their dependence on information technology and build resilience to withstand future global shocks to the Internet — a point underscored by the recent Heartbleed vulnerability, according to a new report from the Atlantic Council and Zurich Insurance Group.

Although the Internet has long been resilient to attacks on a day-to-day basis, risk managers, corporate executives, board directors and government officials are not prepared for future cyber attacks that will significantly impact globally interconnected systems, states the report, “Beyond Data Breaches: Global Interconnections of Cyber Risk.” The assessment likens the looming problem to the subprime mortgage crisis.

The Heartbleed security flaw, which recently stunned security experts worldwide and has widespread implications, is a harbinger of future shocks, according to report author Jason Healey, the head of the Atlantic Council’s Cyber Statecraft Initiative.

“The recent Heartbleed vulnerability demonstrates the main message of the report,” he said in a statement. “The Internet is so complex and tightly coupled to the real world, it turns out we were all gravely exposed to a cyber risk in an obscure technology that few understand and we didn’t see coming. This time it was just passwords, but what happens once the internet is connected to the electrical grid or driverless cars?”

The report argues that the private sector should spearhead crisis management because government lacks the agility needed. It also calls for organizations with system-wide responsibility to plan on ensuring the stability of the system as a whole, as opposed to the individual organization. In addition, the report advocates having redundant power and telecommunications suppliers and alternate Internet service providers connected to different peering points; investing in trained teams ready to respond with defined procedures; and conducting simulations of the most likely and most dangerous cyber risks to better prepare.

Further, the study argues for cautiously using regulatory authority to expand risk management to third-party providers and affiliates. It cites the example of the U.S. Office of the Comptroller of the Currency’s 2013 guidelines requiring national banks to increasingly look for risk outside their own perimeters, particularly for “critical activities.”

“Other regulators can consider whether such a model — surely costly but which does address many of the aggregations of cyber risk — is appropriate for their sector,” the report states. “A regulatory focus on external aggregations of cyber risk can make sense for critical infrastructure sectors like finance, but probably not for retail or small- and medium-sized enterprises.”

The report advocates the use of cyber insurance, calling it an option that will become increasingly available to all companies and not only larger, more sophisticated organizations.

Recently completed surveys of corporate leaders show an “increasing level of awareness” about cyber risk and the related insurance product areas, Dan Riordan, CEO of Zurich Global Corporate in North America, said on April 16 during a panel discussion at the Atlantic Council.

Companies are looking more at cyber risks in their risk assessments, he said, noting about 90 percent of larger companies recognize the problem as a high or very high risk. The share of companies buying cyber risk coverage has risen to about 52 percent among those recently surveyed, up from about 35 percent four years ago, he said. The major risk to companies’ reputations is getting attention in the C-suite, Riordan said.


“I talked to a CEO recently who said ‘I don’t want to be in front of Congress like that other executive was recently,'” he said.

Some CEOs still simply defer the issue to the IT department, but that will increasingly change over time, he continued.

Catherine Mulligan, senior vice president with Zurich North America, cited different mindsets in different industries. Financial, healthcare and technology institutions, she said, “tend to be very clued in to these issues” and “they are highly regulated in a lot of cases, so there’s more awareness there than in, for example, higher education, which comes from an ethos of let’s share information.”

Further, leaders of small and mid-sized companies still have a “disconnect” and face a steeper learning curve, she said. — Christopher J. Castelli


Stop Asking the Military to Do More With Less

U.S. defense spending is shrinking, but demands on the military remain the same.

By Mackenzie Eaglen
April 18, 2014

Last week, the military’s vice chiefs told Congress that their ability to fight two wars at once was at risk. They warned that ongoing pressure, including from the 2011 Budget Control Act and its substantial defense cuts, is eroding the size and capability of America’s armed forces. As Army General John Campbell cautioned bluntly, “We’re mortgaging the future.”

While the vice chiefs are correct that fewer resources are having a profound and negative impact upon the ability of the Department of Defense to support the nation’s defense strategy, the unfortunate reality is that the military’s ability to fight and win two wars at once has been steadily eroding for the past 20 years under presidents of both parties.

The “two-war standard” has long been an important measuring stick for the military to roughly approximate the forces necessary to provide the most options to the commander in chief in response to questions of war and peace. The 1993 Bottom-Up Review articulated the clearest thinking behind this policy: “U.S. forces will be structured to achieve decisive victory in two nearly simultaneous major regional conflicts and to conduct combat operations characterized by rapid response and a high probability of success, while minimizing the risk of significant American casualties.”

Over the ensuing two decades, this standard was gradually wound down over successive Pentagon strategies. In 2002, for instance, then-Secretary of Defense Donald Rumsfeld announced that DOD was moving away from the two-war standard in favor of a more “balanced” approach that deemphasized occupation forces.

The Pentagon’s latest strategy continues moving the goalposts by calling for a force sizing construct designed to defeat one enemy while denying the objectives or imposing unacceptable costs on a second. The strategy’s murky language leaves it open to interpretation regarding how to impose “unacceptable costs” and the requirements to do so.

The Pentagon’s planning construct is important because only a military of a sufficient size and reach can carry out day-to-day missions such as disaster relief, regional deterrence and crisis response, as well as a major campaign should the need arise.


The worry is that the U.S. military’s strategic aims are shrinking along with global presence and combat capabilities, but policymakers are not correspondingly reducing the military’s scope of responsibilities in support of vital national interests.

This growing gap between what the nation demands of the military and what its capacity, capability and readiness will allow, thanks to reduced budgets, will eventually lead to unacceptable outcomes and consequences, many of which will be borne uniquely by those in uniform and their families. The good news, however, is that these outcomes are avoidable should Washington’s leaders choose to reverse course and rebuild American military strength.


Revoked certificates cause issues after Heartbleed

Robert Lemos, Contributor

Published: 18 Apr 2014

Since the announcement of Heartbleed, a serious flaw that could allow an attacker to access passwords, encryption keys and other information sent to a server using OpenSSL, many companies have scrambled to patch their systems and revoke security certificates. While patching hundreds of thousands, if not millions, of systems will take time, the impact of the massive re-registration of SSL certificates and the wide use of revocation could be equally severe.


When content-distribution firm CloudFlare updated its Secure Sockets Layer (SSL) certificates to protect its clients from the impact of the Heartbleed bug, that action alone caused minor palpitations across the Internet.

Roughly 50,000 keys were no longer trustworthy, and the size of the certificate revocation list (CRL) suddenly ballooned by a factor of more than 200, to nearly 4.7 megabytes, overnight. The CRL is one of two ways that browsers can check that an SSL certificate, widely used to secure communications between browsers and websites, has not been compromised. Just the bandwidth costs of distributing the new CRL file to browsers would likely surpass $400,000, according to CloudFlare’s calculations, while a check using Amazon’s Web Services put the figure closer to $1 million.
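A quick back-of-envelope calculation shows how a 4.7-megabyte CRL can plausibly reach CloudFlare’s cost range. The download count and per-gigabyte egress price below are illustrative assumptions, not figures from CloudFlare’s analysis:

```python
# Rough estimate of the cost of distributing a bloated CRL.
# Both the number of downloads and the per-GB price are assumed,
# illustrative values; only the 4.7 MB file size comes from the article.
CRL_SIZE_MB = 4.7
downloads = 1_000_000_000   # hypothetical number of browser fetches
price_per_gb = 0.09         # assumed CDN egress price, USD per GB

total_gb = CRL_SIZE_MB * downloads / 1024
cost = total_gb * price_per_gb
print(f"~{total_gb:,.0f} GB transferred, ~${cost:,.0f}")
```

Under these assumptions the bill lands at roughly $413,000, in the same neighborhood as CloudFlare’s $400,000 estimate; a higher-priced transfer service pushes it toward the $1 million AWS figure.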

Certificate authorities (CAs) such as Comodo, Symantec and Trustwave are currently allowing companies to revoke and re-issue certificates for free. For those CAs, the slow response to Heartbleed by much of the business world is a mixed blessing, according to Comodo Chief Technology Officer Robin Alden. The costs of dealing with the mass revocations will be high, but the uneven response has given certificate authorities a chance to keep up with requests.

“In spite of all the news out there, plenty of our customers are only just starting to respond,” Alden said. “It is not a good thing for Internet security as a whole, but at least the fact that they are taking time to respond spreads out the load of re-issuing certificates.”

With analysis firm Netcraft estimating that some 500,000 sites use versions of OpenSSL that are vulnerable to Heartbleed — and with many companies using private SSL certificates inside their own networks — other certificate authorities could face similar floods of requests and burgeoning revocation lists, taxing the certificate infrastructure that underpins much of the Internet’s security.

“Beyond the cost, many CAs are not set up to be able to handle this increased load,” Matthew Prince, CEO of CloudFlare, stated in an analysis of the costs of mitigating Heartbleed. “Revoking SSL certificates threatens to create a sort of denial-of-service attack on their own infrastructures.”


Revocation process lacking authority

Despite playing a critical part in SSL’s trust model, certificate revocation has uneven support. Before Heartbleed, revocation was an uncommon event and most users rarely encountered a revoked certificate. As the attacks on the Canada Revenue Agency and widespread detection of scans for the Heartbleed vulnerability show, attackers are now using the vulnerability to try to gather information on passwords and certificates.

Those attacks make revocation, and the support for blocking revoked signatures, extremely important, said Brian Trzupek, vice president at security services provider and certificate authority Trustwave.

“The concern is that the revocation mechanisms en masse have not seen this kind of test on them in such quantities,” Trzupek said.

When a certificate is compromised, CAs can choose one of two ways to communicate that untrustworthiness to Internet users and browsers. Browsers can either make a request every time they encounter a new certificate – usually when the user goes to a new website — using the Online Certificate Status Protocol (OCSP) to check whether the certificate has been revoked, or they can occasionally download a copy of the CRL and use the list to determine the trustworthiness of an SSL certificate.

The OCSP method requires less bandwidth for each additional lookup, though potentially more over time, while browsers that utilize CRL require fewer, but larger downloads to make the same certificate check.
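The bandwidth tradeoff between the two checking methods can be sketched with a toy model. The sizes below are illustrative assumptions (the 4.7 MB post-Heartbleed CRL size comes from the article; the 2 KB OCSP response size is assumed):

```python
# Toy model of the two revocation-checking strategies:
# a CRL is one large periodic download; OCSP is a small
# request/response for every new certificate encountered.
CRL_SIZE_KB = 4700      # approximate post-Heartbleed CRL size (per article)
OCSP_RESPONSE_KB = 2    # assumed size of a single OCSP response

def crl_bandwidth_kb(refreshes: int) -> int:
    """CRL: pay for the full list on every refresh; lookups are then free."""
    return CRL_SIZE_KB * refreshes

def ocsp_bandwidth_kb(new_certs_seen: int) -> int:
    """OCSP: pay a small per-certificate cost for every new site visited."""
    return OCSP_RESPONSE_KB * new_certs_seen

# A browser refreshing the CRL once vs. OCSP-checking 500 new certificates:
print(crl_bandwidth_kb(1), ocsp_bandwidth_kb(500))  # 4700 1000
```

The crossover is the point the article describes: OCSP stays cheap per lookup but its total grows with browsing activity, while the CRL cost is fixed per refresh no matter how many certificates are checked, and it exploded when mass revocation inflated the list.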

Different browser makers use different methods. Google’s Chrome, for example, synthesizes its own lists from OCSP and CRL information, while other browsers like Mozilla’s Firefox have stopped using CRLs altogether. Regardless of which method browsers ultimately rely on, the sheer number of revoked certificates in recent weeks — the SANS Institute’s Internet Storm Center pegged the increase in recent revocations at somewhere between 300% and 500% — is creating a wave of traffic that is likely to cause issues, particularly for mobile devices with less processing power and memory than traditional PCs.

“All of a sudden you are going to have something on a wireless or cellular connection downloading, not a 2 kilobyte CRL, but something that is 3 or 4 megabytes,” said Trustwave’s Trzupek. “And that is going to put a lot of strain on these devices.”

Even worse, the revocation process may not be delivering that much in the way of security benefits for many users, according to Michael Klieman, senior director of product management for Symantec’s Trust Services team, because most browsers do not prevent a user from accessing a site with a bad, or revoked, certificate.

“Blocking user access to websites is not in the browser makers’ interest,” Klieman said, “but from a security standpoint, it has to be done to protect users.”


Heartbleed’s long tail

Unfortunately, the issues surrounding Heartbleed are unlikely to be resolved soon. While security firms have urged companies to change their keys, Trzupek said the OpenSSL vulnerability will likely remain unpatched on many platforms. Some systems, such as Web servers using the secure HTTP protocol (HTTPS), are obviously vulnerable, but so are a number of less visible Internet platforms, including mail servers and proxy servers. Millions of mobile devices running the Jelly Bean 4.1.1 version of Google’s Android operating system may also be vulnerable, though the practicality of Heartbleed-based attacks against such devices remains unclear.

Kevin Bocek, vice president of security strategy and threat intelligence for key management technology vendor Venafi, emphasized that businesses can’t address the problem of potentially compromised keys until after they’ve patched their systems, a reality he said many companies still do not understand.

“There have been a lot of misperceptions,” Bocek said. “People believe that they just need to patch public-facing systems, while some feel that all they need to do is reissue certificates.”

As part of the Heartbleed clean-up process, Bocek advised companies to uncover all of the systems where SSL keys may be used for security, prioritize the patches based on the criticality of each system and, once updated, create new keys for the systems. Even private CAs, used inside many large corporations to secure internal access to servers at a lower cost, should be patched and re-keyed, Bocek noted. Otherwise, any attacker that gains some level of access to systems inside a company will easily be able to compromise the entire network.

“It is the last-mile problem,” Bocek said. “You have keys and certificates used across all these applications, but you don’t know where they are used.”
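Bocek’s ordering (patch first, then re-key) suggests a simple triage check: a certificate whose issue date predates Heartbleed’s public disclosure on April 7, 2014, was likely never reissued. The sketch below is an illustrative helper, not any vendor’s tool; it assumes Python’s standard ssl module and the date format its `getpeercert()` call uses.

```python
import socket
import ssl
from datetime import datetime

# Heartbleed was publicly disclosed on April 7, 2014.
HEARTBLEED_DISCLOSED = datetime(2014, 4, 7)

def reissued_after_heartbleed(not_before: str) -> bool:
    """True if a certificate's notBefore date falls after disclosure.

    ssl's getpeercert() formats dates like 'Apr  9 12:00:00 2014 GMT'.
    """
    issued = datetime.strptime(not_before, "%b %d %H:%M:%S %Y %Z")
    return issued > HEARTBLEED_DISCLOSED

def check_host(host: str, port: int = 443) -> bool:
    """Fetch a live server's certificate and check its issue date."""
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port)) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            return reissued_after_heartbleed(tls.getpeercert()["notBefore"])
```

A pre-disclosure issue date is not proof of compromise; it is only a prompt to re-key once the underlying system has actually been patched.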


Microsoft Office whips Google Docs: It’s finally game over

By Preston Gralla

April 17, 2014 12:00 PM EDT

If there was ever any doubt about whether Microsoft or Google would win the war of office suites, there should be none any longer. Within the last several weeks, Microsoft has pulled so far ahead that it’s game over. Here’s why.

When it comes to which suite is more fully featured, there’s never been any real debate: Microsoft Office wins hands down. Whether you’re building entire presentations, creating complicated word-processing documents, or doing something as simple as handling text attributes, Office is a far better tool.

Until the last few weeks, Google Docs had one significant advantage over Microsoft Office: It’s available for Android and the iPad as well as PCs because it’s Web-based. The same wasn’t the case for Office. So if you wanted to use an office suite on all your mobile devices, Google Docs was the way to go.

Google Docs lost that advantage when Microsoft released Office for the iPad. There’s not yet a native version for Android tablets, but Microsoft is working on that, telling GeekWire, “Let me tell you conclusively: Yes, we are also building Android native applications for tablets for Word, Excel and PowerPoint.”

Google Docs is still superior to Office’s Web-based version, but that’s far less important than it used to be. There’s no need to go with a Web-based office suite if a superior suite is available as a native app on all platforms, mobile or otherwise. And Office’s collaboration capabilities are now quite considerable.

Of course, there’s always the question of price. Google Docs is free. Microsoft Office isn’t. But at $100 a year for up to five devices, or $70 a year for two, no one will be going broke paying for Microsoft Office. It’s worth paying that relatively small price for a much better office suite.


Google Docs won’t die. It’ll be around as second fiddle for a long time. But that’s what it will always remain: a second fiddle to the better Microsoft Office.


Microsoft to complete Nokia deal on Friday — and become big-time Android player

By Preston Gralla

April 21, 2014 11:42 AM EDT


When Microsoft completes its $7.2 billion deal to buy Nokia on Friday, it will be doing more than buying a mobile phone division and beefing up its commitment to Windows Phone. It will also become a big-time Android player.

Brad Smith, Microsoft General Counsel & Executive Vice President, Legal & Corporate Affairs, announced on the Official Microsoft Blog today that the Nokia deal will be finalized this Friday. The deal is essentially the same one that had been previously announced, with some minor tweaks, including that Microsoft won’t buy Nokia’s Korean manufacturing facility, that 21 Chinese Nokia employees will join Microsoft rather than stay with Nokia, and that Microsoft will manage Nokia’s website and its related social media sites for up to a year.

On the blog, Smith’s explanation of the reason for the purchase seems to be straightforward — it’s a way to help Windows Phone succeed. But if you read between the lines, you’ll see that it’s about more than that, and is about Android as well. He wrote:

“This acquisition will help Microsoft accelerate innovation and market adoption for Windows Phones. In addition, we look forward to introducing the next billion customers to Microsoft services via Nokia mobile phones.”

Note that when he talks about introducing Nokia’s customers to Microsoft services, he doesn’t say Windows Phone will be the means of introduction. Instead, he says that “Nokia mobile phones” will do it.

You can be sure that was no accident. Microsoft has finally recognized that its future isn’t in Windows, or Windows Phone for that matter, but in services, such as Bing, OneDrive, Bing Maps, and more. That’s copying the way that Google monetizes Android, and Microsoft recognizes it’s the way to monetize its own mobile and other efforts.

Expect Nokia to push its line of Android phones hard in the developing world, where it’s particularly strong. That’s the place where many of the “next billion customers” will be introduced to Microsoft services. Microsoft hopes that eventually those customers will migrate to Windows Phones. But if they don’t, Microsoft will be perfectly happy that they continue using Nokia-made Android phones with Microsoft services on them.

So this Friday, when the deal goes through, Microsoft will immediately become a big player in Android, world-wide.


US Air Force’s Secretive X-37B Space Plane Nears Day 500 in Orbit

By Leonard David, Space Insider Columnist | April 21, 2014 01:21pm ET


The U.S. Air Force’s mysterious robotic X-37B space plane is sailing toward the 500-day mark in Earth orbit on a secret military mission.

The X-37B space plane presently in orbit is carrying out the Orbital Test Vehicle 3 (OTV-3) mission, a classified spaceflight that marks the third long-duration flight for the unmanned Air Force spaceflight program. The miniature space shuttle launched on Dec. 11, 2012.

The record-breaking X-37B mission now underway uses the first of the Air Force’s two robotic space plane vehicles. This same space plane flew the first-ever X-37B mission (the 225-day OTV-1 flight in 2010), and a second vehicle flew the longer OTV-2 mission in 2011, chalking up 469 days in orbit.

X-37B space planes launch into orbit atop an unmanned Atlas 5 rocket from a pad at the Cape Canaveral Air Force Station in Florida. The first two space-plane missions flew back to Earth on autopilot, each time touching down on a tarmac at the Vandenberg Air Force Base in California.

Earlier this year, the X-37B supplier Boeing Space & Intelligence Systems announced plans to consolidate space-plane operations by using NASA’s Kennedy Space Center in Florida as a landing site for the space plane.


Intelligence-gathering space plane

An X-37B space plane is about one-fourth the size of a former NASA space shuttle and uses a deployable solar array for power. It weighs 11,000 lbs. (4,990 kilograms) and has a small payload bay about the size of the bed of a pickup truck.

Each X-37B spacecraft measures about 29 feet (8.8 meters) long and nearly 15 feet (4.5 m) wide, and has a payload bay that measures 7 feet (2.1 m) long and 4 feet (1.2 m) wide. The space plane can operate in orbits that fly between 110 miles (177 kilometers) and 500 miles (805 km) above the Earth.

The secret missions for X-37B space planes are carried out under the auspices of the Air Force Rapid Capabilities Office, and mission control for OTV flights are handled by the 3rd Space Experimentation Squadron at Schriever Air Force Base in Colorado.

This unit is billed as the Air Force Space Command’s premier organization for space-based demonstrations, pathfinders and experiment testing, gathering information on objects high above Earth and carrying out other intelligence-gathering duties.

And that may be a signal as to what the robotic craft is doing — both looking down at Earth and upward.


X-37B and U.S. military space

An X-37B robotic space plane sits on the Vandenberg Air Force Base runway during post-landing operations on Dec. 3, 2010. Personnel in self-contained protective atmospheric suits conduct initial checks on the robotic space vehicle after its landing.

Credit: U.S. Air Force/Michael Stonecypher

Just how the trio of clandestine X-37B missions might fit into the military’s strategic space plans remains a matter of speculation. However, recent congressional testimony from top U.S. military space officials underscores the overall fervor for “space control.”

Space control requires knowledge derived from satellite situational awareness to warn and assess threats that pose a risk to U.S. and coalition space operations, Lt. Gen. John Raymond, commander of the Joint Functional Component Command for Space, said before the House Armed Services Subcommittee on Strategic Forces earlier this month.

“Space control may also include threat avoidance, safeguarding of our on-orbit assets and the ability to mitigate electromagnetic interference,” Raymond testified.


Decision to declassify

Some analysts believe that the space-plane missions could be flying sensor gear useful for a recently declassified activity, the Geosynchronous Space Situational Awareness Program (GSSAP).

GSSAP will deliver two satellites for a single launch that are headed for near geosynchronous orbit (GEO). From that vantage point, they will survey objects in the GEO belt to track both known objects and debris and to monitor potential threats that may be aimed at this critically important region.

“Our decision to declassify this program was simple: We need to monitor what happens 22,000 miles (35,000 km) above the Earth, and we want to make sure that everyone knows we can do so,” testified Douglas Loverro, deputy assistant secretary of defense for space policy.

GSSAP satellites will communicate information through the worldwide Air Force Satellite Control Network ground stations, and then to Schriever Air Force Base, where the 50th Space Wing satellite operators will oversee day-to-day command and control operations.

This size chart shows how the Boeing-built X-37B robotic space plane compares to NASA’s space shuttle, a larger version of the spacecraft called the X-37C and an Atlas 5 rocket.


A strategic crossroad

The commander of Air Force Space Command, Gen. William Shelton, also testified at the same April 3 hearing, telling lawmakers he believed “we are at a strategic crossroad in space.”

Shelton, who first unveiled the once-classified GSSAP in February, said the two spacecraft expected to launch in 2014 will collect space situational awareness data, thus allowing for more accurate tracking and characterization of human-made objects in near-geosynchronous orbit.

“Data from GSSAP will contribute to timely and accurate orbital predictions, enhance our knowledge of the geosynchronous environment and further enable spaceflight safety to include satellite collision avoidance,” Shelton said.


More things to come

As an experimental spacecraft, the X-37B is a precursor of things to come, said Marshall Kaplan, a space consultant and principal at Launchspace Inc., a training group for space professionals based in Bethesda, Md.

“It gives a certain amount of flexibility that we haven’t had before,” given that the craft flies and lands without a crew, is able to be reused and can haul specialized payloads for certain types of surveillance and other types of missions related to national security, Kaplan said.


But given that the craft is lofted by an Atlas 5 rocket — an expensive boost — “what we really need now is a cheap booster … which we don’t have,” Kaplan said. “It’s the missing element.”

Kaplan said to keep an eye on the Defense Advanced Research Projects Agency’s (DARPA) Experimental Spaceplane (XS-1) program. DARPA seeks to lower satellite launch costs by developing a cheap, reusable first stage that would fly to hypersonic speeds at a suborbital altitude, he said.

“In the big picture of things, these two programs [X-37B and XS-1] could come together at some point in the future and be operational,” Kaplan said.


Fast follower

Whatever its utility, how an on-going X-37B program will play out in China is on the mind of Everett Dolman, professor of military strategy at the School of Advanced Air and Space Studies at Maxwell Air Force Base in Alabama.

One early indication is that China has purportedly pushed forward on its own “Shenlong” space plane planning.

“As far as the Shenlong is concerned, I am pretty much in agreement at this point that it is part of a broader ‘fast follower’ program similar to the Soviet Union’s adaptive approach in the Cold War,” Dolman said.

Just as the former Soviet Union felt a need to develop its own space shuttle — the remotely piloted Buran that only flew once — Dolman said “the Chinese probably are concerned about a sudden leap in technology or tactics that would give a decisive, if temporary, edge to the U.S. should it be unveiled at a critical moment.”

“By keeping a close watch and matching what appears to be a high-priority technological capability, the fast follower spends less on research and development and can, hopefully, close the technology gap quickly,” Dolman said.

It is a second-best strategy for long-term competition in business, Dolman said, adding that he’s not sure it is even that for potential combat scenarios. “But the People’s Republic of China obviously believes the U.S. is committed to the X-37B and doesn’t want to be left tying its shoes in the gate when the starting-pistol sounds,” he concluded.


North Dakota UAS test site will be first in US to start operations

by Press • 22 April 2014


GRAND FORKS – The announcement Federal Aviation Administrator Michael Huerta stopped in Grand Forks to make Monday was one stakeholders in North Dakota’s growing unmanned aircraft systems industry have been waiting months to hear.


“Today, the FAA is granting the first authorization in the United States to allow a test site to start flying unmanned aircraft,” Huerta told a group gathered in the University of North Dakota’s Odegard Hall. “And that test site is right here in North Dakota.”

The test site will be overseen by the Northern Plains Unmanned Air Systems authority and provide a place to research integration of UAS into general airspace, which is mandated for 2015.

The test site also would explore civilian uses and help craft certification requirements for unmanned aircraft, also known as drones.

Bob Becklund, head of the state’s UAS authority, said the test site’s personnel and basic infrastructure are already in place.

“We are ready. We’ve been working hard since our selection in December to be prepared,” he said. “We plan to start flying here in early May.”

The FAA was required to have at least one test site up and running 180 days after announcing its selections. North Dakota’s test site and five others in Alaska, New York, Nevada, Texas and Virginia are projected to operate until early 2017.

“North Dakota has really taken a leadership role in supporting unmanned aircraft,” Huerta said. “I look forward to the great contributions this state is going to make.”


Research use

Before unmanned aircraft can take off, North Dakota’s test site must pass a compliance check, according to Becklund.

If given the green light, the Draganflyer X4ES – a small helicopter-like UAS – will take to the skies. Its first missions won’t focus on human surveillance but rather on agriculture and ecology.

At North Dakota State University’s Research Extension Center near Carrington, the aircraft will monitor crop conditions and soil quality.

Later this summer, it will be used to generate population counts of deer, elk and bison at the Sullys Hill game preserve south of Devils Lake.

“North Dakota is a perfect spot for our nation to develop UAS technology and procedures, and help unleash the economic potential of this promising industry,” said Sen. Heidi Heitkamp, D-N.D.

In Minnesota, Northland Community & Technical College was approved last week for a certificate of authorization to fly unmanned aircraft in Roseau County.

Those aircraft also will be used for agricultural research on farmland, according to Northland staff.

The collaboration on UAS research initiatives between the two states will give North Dakota’s test site a leg up on the others, according to Sen. John Hoeven, R-N.D.

“This truly is a two-state test site,” he said. “Right out of the blocks that gives us a huge advantage.”


Privacy concerns

Huerta and others acknowledged the launch of the test site’s research projects could feed concerns surrounding drones’ potential to invade residents’ privacy.

Creating privacy requirements was part of the test site application, according to Huerta. Applicants crafted a privacy policy that complies with local, state and federal privacy laws. The policy also must be available to the public and updated annually.


“The FAA has not traditionally regulated the use of anything. We regulate that it can be flown safely, but we don’t say that you can fly this airplane from here to there,” Huerta said. “Personally, I think we’d be terrible at regulating privacy, it’s not in our wheelhouse, it’s not something we understand how to do.”

Creating regulations and limits for drone use eventually would fall to lawmakers and courts.

Despite some of these concerns, North Dakota has spent $14 million advancing UAS research and development, according to Gov. Jack Dalrymple.

“This could turn into a very big deal for our state,” he added.

Of that amount, $5 million was dedicated to support the launch of Grand Sky, a UAS tech park planned for Grand Forks Air Force Base. Aerospace company Northrop Grumman has already committed to being an anchor tenant for the facility.

The state’s financial investment combined with a high ethical standard for UAS research set by entities such as UND’s Unmanned Aerial Systems Research Compliance Committee, could attract other businesses, according to several speakers.

“The private sector will look at this example and say ‘This is where we want to invest, this is where we want to create jobs, this is where we want to open our next business,’ ” said Rep. Kevin Cramer, R-N.D.


US developing public unmanned aircraft incident reporting system

by Press • 22 April 2014

Michael Cooney


It sounds like a good idea – develop an online system for publicly reporting and disseminating problems or incidents stemming from the use of unmanned aircraft in the public airspace.

In practice, you’d have to wonder whether such a system would get much use, because it comes with just a few caveats.


Specifically, the Department of Justice’s National Institute of Justice is looking for proposals to develop, host, and maintain a web-based, online flight data and incident reporting system that would, subject to law enforcement and national security concerns and limitations:

1. Collect flight-operations data from law enforcement and other public safety agencies from their use of sUAS (small unmanned aircraft systems, defined as UAS weighing less than 55 lbs.).

2. Make that information publicly available for analysis by entities interested in the use of sUAS in the national air space.

3. In making this data readily accessible to the public, NIJ seeks to make possible further research and study of law enforcement and public safety sUAS flight operations, and through such research to improve the safety and increase the operational efficiency of law enforcement sUAS operations.


The NIJ says its ultimate goal for this solicitation – which could be worth $250,000 if a contract is awarded — is to foster the safe, effective, and lawful use of sUAS by law enforcement agencies.

The NIJ went on to say that while the actual system will be designed through a collaborative process involving the successful applicant, NIJ, the FAA and other stakeholder organizations, the applicant should propose its own system design.

Here’s where it gets tricky though.

From the NIJ: “Among other topics, the proposal should address scalability, as the number of agencies operating sUAS are expected to grow. It should also address what the applicant sees to be potential data fields. The proposal should also address the nature of the agreements the applicant anticipates entering into with the law enforcement and other public safety agencies providing the data that will populate the database, including the understanding that law enforcement and other public safety agencies may not be able to provide data due to law enforcement sensitivity or national security needs. The discussion should also identify how any sensitive information that may be provided would be protected, or how agencies with limitations due to law enforcement sensitivity or national security concerns can supply use and safety of flight information for missions without compromising sensitive or classified operations.”

The NIJ request comes on the heels of the FAA announcing that the first of six test sites chosen to perform unmanned aircraft systems research is operational more than two-and-a-half months ahead of the deadline specified for the program by Congress.

The FAA said it had granted the North Dakota Department of Commerce team a Certificate of Waiver or Authorization (COA) to begin using a Draganflyer X4ES small UAS at its Northern Plains Unmanned Aircraft Systems Test Site. The COA is effective for two years. The team plans to begin flight operations during the week of May 5.

While supporting a North Dakota State University/Extension Service precision agriculture project, the Northern Plains Unmanned Aircraft Systems Test Site also will collect safety-related operational data needed for UAS airspace integration. The information will help the FAA analyze current processes for establishing small UAS airworthiness and system maturity. Maintenance data collected during site operations will support a prototype database for UAS maintenance and repair.


From the FAA, the other test sites include:

University of Alaska. The University of Alaska proposal contained a diverse set of test site range locations in seven climatic zones as well as geographic diversity with test site range locations in Hawaii and Oregon. The research plan includes the development of a set of standards for unmanned aircraft categories, state monitoring and navigation. Alaska also plans to work on safety standards for UAS operations.

State of Nevada. Nevada’s project objectives concentrate on UAS standards and operations as well as operator standards and certification requirements. The applicant’s research will also include a concentrated look at how air traffic control procedures will evolve with the introduction of UAS into the civil environment and how these aircraft will be integrated with NextGen. Nevada’s selection contributes to geographic and climatic diversity.

New York’s Griffiss International Airport. Griffiss International plans to work on developing test and evaluation as well as verification and validation processes under FAA safety oversight. The applicant also plans to focus its research on sense and avoid capabilities for UAS and its sites will aid in researching the complexities of integrating UAS into the congested, northeast airspace.

Texas A&M University – Corpus Christi. Texas A&M plans to develop system safety requirements for UAS vehicles and operations with a goal of protocols and procedures for airworthiness testing. The selection of Texas A&M contributes to geographic and climatic diversity.

Virginia Polytechnic Institute and State University (Virginia Tech). Virginia Tech plans to conduct UAS failure mode testing and identify and evaluate operational and technical risk areas. This proposal includes test site range locations in both Virginia and New Jersey.


Photonics Applied: Terrestrial Imaging: Spectral imaging satellites monitor global vegetation health


By Gail Overton 

Senior Editor

Aircraft- and satellite-based imaging systems with multispectral capabilities are being deployed worldwide to monitor terrestrial vegetation and soil moisture with unprecedented resolution and accuracy.

The 2013/2014 season brought floods to Europe and plunged the eastern United States into a “polar vortex” winter, all while the western U.S. continued to suffer through a serious drought. Because such severe weather patterns have serious impacts on croplands and forest cover, aircraft- and satellite-based imaging systems are being increasingly deployed to monitor soil and vegetation health.

The National Oceanic and Atmospheric Administration (NOAA; Silver Spring, MD) and Satellite Imaging Corporation (Magnolia, TX) take advantage of satellite data to publish several types of green vegetation and drought indices. They are among numerous institutions recognizing the importance of multispectral measurement data for monitoring and understanding global vegetation health with high-resolution imagery. Although real-time data processing is not yet possible, the deployment of miniaturized satellite designs promises faster data streams and broader data access.

Please note that while light detection and ranging (lidar) technology is also playing an increasingly important role in vegetation monitoring and forest canopy studies, this article limits the equipment discussion to airborne and satellite-based multispectral imagers.

Mapping and monitoring vegetation health

To create its global Vegetation Health Index (VHI) maps, which are updated each week, NOAA relies on an image-processing algorithm to convert satellite imaging data to color-coded vegetation health data (see Fig. 1). The VHI index ranges from 0 to 100, characterizing changes in vegetation conditions from extremely poor (0) to excellent (100); fair conditions are coded by green colors (around 50) that change to browns and reds when conditions deteriorate (below 40 is a vegetation stress or indirect drought indicator) and to blues when they improve.

Reflecting a combination of chlorophyll and moisture content and changes in thermal conditions at the surface, the index uses an algorithm that combines visible light (VIS), near-infrared radiation (NIR), and thermal infrared radiation (TIR) radiance data from the advanced very high resolution radiometer (AVHRR) aboard the NASA-provided NOAA-19 polar-orbiting satellite.

Gathered in six spectral bands having wavelengths from 580–680, 725–1100, and 1580–1640 nm (VIS and NIR) as well as 3.55–3.93, 10.3–11.3, and 11.5–12.5 μm (TIR), the AVHRR VIS and NIR values are first converted to a Normalized Difference Vegetation Index or NDVI = (NIR-VIS)/(NIR+VIS) and the TIR values to brightness temperature (BT) values using a lookup table. Essentially, healthy vegetation that has high chlorophyll content and high biomass density absorbs most of the visible light that strikes its surface and reflects most of the NIR radiation due to a robust cell structure in the leaves and the lack of chlorophyll absorption, while unhealthy or sparse vegetation reflects more visible light and less NIR radiation.1
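The NDVI formula above is simple enough to sketch directly. In this illustration the reflectance values are invented for the example, not taken from AVHRR data:

```python
def ndvi(nir: float, vis: float) -> float:
    """Normalized Difference Vegetation Index: (NIR - VIS) / (NIR + VIS).

    Ranges from -1 to +1; dense, healthy vegetation pushes toward +1
    because it reflects strongly in the NIR and absorbs visible light.
    """
    return (nir - vis) / (nir + vis)

# Hypothetical reflectance values, chosen only for illustration:
healthy = ndvi(nir=0.50, vis=0.08)  # strong NIR reflection, VIS absorbed; ~0.72
sparse = ndvi(nir=0.30, vis=0.22)   # more VIS reflected, less NIR; ~0.15
```

The same contrast drives the color coding in the VHI maps: the healthier the canopy, the larger the gap between NIR and visible reflectance, and the higher the index.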


The NDVI and BT values are filtered in order to eliminate high-frequency noise and adjusted for non-uniformity of the land surface due to climate and ecosystem differences. The VIS and NIR data are pre- and post-launch corrected, and BT data are adjusted for nonlinear behavior of the NIR channel. These NDVI and BT values are then converted to the VHI values through a series of calculations that factor in historical averages (AVHRR data has been collected continuously since 1981) for the same time period.


While the AVHRR instrument gathers imaging data in six wavelength bands at 1 km spatial resolution, NOAA’s Visible Infrared Imager Radiometer Suite (VIIRS)—launched in 2011—gathers imagery at 375 m resolution in 22 bands from 412 nm to 12 μm for more complete spectral coverage with increased radiometric quality (see Fig. 3). As vegetation health maps continue to improve in quality, associated malaria threat maps, drought maps, and even crop-yield maps derived from the vegetation data likewise benefit from increased spatial resolution, accuracy, and more frequent revisits over time.

AgroWatch, other algorithms

For the individual farmer or for other forestry, mining, or real-estate development companies, more “personalized” vegetation maps require spatial resolutions on the order of 15 m or smaller pixels. To fill this need, DigitalGlobe’s (Longmont, CO) AgroWatch product is a color-coded Green Vegetation Index (GVI) map that has indicator values from 0 (for no vegetation) to 100 (for the densest vegetation). These GVI values are calibrated using specific crop information and based on custom spectral algorithms that are less affected by variations caused by underlying soils or water.

The GVI allows users to accurately correlate their crop cover with industry standard vegetation measurements including the Green Leaf Area Index, plant height, biomass density, or percent canopy cover. In essence, GVI maps play the role of historic and still-used NDVI maps, but without the “soil noise” influence that plagues NDVI values.

“Since the original NDVI formula was devised back in the 1970s for Landsat 1 data, there are now dozens of NDVI-like formulas with myriads of spectral adjustment algorithms,” says Jack Paris, president of Paris Geospatial, LLC (Clovis, CA). With 47 years of experience in remote-sensing with NASA (JSC and JPL), Lockheed, DigitalGlobe, and several universities, Paris also developed numerous improved NDVI algorithms for companies like C3 Consulting LLC (Madison, WI), which is now a division of Trimble (Sunnyvale, CA), a provider of commercial solutions that combine global expertise in GPS, laser, optical, and inertial technologies with application software and wireless communications.

“My information-extraction algorithms for C3 called PurePixel can produce vegetation and soil maps that take into account several crop characteristics,” says Paris. “C3 also collects dozens of soil characteristics in the field that often correlate well with vegetation and soil conditions that come from aircraft-based or satellite-based images.” Paris adds, “In the 1980s, Dr. Alfredo Huete conducted experiments with rows of potted plants over a variety of light- and dark-colored [dry and moist soils]. He found that these kinds of soil variations influenced the surrounding vegetation’s reflectance values and caused classic NDVI values to be affected [the “soil noise” mentioned earlier]. This is just one example of why it is extremely important to address imaging anomalies when analyzing multispectral imagery so that vegetation vigor and health data can be accurately mapped and monitored.”

One such anomaly that greatly impacts how the farmer interprets localized crop information is the positional error induced in satellite imagery by the fact that image data is skewed by the angle of the sensor and the sheer swath of land mass captured at non-perpendicular angles — not to mention variations due to terrain unevenness. To correct these imaging errors so that a farmer knows which rows of corn might require more fertilizer, for example, companies like Satellite Imaging Corporation offer orthorectification services to produce vegetation maps that overlay accurately onto ground-based terrain maps.

Orthorectification is a process where the natural variations in the terrain and the angle of the image are taken into consideration and, by using geometric processes, are compensated for to create an image that has an accurate scale throughout. Satellite Imaging Corporation says that if satellite sensors acquire image data over an area with a kilometer of vertical relief and the sensor has an elevation angle of 60° (30° from the perpendicular from the satellite to ground), the image will have nearly 600 m of terrain displacement. To accurately remove image distortions, a digital elevation model (DEM) is used to perform orthorectification via feature extraction by both high-resolution stereo satellites like GeoEye-1, the Worldview and Pleiades series, IKONOS, or ASTER, as well as through stereo aerial photography images.
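Satellite Imaging Corporation’s displacement figure follows from basic trigonometry: horizontal offset is roughly the vertical relief times the tangent of the sensor’s off-nadir angle. A minimal sketch, in which the function name and the flat-terrain simplification are my own:

```python
import math

def terrain_displacement(relief_m: float, elevation_angle_deg: float) -> float:
    """Horizontal image displacement (m) caused by terrain relief when a
    sensor views the scene at the given elevation angle above the horizon.

    A 90-degree elevation angle (nadir view) gives zero displacement.
    """
    off_nadir_deg = 90.0 - elevation_angle_deg
    return relief_m * math.tan(math.radians(off_nadir_deg))

# 1 km of relief viewed at a 60-degree elevation angle (30 degrees off-nadir):
print(terrain_displacement(1000, 60))  # ~577 m, the "nearly 600 m" cited above
```

This is why a DEM is essential: without per-pixel elevation, there is no way to know how much of this offset to remove at each point in the image.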

Drought and ice maps

Among the information found at the U.S. Drought Portal is a drought map produced each week by a rotating group of authors from the U.S. Department of Agriculture, NOAA, and the National Drought Mitigation Center. The National Aeronautics and Space Administration (NASA) also produces drought maps using NDVI data from the Moderate Resolution Imaging Spectroradiometer (MODIS) aboard NASA’s Terra and Aqua satellites. “By far, this is the driest year we have seen since the launch of MODIS [in 1999],” said Molly Brown, a vegetation and food security researcher at NASA’s Goddard Space Flight Center. Terra MODIS and Aqua MODIS view the entire Earth every one to two days, acquiring data at a peak 10.6 Mbit/s data rate in 36 spectral bands with spatial resolution that varies from 250 to 1000 m, depending on the spectral band. In addition to the VIS and NIR bands that provide inputs to vegetation health, these and other spectral bands can monitor ice coverage as well.

“Multispectral information can be used to study ice and melting processes; the underlying physics is that snow and ice respond differently at different wavelengths,” says Marco Tedesco, director of the Cryospheric Processes Laboratory at the City College of New York (New York, NY).2 “For example, multispectral data can separate snow from ice by combining both visible and infrared data. Dirty snow can look like ice if we use only the visible, but using other wavelengths increases our confidence in the results; this is important because ice melts faster than snow and liquid water can flow faster over ice than over snow, with important implications in glacial melting studies.” Tedesco continues, “We can also use multispectral data to separate between ‘new’ snow, which is spectrally very bright, and ‘old’ snow, which has undergone several melting/refreezing cycles and absorbs more solar radiation, further increasing warming and melting.”
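The visible-plus-infrared band combination Tedesco describes is often expressed as the Normalized Difference Snow Index (NDSI), which exploits snow's high visible-green reflectance and strong shortwave-infrared absorption. A minimal sketch with illustrative reflectance values (Tedesco's own processing chain is not detailed in the article):

```python
def ndsi(green, swir):
    """Normalized Difference Snow Index: snow reflects strongly in the
    visible green band but absorbs strongly in the shortwave infrared."""
    return (green - swir) / (green + swir)

# Illustrative (hypothetical) band reflectances:
fresh_snow = ndsi(0.90, 0.10)   # spectrally bright 'new' snow -> high NDSI
bare_ground = ndsi(0.20, 0.25)  # snow-free pixel -> low or negative NDSI

# A common rule of thumb flags pixels with NDSI > 0.4 as snow-covered.
print(fresh_snow > 0.4, bare_ground > 0.4)  # prints: True False
```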

Next-generation satellite imaging

Commercial satellites are primarily manufactured by six major firms including Boeing, Lockheed Martin, Thales Alenia Space, and Astrium Satellites. Multispectral data is gathered by sophisticated instrumentation riding on multibillion-dollar, meter-resolution-capable commercial satellites weighing thousands of kilograms. But like everything else in the photonics industry, miniaturization is rapidly changing the way that future satellite imagery will be obtained.

In December 2013, the first images and video were released from the 100 kg (minifridge-sized) Skybox Imaging (Mountain View, CA) satellite. By that time, Planet Labs (San Francisco, CA) had already launched four satellites, and by February 2014 the first of 28 phone-book-sized (10 × 10 × 34 cm) ‘Dove’ satellites comprising Planet Labs’ Flock 1 had been deployed from the International Space Station, the largest constellation of Earth-imaging satellites ever launched. Flying at altitudes of roughly 500 km, versus 1000 km for traditional satellites, these microsatellites will allow startups Skybox and Planet Labs to supply near-real-time, comparable- or higher-resolution imaging data to a broader audience at reportedly lower prices than legacy satellite image providers.3


Customers can access the Skybox satellite video stream as quickly as 20 minutes after imagery is obtained by purchasing a SkyNode terminal, which consists of a 2.4 m satellite communications antenna and two racks of electronics. While much of the terabyte-per-day data is processed directly onboard the satellite (whose circuitry consumes less energy than a 100 W light bulb), open-source software like Hadoop from the Apache Software Foundation lets customers run Skybox’s data-processing algorithms, or integrate their own custom algorithms, in the SkyNode terminal.

Skybox’s SkySat-1 collects imagery using five channels: blue, green, red, NIR, and panchromatic (all resampled to 2 m resolution in compliance with their NOAA license). And just like NOAA, Skybox is producing customized algorithms for mapping and monitoring vegetation health: the Modified Soil Adjusted Vegetation Index (MSAVI) from Skybox takes the NDVI metric one step further by correcting for the amount of exposed soil in each pixel in agricultural areas where vegetation is surrounded by exposed soil.
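Skybox does not publish the exact form of its MSAVI, but the widely cited closed-form version, MSAVI2 (Qi et al., 1994), shows how the soil adjustment is folded directly into the index rather than supplied as a tuning constant:

```python
import math

def msavi2(nir, red):
    """Modified Soil-Adjusted Vegetation Index, closed form (Qi et al., 1994).
    Unlike SAVI, the soil-adjustment factor is derived from the
    reflectances themselves, so no per-scene tuning is needed."""
    return (2 * nir + 1 - math.sqrt((2 * nir + 1) ** 2 - 8 * (nir - red))) / 2

# Hypothetical reflectances for a row-crop pixel with exposed soil between rows:
print(round(msavi2(0.45, 0.12), 3))  # 0.458
```

Because the adjustment comes from the reflectances themselves, MSAVI2 requires no per-scene soil parameter, a property well suited to automated per-pixel processing pipelines like the one the article describes.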

“Small satellites from Skybox Imaging and Planet Labs will revolutionize global vegetation health mapping and monitoring by enabling not just big corporations, but even the family farmer to access sub-meter-resolution imagery as quickly as crops grow,” says Jack Paris. In addition to micro-satellite data, Paris is still waiting for public access to drone-based imagery in the U.S., which can be collected at centimeter-level spatial resolutions. “Drone data, along with data from satellites and manned aircraft, will really open some doors and allow for better management of farmland with more efficient use of precious resources such as water, pesticides, and fertilizers with increased yields—a win-win situation for everyone!”


1. See

2. See

3. See

Commentary: Defense Civilian Layoffs Won’t be Pleasant, But They Are Necessary

By Rep. Ken Calvert, April 10, 2014


Against the backdrop of an increasingly unstable world, including the Russian invasion of Crimea, the ongoing conflict in Syria, an agitated Iran, aggression from the North Koreans and a militarized China, Defense Secretary Chuck Hagel recently announced that he would seek further cuts to our uniformed personnel. President Obama’s fiscal year 2015 budget would reduce the U.S. Army end strength to pre-World War II levels and would come on top of a reduced Air Force, Navy and Marine Corps. Our uniformed personnel continue to absorb cuts while the secretary of Defense ignores a significant portion of his budget that has continued to grow without restraint – the Defense Department’s civilian workforce.


From 2001 to 2012, the active duty military grew by 3.4 percent while the number of civilian defense employees grew by an astounding 17 percent. Since 2009, the size of the Office of the Secretary of Defense civilian workforce has grown to more than 2,000 people, an increase of nearly 18 percent. The Joint Staff grew from 1,286 people in 2010 to 4,244 people in 2012, a 230 percent increase. Currently the United States has 1.3 million active duty military personnel as compared to 770,000 civilian personnel, a ratio that is out of balance. In 2003, during the conflicts in Iraq and Afghanistan, for every 2.25 active duty personnel there was one civilian worker in support. Incredibly, today, the civilian concentration is even higher — for every 1.79 active duty personnel there is now one civilian worker in support.

The growth of the civilian workforce within the Defense Department continues to create a significant budgetary burden but, more importantly, if we fail to act, it will threaten our men and women in uniform. That is exactly why current and retired military leaders have widely acknowledged the need to establish a more efficient defense workforce in order to preserve our national security posture in the future. As Ret. Maj. Gen. Arnold Punaro, a member of the Defense Business Board, pointed out in a speech on Capitol Hill last fall, “We are increasing the overhead and decreasing the warfight.”

Numerous presidential administrations have tried, and failed, to rein in the Defense civilian workforce. It is clear that the department needs not only the authority to do so, but a mandate to reduce the size of its civilian workforce. That is why I introduced the Rebalance for an Effective Defense Uniform and Civilian Employees (REDUCE) Act. The bill (H.R. 4257) would require the Defense Department to reduce its civilian workforce by 15 percent over the next five years, a percentage recommended by the Defense Business Board, the advisory panel charged with improving the efficiency of the Defense workforce. The department would then be required to remain at or below that reduced level for fiscal years 2021 through 2025.

To ease the inherent difficulty of making staff reductions, the bill would provide the secretary of Defense the authority to use voluntary separation incentive payments and voluntary early retirement payments to achieve the required reductions in personnel. Importantly, it gives the secretary of Defense the authority to give more weight to the performance (versus tenure or other factors) of an employee in reduction in force decisions.

It is commonly known that it is almost impossible to fire a civilian worker for subpar performance. In fact, it is easier to reduce the responsibilities of an ineffective civilian worker and bring in a uniformed member of the military to do the job. U.S. taxpayers are essentially paying two people for one job, along with all the benefits that a federal job conveys. This is an unacceptable and unsustainable model.

The REDUCE Act does not call for indiscriminate cuts of civilian workers — it would require the secretary of Defense to review employees based on their performance and retain the best and brightest of our civilian workforce. This is a fair and appropriate step to bring the number of civilian employees in balance with our uniformed force. As a former small business owner, I understand what it means to sign the back of a paycheck, and while it is never easy to lay off a worker, we cannot continue to pay individuals for poor performance to the detriment of our active duty military members, other dedicated and effective civilian workers and the U.S. taxpayers. After 10 years of war, there are simply many civilian jobs that no longer need doing.

As the House and Senate Armed Services and Defense Appropriations committees continue to investigate this issue, I suspect we will find that the increase of personnel has occurred to a greater degree with paper pushers at the Pentagon, as opposed to wrench-turners at our depots. If that is the case, then the weight of the reduction should be proportionate to where the civilian staff increases occurred.

According to former Navy Secretary John Lehman, each defense civilian reduction of 7,000 personnel saves at least $5 billion over five years. Using his numbers and calculating a 15 percent reduction from the current level of 770,000 civilian employees, H.R. 4257 would save $82.5 billion over the first five years. Even by Washington standards, $82.5 billion is a significant amount that could be redirected within the department to fund military priorities such as modernizing weapons systems, improving readiness, resetting the force and, most importantly, providing for our fighting men and women in uniform.

At a time when our government must address mounting debt, every facet of our federal budget must be scrutinized. For too long, the Defense civilian workforce has been unrestrained in both growth and effectiveness. If left unchecked, the cost of civilian pay will account for two-thirds of the estimated escalation in the operations and maintenance portion of our defense budget in the coming decade. At a time when our military presence, and projection of power, is sorely needed in the world, we cannot risk further cuts to our uniformed personnel while the Defense civilian workforce remains untouched.



Rasmussen Reports

What They Told Us: Reviewing Last Week’s Key Polls



Saturday, April 26, 2014

Who’s in charge here?

Voters continue to believe Democrats have more of a plan for the future than Republicans do, but 53% think it is fair to say that neither party in Congress is the party of the American people.

The number of voters who rate President Obama’s leadership as poor (45%) is at its highest level ever.
His daily job approval rating remains in the negative mid- to high teens where it’s been for most of his presidency.

Only 35% now believe the United States and its allies are winning the War on Terror, and just 39% think the country is safer than it was before the terrorist attacks on September 11, 2001. That’s the lowest level of confidence on both questions in three years.

Meanwhile, even as the president tightens the screws on Russia over the political crisis in Ukraine, just 26% of voters view Ukraine as a vital national security interest for the United States these days.

On the home front, 61% now favor building the Keystone XL pipeline from Canada to Texas, the highest level of support yet. But the Obama administration is delaying a decision on building the pipeline until after the Nebraska Supreme Court rules on a legal challenge by environmentalists.

A sizable number of Americans think the environment is getting worse, and 47% are willing to pay more in taxes and utility costs to clean it up. But about half (24%) of that group would be willing to pay only $100 more per year. Forty-six percent (46%) aren’t willing to pay a dime more.

Looking back, voters by a 49% to 30% margin continue to believe the government bailouts of the financial industry were bad for America.

Despite the bailouts, only 50% of Americans are even somewhat confident in the stability of the nation’s banks, with just 10% who are Very Confident. In July 2008, prior to the Wall Street meltdown, 68% were confident in the banking system.

Consumer and investor confidence remain little changed from where they have been in recent months.

Thirty-four percent (34%) of Americans say now is a good time for someone in their area to sell a house.

Thirty-eight percent (38%) expect their home’s value to go up over the next year. That’s the highest level of optimism since October.

Twenty-four percent (24%) of all Americans think that if someone cannot afford to make their mortgage payments, the government should assist them in making those payments. Sixty-one percent (61%), however, say they should sell their house and buy a cheaper one.

Most Americans say they are paying the same amount in interest as they were a year ago, but 50% expect to be paying more in a year’s time.

Forty-five percent (45%) of voters now view the new national health care law at least somewhat favorably, the highest level of support since October, but 51% continue to hold an unfavorable opinion of it.

Republicans have edged ahead of Democrats by one point on the Generic Congressional Ballot.

The U.S. Supreme Court heard arguments this week in a case aimed at overturning an Ohio law that makes it a crime to make false statements in a political campaign. But 55% of voters believe the government should be allowed to review political ads and candidates’ campaign comments for their accuracy and punish those that it decides are making false statements about other candidates.

The Supreme Court this week also upheld a Michigan law that prohibits the use of race as a determining factor in college admissions. Just 25% of Americans favor applying affirmative action policies to college admissions.

Thirty-six percent (36%) of voters think that if more money is spent on the public schools, student performance would improve.
Twenty-four percent (24%) consider the amount that is currently spent to be too much, while 28% say it’s about right.

In other surveys this week:

— Twenty-nine percent (29%) of Likely U.S. Voters think the country is heading in the right direction.

— Only 21% of Likely Florida Voters think Senator Marco Rubio should run for the presidency in 2016. However, 32% say they would vote for their senator if he is the Republican presidential candidate.

— Republican-turned-Democrat Charlie Crist, who Rubio defeated in the 2010 Senate race, leads incumbent GOP Governor Rick Scott 45% to 39% in Rasmussen Reports’ first look at Florida’s 2014 gubernatorial race.

— Republican Governor Sam Brownback leads Democratic challenger Paul Davis by seven points in his bid for reelection in Kansas.

— Fifty-four percent (54%) of Americans planned to attend a church service to celebrate Easter.

— Eighteen percent (18%) of Americans planned to celebrate Earth Day this past Monday.

