
Microsoft ends retail sales of Windows 7 and 8

3 November 2014

Microsoft has officially stopped selling retail copies of some versions of Windows 7 and 8.

The date to stop selling the software was set some time ago and should help Microsoft move people on to more recent versions of its operating system.

Separately, statistics suggest people are finally moving away from some very old versions of Windows.

The next version of Windows, called Windows 10, is due to be released in late 2015.

From 31 October, consumers could no longer buy copies of the Home Basic, Home Premium and Ultimate versions of Windows 7. Now, Windows 8 is also no longer available. The change affects both copies bought in shops and copies pre-loaded on PCs and laptops.

The current version of Windows, 8.1, will be the default version offered on PCs.

The change will take time to feed through into the market, as many PC makers have large stocks of older versions of Windows and will continue to sell PCs running the software.

Those keen to get a computer running Windows 7 will be able to “downgrade” from 8.1 to Windows 7 Professional but relatively few PC firms offer this option.

Gordon Kelly, writing in Forbes, said the policy revealed “Microsoft’s determination to distance itself from the original form of Windows 8” despite it being released just over two years ago.

The original version of Windows 8 did not prove popular because it did away with some familiar elements of the desktop version of the operating system.

By contrast, he said, Windows 7 has been available since late 2009 and is still very popular among users. About 53% of Windows users are on the various editions of Version 7 of Windows, he said. The more recently released Windows 8 has only grabbed a 6% market share and has already been surpassed by 8.1, said Mr Kelly.

The change will also clear the path for the arrival of Windows 10, he added.

Separately, market analysis reveals that the number of people using the venerable Windows XP operating system has suddenly seen a sharp decline. Data from Netmarketshare suggests that in October this year its share dropped from almost 24% to just over 17%. It is not yet clear what was behind the fall.


Big Oil Feels the Need to Get Smaller

Exxon, Shell, Chevron Pare Back as Rising Production Costs Squeeze Earnings


Daniel Gilbert in Houston and Justin Scheck in London

Nov. 2, 2014 7:08 p.m. ET


As crude prices tumble, big oil companies are confronting what once would have been heresy: They need to shrink.

Even before U.S. oil prices began their summer drop toward $80 a barrel, the three biggest Western oil companies had lower profit margins than a decade ago, when they sold oil and gas for half the price, according to a Wall Street Journal analysis.

Despite collectively earning $18.9 billion in the third quarter, the three companies, Exxon Mobil Corp., Royal Dutch Shell PLC and Chevron Corp., are now shelving expansion plans and shedding operations with particularly tight profit margins.

The reason for the shift lies in the rising cost of extracting oil and gas. Exxon, Chevron, Shell, as well as BP PLC, each make less money tapping fuels than they did 10 years ago. Combined, the four companies averaged a 26% profit margin on their oil and gas sales in the past 12 months, compared with 35% a decade ago, according to the analysis.

Shell last week reported that its oil-and-gas production was lower than it was a decade ago and warned it is likely to keep falling for the next two years. Exxon’s output sank to a five-year low after the company disposed of less-profitable barrels in the Middle East. U.S.-based Chevron, for which production has been flat for the past year, is delaying major investments because of cost concerns.

BP has pared back the most sharply, selling $40 billion in assets since 2010, largely to pay for legal and cleanup costs stemming from the Deepwater Horizon oil spill in the Gulf of Mexico that year.

To be sure, the companies, at least eventually, aim to pump more oil and gas. Exxon and Chevron last week reaffirmed plans to boost output by 2017.

“If we went back a decade ago, the thought of curtailing spending because crude was $80 a barrel would blow people’s minds,” said Dan Pickering, co-president of investment bank Tudor, Pickering, Holt & Co. “The inherent profitability of the business has come down.”

It isn’t only major oil companies that are pulling back. Oil companies world-wide have canceled or delayed more than $200 billion in projects since the start of last year, according to an estimate by research firm Sanford C. Bernstein.

In the past, the priority for big oil companies was to find and develop new oil and gas fields as fast as possible, partly to replace exhausted reserves and partly to show investors that the companies still could grow.

But the companies’ sheer size has meant that only huge, complex—and expensive—projects are big enough to make a difference to the companies’ reserves and revenues.

As a result, Exxon, Shell and Chevron have chased large energy deposits from the oil sands of Western Canada to the frigid Central Asian steppes. They also are drilling to greater depths in the Gulf of Mexico and building plants to liquefy natural gas on a remote Australian island. The three companies shelled out a combined $500 billion between 2009 and last year. They also spend three times more per barrel than smaller rivals that focus on U.S. shale, which is easier to extract.

The production from some of the largest endeavors has yet to materialize. While investment on projects to tap oil and gas rose by 80% from 2007 to 2013 for the six biggest oil companies, according to JBC Energy Markets, their collective oil and gas output fell 6.5%.

Several major ventures are scheduled to begin operations within a year, however, which some analysts have said could improve cash flow and earnings.

For decades, the oil industry relied on what Shell Chief Financial Officer Simon Henry calls its “colonial past” to gain access to low-cost, high-volume oil reserves in places such as the Middle East. In the 1970s, though, governments began driving harder bargains with companies.

Oil companies still kept trying to produce more oil, however. In the late 1990s, “it would have been unacceptable to say the production will go down,” Mr. Henry said.

Oil companies were trying to appease investors by promising to boost production and cut investment.

“We promised everything,” Mr. Henry said. Now, “those chickens did come home to roost.”

Shell has “about a third of our balance sheet in these assets making a return of 0%,” Shell Chief Executive Ben van Beurden said in a recent interview. Shell projects should have a profit margin of at least 10%, he said. “If that means a significantly smaller business, then I’m prepared to do that.”

Shell late last year canceled a $20 billion project to convert natural gas to diesel in Louisiana and this year halted a Saudi gas project where the company had spent millions of dollars.

The Anglo-Dutch company also has dialed back on shale drilling in the U.S. and Canada and abandoned its production targets.

U.S.-based Exxon earlier this year allowed a license to expire in Abu Dhabi, where the company had pumped oil for 75 years, and sold a stake in an oil field in southern Iraq because they didn’t offer sufficiently high returns.

Exxon is investing “not for the sake of growing volume but for the sake of capturing value,” Jeff Woodbury, the head of investor relations, said Friday.

Even Chevron, which said it planned to increase output by 2017, has lowered its projections. The company has postponed plans to develop a large gas field in the U.K. to help bring down costs. The company also recently delayed an offshore drilling project in Indonesia.

The re-evaluation has also come because the companies have been spending more than the cash they bring in. In nine of the past 10 quarters, Exxon, for example, has spent more on dividends, share buybacks and capital and exploration costs than it has generated from operations and by selling assets.

Though refining operations have cushioned the blow of lower oil prices, the companies indicated that they might take on more debt if crude gets even cheaper. U.S. crude closed Friday at $80.54 a barrel.

Chevron finance chief Patricia Yarrington said the company planned to move forward with its marquee projects and is willing to draw on its $14.2 billion in cash to pay dividends and repurchase shares.

“We are not bothered in a temporary sense,” she said. “We obviously can’t do that for a long period of time.”


The Pentagon May Finally Have a Plan to Keep America on Top

Bill Sweetman

Nov 3, 2014


After years of strategic drift, the U.S. military may finally have a path to maintain its edge over countries like China. Will the defense-industrial bureaucrats stand in the way?

Nowhere other than inside the Pentagon will you find more truth in Machiavelli’s warning about the hazards of change: “There is nothing more difficult to plan, more doubtful of success, nor more dangerous to manage… For the initiator has the enmity of all who would profit by the preservation of the old institution and merely lukewarm defenders in those who gain by the new.”

Which was why my response to Deputy Defense Secretary Bob Work’s arrival involved a reference to Raymond Chandler’s Big Willie Magoon, a vice cop who “thinks he’s tough.” The arrival of someone with genuine strategic and technical chops at the upper level of the Defense Department was such a good idea that a lot of people were guaranteed to respond with equal parts rage and terror.

Work’s co-thinkers have now run the pirate flag up the mast with the publication of a concise and hard-hitting report by the Center for Strategic and Budgetary Assessments that details what Work has called a Third Offset strategy for towing the Pentagon out of the strategic quicksand into which it is steadily sinking today.

My compressed version of the CSBA report is here, along with an explanation of the innocuously wonkish “Third Offset” name by which the new strategy is known. But to be even briefer, this is the gist of the strategy.

Widely available weapons—this is not all about China—are threatening the U.S. ability to project power and influence events worldwide. Those weapons include guided missiles, satellites, and drones that can track ships in mid-ocean, and long-range surface-to-air missiles.

Rather than wading into a symmetrical fight against those weapons, the Third Offset strategy exploits U.S. and allied core competencies—not just the things we do well, but areas where we can maintain our lead for a long time, and without adding to the defense bill. Think advanced unmanned vehicles, all-aspect, broadband stealth, and undersea warfare.

Third Offset calls for some new weapons, none of them miraculous, some of them a little more specialized than those that have been planned in the last decade or two.

As a strategy, it has the enormous merits of focus and consistency, which is why there are people and groups who are going to hate it and try to stop it from happening.


First among these will be the boot-centric warfare (BCW) crowd, whose admiration for the military theorist Carl von Clausewitz has blinded them to the fact that our world is not Clausewitz’s, where armies ruled and the war was won when the enemy’s capital was occupied. They will not be mollified by another new CSBA report that proposes an expanded Army role in providing offensive and defensive regional missile support. They will portray Third Offset as the intellectual stepchild of one of those nutty airpower cheerleaders, and not the kind of warfare performed by Real Warriors. 

This is not completely inaccurate. Third Offset reflects the views of people inside and outside the Pentagon who see large-scale BCW, particularly in a counter-insurgent role against cultures that revere martyrdom, as akin to wrestling a pig: You both get covered in slime but the pig enjoys it.

Next will be the peace-hawks. No, Third Offset does not advocate war with China. It seeks to prevent war with China, or any other nation that wants to exploit anti-access and area denial to further its own interests at the expense of the global community. In the classic phrase of deterrence, we want all such actors to wake up each morning and think: Not today.

The fighter generals and the advocates for the biggest program in Pentagon history, the F-35 Joint Strike Fighter, will not be much happier. Lord knows I am not a Joint Strike Fighter fan, but I have yet to call it “semi-stealthy” as the CSBA report does. The report also suggests that the Navy’s F-35C might be usefully canceled. But the critique is deeper: In some scenarios, it matters little if the adversary’s fighters can’t defeat F-35s directly. Shoot down or drive off the tankers and the fighters never make it back.

Some naval aviators will be at best skeptical of the report’s embrace of carrier-based unmanned combat air systems. They should not be surprised: Work himself co-authored an early and influential study of Navy advanced drones at CSBA, identifying range as a critical factor in an anti-access/area-denial environment.

The Navy’s surface-combatant community and the U.S. shipbuilding enterprise will be clearing the decks for action. Third Offset strongly favors the submarine and implies that, as missile threats become more intense, the weapon tubes on surface warships will fill up with defensive interceptor missiles, leaving only a handful of weapons to fire at the enemy.

The CSBA report says little about the Marine Corps and never mentions the F-35B—the Corps’ version of the Joint Strike Fighter. However, it does mention all the short-range anti-access weapons, like guided rockets and mortars, weapons that Work (a retired Marine himself) talked about in his CSBA and Navy years as representing a very difficult challenge for amphibious warfare in general and the F-35B in particular.

Third Offset is not policy. Yet. But it’s an important and coherent starting point for a discussion that is long overdue.



Study: Long-term shift work lessens brain power

By Laura Smith-Spark, CNN

updated 11:38 AM EST, Tue November 4, 2014

(CNN) — It’s not the news that any shift worker wants to hear. Not only is working irregular hours bad for your social life and likely your health, but it has a chronic effect on your ability to think, a new study has found.

The study, published in the journal Occupational and Environmental Medicine, looked at the long-term impact on people’s cognitive abilities of working at odd hours or with frequently changing shifts.

Researchers in France and the United Kingdom followed employed and retired workers in southern France — some of whom had never worked shifts, while others had worked them for years — over the course of a decade.

They found that shift work was associated with impaired cognition, and the impairment was worse in those who had done it for longer.

The impact was particularly marked in those who had worked abnormal hours for more than 10 years — with a loss in intellectual abilities equivalent to the brain having aged 6.5 years.

The only encouraging finding for shift workers is that the decline can be reversed by a switch to regular hours. The bad news? The findings suggest that recovery takes at least five years, except for processing speeds.

Rotating hours

The authors say their research is the first published study into the reversibility of the chronic impact of shift work on the brain after the shift work finishes.

For the study, the participants were asked to carry out cognitive tests intended to assess long- and short-term memory, processing speeds and overall cognitive ability on three occasions, in 1996, 2001 and 2006.

Just under half of the sample, 1,484 people, had worked shifts for at least 50 days of the year.

Participants were aged exactly 32, 42, 52 and 62 at the time of the first set of tests. In all, just under 2,000 people were assessed at all three time points.

Around a fifth of those in work and a similar proportion of those who had retired had worked a shift pattern that rotated between mornings, afternoons, and nights.

Circadian rhythms

The researchers, from the University of Swansea and the University of Toulouse, say this is an observational study so no definitive conclusions can be drawn about cause and effect.

However, they suggest that disruption to the body clock could “generate physiological stressors, which may in turn affect the functioning of the brain.”

Humans are wired to sleep at night by their circadian rhythm, a 24-hour cycle that brings about physical, mental and behavioral changes in the body. The circadian rhythm affects sleep cycles, hormone releases, body temperature and various processes. Besides the intellectual impact, disrupting it has been associated with health problems including ulcers, heart disease and breast cancer.

Other research has also linked vitamin D deficiency caused by reduced exposure to daylight to poorer thinking skills, the researchers say.

“The cognitive impairment observed in the present study may have important safety consequences not only for the individuals concerned, but also for society as a whole, given the increasing number of jobs in high hazard situations that are performed at night,” the researchers warn.

At the very least, the findings suggest that monitoring the health of people who have worked shift patterns for 10 years would be worthwhile, they say.

It’s a message that merits attention.

It has been remarked that accidents tend to happen late in the night or in the early morning — as with the Three Mile Island disaster in 1979 and the Chernobyl disaster in 1986.


First drone takes off from Griffiss

by Press • 4 November 2014

On Oct. 28, the experimental test flight of a Logos Technologies unmanned aerial vehicle took place at the Griffiss UAS Test Site, operated by the Northeast UAS Airspace Integration Research Alliance (NUAIR).

The vehicle flown was powered by a parafoil system, which combines a fabric wing (essentially a parachute) with an aircraft fuselage. Beyond the parafoil system’s ability to carry significant loads for its weight class, it provides increased safety over traditional UAVs, according to the release.

The vehicle is designed to further develop Logos’ flight control approach for inexpensively delivering supplies to dispersed units.

“We are excited about being the first to fly a UAS at the Griffiss FAA test site,” said Dr. Wade Pulliam, director of Advanced Concepts at Logos Technologies. “(The) flight tests were a great success and allowed us to collect valuable data that will be critical to our development of parafoil systems for military and commercial applications.”

The ability to test unmanned aerial systems at the site could accelerate their development for many future applications, including precision agriculture, pipeline and power line monitoring, environmental monitoring, and disaster/humanitarian relief support.

“Through these first successful operational flights, the NUAIR Alliance and its partners have met a significant mandate for this test site,” said Larry Brinker, executive director of the NUAIR Alliance. “This is an exciting step into the future of aviation and the deployment of unmanned aircraft systems.”


The UPS Store rolls out 3D printing in Dayton

Nov 4, 2014, 1:12pm EST


Joe Cogliano

Senior Reporter- Dayton Business Journal

UPS has launched 3D printing services at nearly 100 U.S. locations, including one in the Dayton area.

As of this week, the UPS Store on Far Hills Avenue in Oakwood has a 3D printer for retail use. It is the only location the company lists so far in Ohio.

Also known as additive manufacturing, 3D printing is a process of adding layers of thin materials to create a product or component.

UPS officials say the new rollout follows a successful pilot program in six markets.

3D printing is supposed to be the next big thing — for manufacturers as well as consumers — and this latest news should keep the Dayton region on the leading edge of adopting the technology.

In the last few weeks, Proto BuildBar opened its doors in downtown Dayton.

Proto BuildBar features a wall of 3D printers, as well as “maker” tables, in a café/bar setting.

In August, 3D printing firm Tangible Solutions LLC moved into space on Germany Lane in Beavercreek. The company, which also performs modeling and design services, works with everyone from retail customers who come in with small orders to manufacturers who need anything from a prototype to large-run part orders.

Last fall, Ben Staub opened one of the area’s first retail 3D printing operations, GetPrinting3D, on North Dixie Drive in Dayton.


US Air Force Wants Better Global Hawk Protection Against Cyber Attacks

November 5, 2014

U.S. Air Force unmanned aerial vehicle (UAV) experts are asking Northrop Grumman Corp. to upgrade the Global Hawk UAV’s ability to resist computer hackers’ attempts to commandeer the aircraft while in flight. Officials of the Air Force Life Cycle Management Center at Wright-Patterson Air Force Base, Ohio, are awarding a $33.9 million contract modification to the Northrop Grumman Aerospace Systems segment in San Diego to retrofit satellite communications links in the Global Hawk UAV fleet. The contract modification calls for Northrop Grumman to upgrade communications security in International Maritime Satellite links to enable Global Hawk UAVs to improve their defenses against cyber attack.

Global Hawk, which can fly for more than 24 hours between refuelings, often relies on maritime satellite communications (SATCOM) links to enable ground controllers to maintain contact with the unmanned aircraft. The Air Force operates more than 40 Global Hawk unmanned aircraft. Air Force officials want to stiffen the Global Hawk’s maritime SATCOM links to prevent computer hackers from breaking through existing data security to take unauthorized control of the long-endurance, high-altitude Global Hawk reconnaissance UAVs. Recent studies have uncovered some flaws in satellite communications links that could enable cyber warfare attacks to disrupt military operations and ship and aircraft communications.


States ditch electronic voting machines

By Cory Bennett – 11/02/14 09:00 AM EST

States have abandoned electronic voting machines in droves, ensuring that most voters will be casting their ballots by hand on Election Day.

With many electronic voting machines more than a decade old, and states lacking the funding to repair or replace them, officials have opted to return to the pencil-and-paper voting that the new technology was supposed to replace.

Nearly 70 percent of voters will be casting ballots by hand on Tuesday, according to Pamela Smith, president of election watchdog Verified Voting.

“Paper, even though it sounds kind of old school, it actually has properties that serve the elections really well,” Smith said.

It’s an outcome few would have predicted after the 2000 election, when the battle over “hanging chads” in the Florida recount spurred a massive, $3 billion federal investment in electronic voting machines.

States at the time ditched punch cards and levers in favor of touch screens and ballot-scanners, with the perennial battleground state of Ohio spending $115 million alone on upgrades.

Smith said the mid-2000s might go down as the “heyday” of electronic voting.

Since then, states have failed to maintain the machines, partly due to budget shortfalls.

“There is simply no money to replace them,” said Michael Shamos, a computer scientist at Carnegie Mellon University who has examined computerized voting systems in six states.

The lack of spending on the machines is a major problem because the electronic equipment wears out quickly. Smith recalled sitting in a meeting with Missouri election officials in 2012 where they complained 25 percent of their equipment had malfunctioned in preelection testing.

“You’re dealing with voting machines that are more than a decade old,” Smith said.

Roughly half of the states that significantly adopted electronic voting following the cash influx have started to move back toward paper.

The Presidential Commission on Election Administration in January warned that the deterioration of voting machines is an “impending crisis,” but House Republicans say the issue should be left to the states.

Rep. Candice Miller (R-Mich.), who chairs the House committee that oversees federal elections and is a former Michigan secretary of State, said the cash infusion to the states in the mid-2000s was “unprecedented.”

“State and local election officials should not rely on the federal government to replace voting machines that may be nearing the end of its useful life. Therefore, state and local election officials should recognize that they are responsible for upgrading their voting equipment as needed, and hopefully they are budgeting accordingly,” Miller said in a statement to The Hill.

Some voters might welcome the return to paper voting, given that researchers have repeatedly proved the fallibility of individual e-voting machines.

One group from Princeton needed only seven minutes and simple hacking tools to install a computer program on a voting machine that took votes for one candidate and gave them to another.

More whimsically, two researchers showed they could install Pac-Man onto a touch-screen voting machine, leaving no detectable traces of their presence.
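The Princeton demonstration amounts to replacing a machine's vote-recording routine with a compromised one. As a purely illustrative toy sketch, with invented names and no resemblance to real voting-machine software, the manipulation can be as small as this:

```python
class CompromisedTally:
    """Toy vote recorder that silently diverts every Nth vote for one candidate.

    Hypothetical illustration only; real voting systems do not expose such an API.
    """

    def __init__(self, flip_from="A", flip_to="B", flip_every=10):
        self.counts = {}      # reported totals per candidate
        self.seen = 0         # votes processed so far
        self.flip_from = flip_from
        self.flip_to = flip_to
        self.flip_every = flip_every

    def record(self, choice):
        self.seen += 1
        # The silent swap: every Nth vote for the target is reassigned.
        # Without an independent paper record, the totals look perfectly normal.
        if choice == self.flip_from and self.seen % self.flip_every == 0:
            choice = self.flip_to
        self.counts[choice] = self.counts.get(choice, 0) + 1


tally = CompromisedTally()
for _ in range(100):
    tally.record("A")       # 100 genuine votes for candidate A
print(tally.counts)         # {'A': 90, 'B': 10}
```

The point of the sketch is that the reported totals are internally consistent, so nothing in the electronic record betrays the manipulation; only an independent paper trail of the kind Ohio and other states later mandated allows a meaningful recount.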

But concerns of widespread tampering are overblown, Shamos said.

“It’s something you can demonstrate under lab conditions,” he said. To translate it to an election-altering hack, “you would have to commit the perfect crime.”

“There’s never been a proven case of manipulation of an electronic voting machine,” he said.

Voting machines are not connected to any network and not connected to each other, making them difficult to tamper with.

“These machines are not hooked up or networked in any way that would make them vulnerable to external access,” said Matt McClellan, press secretary for the Ohio secretary of State. “We’re confident that process is secure and the integrity is being maintained.”

“There’s no mechanism whereby viruses can pass from one machine to another,” Shamos agreed. Best-case scenario, “maybe I could fool a few people” and get several hundred votes “for my guy.”

Bryan Whitener, director of communications for the U.S. Election Assistance Commission, noted that all electronic voting machines are tested and certified.

Many states, like Colorado, keep their machines under video surveillance with detailed records of when software is being installed.

When Ohio made the $115 million statewide switch to e-voting, it passed a law that all voting methods, including touch screens, must also generate a paper trail.

“It’s not just solely an electronic vote,” McClellan said.

More than 60 percent of states passed similar laws with the electronic switch. Some states moved preemptively; others were reactionary.

An electronic machine in North Carolina lost roughly 4,500 votes in a 2004 statewide race after it simply stopped recording votes. The race was ultimately decided by fewer than 2,000 votes.

“Now what do you do?” Smith asked. “You can’t really do a recount. There’s nothing to count.”

Within a year, the state passed a law requiring a paper back-up.

Paper trails are simply “more resilient,” Smith said.

Shamos said he expects the move back to paper ballots to continue, unless there’s a high-profile crisis similar to the 2000 election.

Still, he predicted the drumbeat for Internet and mobile voting will grow.

“Eventually [a generation is] going to have the thought that it’s idiotic for me not to be able to vote using my cell phone,” Shamos said.

Then all bets are off.


The White House Must Change its ISIS Strategy

Chris Miller

Posted on November 4, 2014

It is becoming clear that the American-led international coalition and its airstrikes in Iraq and Syria alone are not going to defeat ISIS. The U.S. government, its Western allies, and its Middle East partners, thus far, are against deploying ground troops to Iraq. The New Iraqi Army, a Shiite dominated organisation, has shown itself unwilling or unable to defend Sunni-dominated western provinces. None of the parties concerned will commit soldiers to face ISIS, despite their acts of unspeakable violence and depravity. Current U.S. policy is against any military action in Iraq that does not come from the air, including firm support for non-conventional ground forces. The results thus far speak for themselves.

Coming Back to Bite Us

Scholars and security professionals have long been on the fence regarding the wisdom of supporting local non-conventional or conventional forces through military and security assistance programs. Many question if there is any tangible benefit in supporting friendly dictatorships, not to mention the intangible negative effects of supporting such systems. The same goes for supporting revolutionary forces, who at times topple an old adversary only to become a new one. Most recently, the Obama administration’s hesitancy to support opposition forces on the ground in Syria appears to have been heavily influenced by a CIA study finding that U.S. support for insurgent forces, drawing particularly on South American examples, has historically yielded little return.

However, this study, like many others that have covered the subject, does not provide satisfying answers. There are many factors which must be taken into account, such as the stage in the fight at which assistance begins, the speed and concentration of assistance, and the morale, local support, and battle momentum behind the supported forces. Nonetheless, current American policy is that the U.S. will only support military force in Iraq and Syria which it can apply itself directly and from the air, or in the form of advice from military advisers. In fairness, it worked against Qaddafi in Libya.

As Clausewitz wrote, tactics is the use of troops to win battles. Strategy is winning war by winning battles. If America does not want to use its troops, it will not win battles. Therefore, it will not win the war. You cannot win a war you do not actually fight.

One of the main supporting veins of this type of thinking is that U.S. advisory and material assistance to foreign forces, conventional or non-conventional, can and does come back to bite us. The botched Bay of Pigs invasion made Castro paranoid about another attempt, so he asked Khrushchev to place nuclear missiles on Cuba. U.S. and British involvement in toppling Mossadeq in Iran and replacing him with the Shah created the chasm that has existed since the 1979 revolution. America supported Saddam Hussein against Iran and later went to war against him—twice. The U.S. supported Afghan mujaheddin and other fighters, such as Osama bin Laden, in their fight against the Soviets, and they turned into al Qaeda and the Taliban we are still fighting today.

Comparing Apples to Oranges

However, basing decisions today on whether or not to intervene in foreign conflicts on America’s track record of success or failure in Cold War actions is comparing apples to oranges and leaves out what broader concerns drove those decisions to take action.

There are two ways to win a contest: win by competing or, alternatively, make sure the other guy cannot win (winning by default). The first option generally involves a head-on battle. The second involves careful planning, probing, deception, and even “dirty tricks”—espionage, sabotage, and proxy wars. During the Cold War, a head-on contest between the U.S. and USSR would have meant nuclear war, something out of the question for both sides.

The U.S. had to show up to every dirty match of the Cold War. If it did not, its enemy, the Soviet Union, would, and anywhere America did not show up, the enemy would win by default. And vice versa. Causing the Soviet Union to expend vast amounts of economic and political capital it did not have was the long-term strategy of the United States. It worked. Necessarily, this involved a Realist policy calculation: America became involved in places, and with people, that it would have preferred to avoid in better times with better options.

It is correct that America married itself to brutal dictators and repressive regimes and kept them in power. Its track record of winning these battles, and of managing their follow-on effects, is mixed. But it was a strategy under which, while some battles were won and some lost, every further battle moved the U.S. and its allies closer to eventually winning the war with the Soviet Union. We were winning even when we were losing. It worked. But we did do harm to do a greater good.

America’s Strategic Challenges Today

The Cold War is over and has been for a while. What kind of world is America facing today? As JCS Chairman Gen. Martin Dempsey puts it, the situation is “2, 2, 2 and 1”: two heavyweights, Russia and China; two middleweights, Iran and North Korea; two non-states, al Qaeda affiliates and organized crime networks; and one system, the cyber domain. It is no longer a bipolar world. Instead, we have a multipolar world with different foes competing at different levels in different places, which, in sum, add up to a bigger total problem for the U.S. than any single opponent poses.

Since our strategic calculation is no longer based on the single concern of defeating the Soviet Union, the question is actually simpler: Do we want to defeat ISIS? If so, the U.S. can, unlike against the USSR, take the direct route to victory by confronting them openly with military force. America certainly has the capability, and ISIS is, by any measure, deserving of a resounding defeat. However, America’s leaders lack the will. America also has the capability to support local actors on the ground to engage ISIS in Iraq. It lacks the will to do this as well.

Clausewitz differentiated between tactics and strategy thus: “Tactics is the use of troops to win battles; strategy is the use of battles to win the war.” America’s tactics in the fight against ISIS have, thus far, made a poor showing. It does not, in fact, want to use troops at all, only airstrikes. Though it is killing ISIS fighters from the air, destroying their equipment, and degrading them in other ways, they remain in control of western Iraq and eastern Syria. America’s side is not winning the battles.

Perhaps it will, eventually. The calculation is that the U.S. and its allies will “degrade and defeat” ISIS through attrition over time, eliminating their troops and capabilities steadily from the air until they collapse. But if this does not happen before they, say, take Baghdad or achieve other major victories, the timeline may keep extending, with more foreign fighters flocking to join them. Will America’s will to deploy ground troops be any greater then? Many American strategists also believed that with superior tactics, troops, and equipment they would defeat the North Vietnamese communists over time. Perhaps they could have. But America and its leaders lacked the will to continue such a fight then as well. That is to say, victory using such a strategy against such an enemy is not guaranteed.

A War Which Cannot Be Won Without Fighting It

It is hard to picture a scenario in these circumstances in which America will win this war without winning the battles. The longer ISIS exists, the longer its propaganda machine will continue to poison alienated, vulnerable minds internationally—as the Ottawa attack shows. The longer it exists, the more Shiites, Sunni resisters, Kurds, and non-Muslim minority groups will be murdered. American policy currently considers the downside of having to send American ground troops back into Iraq greater. American policy also considers the downside of providing material aid to local ground forces greater. It may come back to bite us.

However, is the prospect of ISIS continuing to exist, continuing to murder and enslave, continuing to disrupt regional stability, continuing to provide a calling and safe haven for Islamic extremism, continuing to prolong the Syrian civil war, and continuing to block the progress of a re-emerging Iraq any lesser evil? The current U.S. strategy against ISIS seems doomed to lead, at best, to an eventual victory after yet another prolonged and indecisive Middle East intervention, with the same problems returning in another form a few years down the road. After all, ISIS itself is the “same problem” popping up again: much of the early ISIS infrastructure was built around the former al Qaeda in Iraq (AQI) that the U.S. previously fought in western Iraq.

Strangely enough, this is a fight America can win if it takes the enemy head on with military force. For once, the enemy is using massed ground forces, with columns of vehicles and even armor. Yet America lacks the will to do so. This is a war we can win, but have decided not to. If America is unwilling to commit ground forces and unwilling to support local forces to fight ISIS, then bombing them seems a waste of effort, other than perhaps to assuage our consciences. America should perhaps refrain from acting at all in Iraq and Syria and instead focus its efforts on Gen. Dempsey’s “heavyweights” and its touted “Pivot to Asia,” regroup from its weak showing against Russia in Ukraine, or concentrate on its nuclear negotiations with Iran and North Korea. It is better not to waste time, focus, and effort.

One of the lessons of Afghanistan and Iraq is to commit fully, with sufficient focus, effort, material support, and troop levels to assure victory, or not to act at all. Middling solutions have cost us much in blood and treasure over the last dozen years. Against ISIS, America is once again taking a middling approach bound to be prolonged and indecisive.

As Clausewitz wrote, tactics is the use of troops to win battles; strategy is the use of battles to win the war. If America does not want to use its troops, it will not win battles. Therefore, it will not win the war. You cannot win a war you do not actually fight.

Chris Miller is a U.S. Army veteran and Purple Heart recipient who served two tours in Baghdad, Iraq, and has worked as a military contractor in the Middle East. His work currently focuses on strategic studies. His interests include CBRN, military and veterans’ issues, the Cold War, and international security affairs.


Prototype AI chip allows UAV to learn

Nov. 4, 2014


HRL Laboratories’ Center for Neural and Emergent Systems has tested an unmanned aircraft with a prototype neuromorphic chip — a processor that essentially acts like a brain, learning and acting on its own, Gizmodo reports.

The Defense Advanced Research Projects Agency is funding the research. According to the MIT Technology Review, the first time the tiny test craft was flown into a new room, “the unique pattern of incoming sensor data from the walls, furniture, and other objects caused a pattern of electrical activity in the neurons that the chip had never experienced before.” That activity changed the way the chip’s synthetic neurons connect to one another: the chip learned the layout of the room and will recognize it the next time the craft enters.
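The article does not describe HRL’s actual learning rule, but the behavior reported, connections between co-active neurons strengthening as the craft explores, is the classic Hebbian picture. A minimal illustrative sketch in Python, with invented constants and no claim to match the chip’s real algorithm:

```python
def hebbian_update(weights, pre, post, rate=0.1, decay=0.01):
    """One plasticity step: strengthen the connection between neurons
    that fire together, and let unused connections slowly decay.
    weights[i][j] links pre-synaptic neuron j to post-synaptic neuron i."""
    new_weights = []
    for i, row in enumerate(weights):
        new_row = []
        for j, w in enumerate(row):
            w = w + rate * pre[j] * post[i] - decay * w
            new_row.append(max(0.0, min(1.0, w)))  # clamp to [0, 1]
        new_weights.append(new_row)
    return new_weights

# Repeated exposure to the same "room pattern" strengthens the
# connections that pattern drives; that is how a layout gets stored.
room = [1.0, 0.0]
weights = [[0.5, 0.5], [0.5, 0.5]]
for _ in range(10):
    weights = hebbian_update(weights, pre=room, post=room)
```

After the loop, the weight carrying the active pattern has saturated while the unused one has decayed, a toy version of “learning on the fly.”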

Narayan Srinivasa, who leads HRL’s Center for Neural and Emergent Systems, told MIT Technology Review: “This shows it is possible to do learning literally on the fly, while under very strict size, weight, and power constraints.”

AeroVironment built the test drone, which is six inches square and weighs 93 grams. The chip accounts for just 18 grams of that weight and draws 50 milliwatts of power.


DOD tries to allay industry fears on intellectual property

By Sean Lyngaas

Nov 04, 2014

Addressing an audience of defense executives and military acquisition officials on Nov. 4, Assistant Secretary of Defense for Acquisition Katrina McFarland sought to clarify how much intellectual property the Defense Department is interested in owning in projects it outsources to private industry. McFarland and other DOD leaders argue that a delicate balance, owning enough of a project to control its upkeep but not so much IP that bidders are scared off, could help shore up the U.S. defense industry’s dwindling technological edge.

DOD does not want to own IP when it amounts to business secrets a firm can leverage for competitive advantage, McFarland told a conference hosted by Defense Daily in Washington, D.C. Rather, the department wants to own the interfacing part of a system that would be interoperable with another firm’s technology added later in a project, she said.

Defense Acquisition University defines open systems as those that use open, public interfaces and formats, making them interoperable and portable. To encourage competition for projects, and in turn save money in tight fiscal times, DOD has made open-systems architecture a key tenet of Better Buying Power 3.0, the latest edition of acquisition reform that the department unveiled in September. BBP 3.0 is now in draft form and won’t become DOD practice until the department hears more from industry, Congress and other stakeholders in the coming weeks.

McFarland took the podium to dispel some of the apparent misconceptions that defense firms have relayed to her about open-systems architecture. “There’s a lot of confusion related to intellectual property and open-systems architecture,” she said. “There is no interest in the government in pursuing intellectual property when it’s the secret sauce of the company.”

In introducing BBP 3.0 in a white paper, Frank Kendall, the undersecretary of Defense for acquisition, technology and logistics, wrote that DOD had had “varying degrees of success in the past” with pursuing open-systems architecture. “We need to do a better job of ensuring that our designs are modular — and that the government is in a position to control all the relevant interfaces so that competitors have the opportunity [to] win their way onto our programs.”
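The modular design Kendall describes, with the government controlling the interfaces so competitors can win their way onto programs, is essentially interface-based programming. A hypothetical sketch, not any actual DOD system, with all class names invented:

```python
from abc import ABC, abstractmethod

class TargetingInterface(ABC):
    """Government-owned interface (illustrative): any vendor's module
    must honor this contract, so a later competitor can plug in."""
    @abstractmethod
    def track(self, sensor_data: dict) -> dict: ...

class VendorATracker(TargetingInterface):
    def track(self, sensor_data):
        # Vendor A's proprietary internals stay hidden behind track()
        return {"target_id": sensor_data["id"], "source": "vendor_a"}

class VendorBTracker(TargetingInterface):
    def track(self, sensor_data):
        return {"target_id": sensor_data["id"], "source": "vendor_b"}

def mission_system(tracker: TargetingInterface, sensor_data: dict) -> dict:
    # The mission system depends only on the open interface, never on
    # either vendor's "secret sauce"; swapping suppliers is a one-line change.
    return tracker.track(sensor_data)
```

Owning `TargetingInterface` but not the vendor implementations is the balance McFarland describes: the government controls upkeep and interoperability without taking the contractors’ trade secrets.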

McFarland echoed that argument Nov. 4, saying that DOD had not yet adopted open-systems architecture “at the level that I’d like to see.” Nonetheless, 75 percent of the acquisition strategies carried out across the department since July 2012 had involved open-systems architecture, she said.

DOD program managers are charged with implementing this shift in acquisition policy. McFarland said she instructs program managers to think about future threats and how technology will evolve when designing a given system. That mindset is especially important when it comes to cyber threats, many of which stem from open-source information, she said.

“In order for us to be able to address this threat, we have to jump ahead of it,” she added, calling for industry to develop more open-architecture cybersecurity systems.


Speaking later in the day at the same conference, Lt. Gen. Michael Williamson, principal to the assistant secretary of the Army for acquisition, logistics and technology, offered an illustrative example of the cost savings possible from open-architecture systems.

Williamson said he heard recently from an Army program manager who wanted to upgrade software for a control system and was confronted with 11 software languages, an inordinately complex situation.

“The challenge for us has been the cost associated with not only the software development, but also the software maintenance and the post-production software cost,” he said. “And as a service, it’s unaffordable for us to continue down a path where we have hardware and software proprietary systems that cause us to spend a lot of money in post-production trying to maintain connectivity and interoperability.”

Sean Lyngaas is a staff writer covering defense, cybersecurity and intelligence issues. Connect with him on Twitter: @snlyngaas.


‘Computer Chaos’ at 20: Back where we started

By Paul Brubaker

Oct 24, 2014


Editor’s Note: Paul Brubaker worked for Sen. Cohen in the mid-1990s as a subcommittee staff director, and was deeply involved in drafting both the Computer Chaos report and the Clinger-Cohen legislation.


Twenty years ago this month, then-Sen. William S. Cohen issued his seminal investigative report entitled “Computer Chaos.”

That report, and the Maine Republican’s deep interest in, and commitment to, fixing the longstanding systemic deficiencies in the government’s approach to buying technology, were the driving forces behind the Information Technology Management Reform Act, now recognized as the IT provisions of the Clinger-Cohen Act.

At a recent luncheon, I had the opportunity to discuss the progress, or lack thereof, with Cohen. I think it is fair to say that we were both dismayed and disappointed that almost two decades later, the government has failed to take full advantage of the tools provided by the law that bears his name.

What was most disappointing was our mutual realization that many of the deficiencies highlighted in the Chaos report are still plaguing the government’s inability to keep pace with advances in technology.

The Government Accountability Office, the inspectors general and congressional committees are still citing failures that can be traced to many of the findings in the report: inadequate planning; misunderstood and over-prescribed requirements; large unmanageable programs; skill set disconnects; cumbersome and antiquated regulations and practices; disconnect between acquisition and mission; failure to link technology investments to mission outcomes; inadequate business cases … and the list goes on.

We also seem to be spending millions and wasting time redocumenting the deficiencies and admiring the problem when, in fact, many of these flaws could be addressed by refocusing on the key provisions of the Clinger-Cohen Act and actually implementing them.

Last year I watched in shock and horror as a senior agency executive spent seven figures on a McKinsey study to highlight the very findings outlined above, while many of the oversight responsibilities delegated to that executive under Clinger-Cohen were either neglected or poorly executed. Too often, it seems, government does not take sufficient care to hire people with the necessary knowledge, skills, abilities and traits to succeed.

Over the past decade we have watched successive major procurements fail in virtually every major department. And after the government spent more than $800 million on a fatally flawed site, it is still not clear that the holistic, tightly integrated underlying business processes will ever be properly automated.

Moreover, the oversight mechanisms established by the Office of Management and Budget to gain visibility into the health of IT spending are useless. At the core is the fact that OMB and the agencies have lost the plot. They have simply forgotten what Computer Chaos and the Clinger-Cohen Act were saying: “It’s the business processes, stupid; not the technology.”

Since passage of Clinger-Cohen, well meaning but tragically bureaucratic policy people have been at the core of prescribing implementation guidance for the provisions of the law — and they have simply gotten it wrong.

Additional laws and regulations have added layers of non-value-added bureaucracy that have bastardized Clinger-Cohen’s intent and burdened the system to the point where everything from capital planning and investment control to architecture, security and business case development has turned into a compliance exercise. At the same time, those subsequent additions have stifled innovation and the rapid deployment of the types of technologies that can transform business cases and create billions of dollars in efficiencies.

One need only look at the FedRAMP process to see my point. This is an extraordinarily cumbersome and bureaucratic response to a legitimate concern that would be better handled through clear lines of responsibility and accountability. The process has arguably slowed cloud deployment in the federal government unnecessarily and created billions of dollars in opportunity cost and lost innovation, as infrastructure and cloud service providers have spent millions attempting to navigate a fluid, often unclear, and understaffed compliance exercise — one that, at its core, may not even be achieving its objective.

There are other examples. Implementation of FISMA, E-Gov Act requirements, OMB A-130, OMB’s 25-point plan and other inflexible and complicated laws, regulations, policies and practices have all conspired to ensure that we live in a federal IT environment that is impossibly complicated.

It is at least as daunting today as it was 20 years ago, when Cohen set out to simplify the government’s approach to technology in a way that would allow agencies to take advantage of 21st century tools. Clearly, and sadly, we are back to the future.

Paul Brubaker is director for U.S. federal government at AirWatch by VMware.


Can DHS get it together?

By Sean Lyngaas

Oct 31, 2014

Twelve years after its creation, the Department of Homeland Security is at a crossroads in how it handles its ever-evolving cybersecurity mission. On the one hand, the department says it lacks the legal authority to tackle that mission fully, and it struggles to hire and retain cybersecurity experts. On the other hand, former DHS officials say key cybersecurity programs, and the department’s ability to coordinate the response to cyber threats internally, with other agencies, and with industry, have markedly improved.

The department’s technical efficacy in cybersecurity might now rest on how well its separate programs of intrusion detection and Continuous Diagnostics and Mitigation can complement each other.

That mixed report card reflects the challenges of harnessing a big bureaucracy to defend federal civilian networks and the emphasis the department has placed on cybersecurity in recent years, although it has been part of the department’s mission since its inception.

John Cohen, who until April was the acting undersecretary for intelligence and analysis at DHS, said various facets of cybersecurity under the department’s charge, such as cyber intelligence and threat detection, have become better integrated than they were five or six years ago. Then, two of the department’s main cyber-related divisions — DHS’ Office of Intelligence and Analysis and the National Protection and Programs Directorate — suffered from stovepiping and subpar levels of communication. Threat analysis done by I&A was not very well integrated into the threat-detection activities that went on in NPPD, he said.

Coordination between the two divisions has come a long way, said Cohen, who is now chief strategy adviser at data-protection firm Encryptics. For starters, DHS’ intelligence office has a seat at NPPD’s hub for monitoring cyber threats, the National Cybersecurity and Communications Integration Center (NCCIC).

That closer intra-agency coordination was put to the test last year when President Barack Obama was considering airstrikes against the Syrian regime. Cohen said there was evidence that the Syrian Electronic Army, a hacking group sympathetic to Syrian President Bashar al-Assad, was considering retaliatory cyberattacks on U.S. assets if Washington struck Damascus.

The United States has since gone on to bomb the Islamic State in Syria but not Assad, and Cohen said the intelligence office shared cyber threat information coming from the Syrian Electronic Army with NCCIC, which in turn was able to give a more credible perspective of the threat to U.S. critical infrastructure. According to Cohen, that coordination would have been unlikely just a few years ago.

Rob Zitz, who was deputy undersecretary of preparedness at DHS from 2006 to 2007, said the department’s cyber capability in those years was somewhat fragmented because of bureaucratic growing pains and evolving technology.

DHS and cybersecurity through the years

The Department of Homeland Security was created in 2002 and began deploying Einstein — an intrusion-detection system designed to offer the government a snapshot of federal civilian networks — in 2005. A glance at the past 15 months, however, shows how rapidly the cyber landscape is shifting.

August-September 2013 — The Syrian Electronic Army, a hacking group aligned with President Bashar al-Assad, threatens cyberattacks on U.S. assets. DHS’ Office of Intelligence and Analysis and its National Protection and Programs Directorate show improved coordination with each other in responding to the threat.

December 2013 — Jeh Johnson is sworn in as DHS secretary. Since then, Johnson has shown a keen interest in having Congress pass multiple cybersecurity bills and lobbied lawmakers to do so in a recent opinion piece.

April 2014 — DHS’ U.S. Computer Emergency Readiness Team gets a big test from Heartbleed, an OpenSSL vulnerability affecting vast portions of the Internet. US-CERT issues an alert with mitigation advice to industry within 24 hours, but it reportedly takes DHS a week to get approval from some agencies to scan their networks for signs of the vulnerability.

August 2014 — Larry Zelvin resigns as director of DHS’ National Cybersecurity and Communications Integration Center, the department’s hub for monitoring cyber threats. He is one of several high-level cybersecurity officials to leave for the private sector in recent years.

October 2014 — Citing the need to respond more quickly to bugs like Heartbleed, the Office of Management and Budget announces enhanced authority for DHS to scan federal networks for acute cyber threats. DHS has long sought that authority.


For Zitz, who is now senior vice president of Leidos, the introduction of a vast intrusion-detection program called Einstein in 2005 was a turning point in the department’s prioritizing of cybersecurity.


Enter Einstein

Einstein is one of DHS’ primary weapons for defending federal civilian networks. The intrusion-detection system is designed to provide the department’s U.S. Computer Emergency Readiness Team (US-CERT) with a “snapshot of the health of the federal government’s cyberspace,” as a DHS description puts it. The program installs sensors at Web access points on federal agency networks and sifts through the data those sensors collect, looking for vulnerabilities.

As of August 2013, according to an inspector general report published in March, NPPD had spent more than $321 million on intrusion-detection capabilities. (When asked for an updated tally of Einstein’s costs, a DHS spokesman referred FCW back to that figure.)

Einstein is now the tip of the spear in the U.S. government’s response to the most acute cyber threats. And yet its efficacy is evidently undercut by the department’s nebulous legal mandate to implement it. Deploying Einstein throughout the executive branch “has been significantly delayed by the lack of clear authorities for DHS,” said then-NCCIC Director Larry Zelvin in testimony before the House Homeland Security Committee in May. Zelvin, who left DHS in August and is now director of Citi’s Cyber Security Fusion Center, declined to be interviewed for this story. A DHS spokesman also declined to make current cybersecurity officials available for an interview.

Although DHS is responsible for guarding federal civilian networks, it needs permission from each agency, through a memorandum of agreement, to deploy Einstein on its network. That bureaucratic conundrum was on display in the government’s response to Heartbleed, an OpenSSL vulnerability that emerged in April.

Einstein was able to detect the bug’s threat to federal networks but, as Deputy Undersecretary for Cybersecurity and Communications Phyllis Schneck said recently, nearly a week passed before lawyers from the various agencies could agree to allow DHS’ technical team to scan agency networks and mitigate the threat. A cybersecurity adviser on the Senate Homeland Security and Governmental Affairs Committee, one of several congressional committees with jurisdiction over DHS, said department officials were prompt in briefing committee members on the nature of the Heartbleed threat. But in this case, word of the threat spread much more quickly than DHS could deploy Einstein to address it.
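Heartbleed itself was a bounds-checking failure: OpenSSL’s heartbeat handler echoed back as many bytes as the sender *claimed* to have sent rather than as many as it actually sent, leaking adjacent process memory. A simplified Python model of the flaw and its fix (real exploitation reads raw memory, not a byte string; the data here is invented):

```python
def heartbeat(buffer: bytes, payload: bytes, claimed_len: int) -> bytes:
    """Vulnerable handler: trusts the attacker-supplied length field."""
    start = buffer.find(payload)
    return buffer[start:start + claimed_len]  # over-reads past the payload

def heartbeat_fixed(buffer: bytes, payload: bytes, claimed_len: int) -> bytes:
    """Patched handler: never returns more than was actually received."""
    start = buffer.find(payload)
    return buffer[start:start + min(claimed_len, len(payload))]

# "Process memory" sitting next to the 4-byte payload the client sent
memory = b"PING" + b"secret-session-key"
leaked = heartbeat(memory, b"PING", 20)      # includes the secret
safe = heartbeat_fixed(memory, b"PING", 20)  # returns only b"PING"
```

The one-line `min()` bound is essentially what the OpenSSL patch added; the week DHS spent on legal approvals dwarfed the technical fix.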

In July, the House passed the kind of legislation that senior DHS officials have long been calling for. The National Cybersecurity and Critical Infrastructure Protection Act would codify and enhance NCCIC as the hub for sharing threat information across sectors. The bill, which now sits before the Senate Homeland Security and Governmental Affairs Committee, will compete with several other measures for lawmakers’ attention during the lame-duck session this fall. However, a committee aide expressed optimism that bipartisan support for doing something on cybersecurity would help the bill’s chances.

Too big a piece of the puzzle?

Einstein is a central piece of DHS’ cyber defense. Indeed, some experts warn that it could be too central to the effort.

John Pirc, a former cybersecurity researcher at the CIA and until recently chief technology officer at IT testing organization NSS Labs, said he believes DHS might be making a mistake by relying so heavily on Einstein.

The kind of intrusion-detection systems Einstein uses are “typically myopically focused on exploits,” Pirc said. “If you have a pared-down list of known vulnerabilities or exploits…are those current or are those legacy? And the reason why that’s important is that the adversary is not always going to be using new techniques. They’re going to use old stuff…for the mere fact of trying to evade the system.”

Pirc argues that Einstein’s signature-based security technologies “only know what they’re being told to look for” and don’t address much of the encrypted traffic on networks. He said the program is helping the government improve its cybersecurity posture, but “where I think Einstein is falling short…is you’re using technology that is only solving a fraction of the problem.”
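Pirc’s point about signature-based detection is easy to demonstrate: a matcher flags only the patterns it has been told about, so a trivially encoded or encrypted variant of the same attack passes through unflagged. A toy sketch, with invented signatures that stand in for (not reproduce) Einstein’s:

```python
import base64

# Invented example signatures, not real Einstein content
SIGNATURES = {
    "sig-001": b"/etc/passwd",   # known path-traversal payload
    "sig-002": b"' OR '1'='1",   # known SQL-injection string
}

def inspect(packet: bytes) -> list:
    """Return the IDs of every known signature found in the packet."""
    return [sid for sid, pattern in SIGNATURES.items() if pattern in packet]

plain = b"GET /../../etc/passwd HTTP/1.1"
obfuscated = base64.b64encode(plain)  # the same attack, encoded
```

`inspect(plain)` flags the traversal attempt, but `inspect(obfuscated)` returns nothing: the system “only knows what it is being told to look for,” exactly the fraction-of-the-problem limitation Pirc describes.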

Ken Durbin, manager of Symantec’s Continuous Monitoring and Cybersecurity Practice, said it is important not to think of Einstein as a silver bullet for the government’s cybersecurity vulnerabilities but as one weapon in an arsenal.

“I’ve heard several times that cybersecurity isn’t like finding a needle in a haystack. It’s like finding a dirty needle in a pile of needles,” he said. “And any tool that you can use to pull out some of those needles to reduce the scope of your search is effective and useful.”

Symantec, one of the largest information security firms in the world, was unsuccessful in its bid to have DHS use the firm’s data repository to feed into Einstein, but Durbin said he gained intimate knowledge of the program in pursuit of that work.

In separate interviews, Durbin and Zitz described Einstein as a foundational tool for threat detection that complements another pillar of DHS’ cybersecurity work: the Continuous Diagnostics and Mitigation program.


Congress established CDM as a risk-based approach to cybersecurity that uses sensors to detect weaknesses on agency networks and send alerts to local dashboards. Whereas Einstein addresses network traffic, CDM scans the endpoints of that traffic, such as servers and workstations, for vulnerabilities and secure configurations. Durbin said the two programs are symbiotic: CDM aggregates and correlates data that can be used to develop more security signatures for Einstein.
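Durbin’s symbiosis claim, CDM endpoint data feeding new Einstein signatures, can be illustrated with a toy correlation step. The hostnames, indicators, and threshold below are all invented for illustration:

```python
from collections import Counter

def promote_indicators(endpoint_findings: dict, min_hosts: int = 2) -> list:
    """When the same indicator turns up on several endpoints, promote it
    to a candidate network signature for the perimeter sensor."""
    counts = Counter(
        indicator
        for findings in endpoint_findings.values()
        for indicator in findings
    )
    return sorted(ind for ind, n in counts.items() if n >= min_hosts)

findings = {
    "host-a": ["evil.example.com", "weak-tls-config"],
    "host-b": ["evil.example.com"],
    "host-c": ["stale-patch-level"],
}
```

Here only `evil.example.com`, seen on two hosts, crosses the threshold and would be fed back to the network layer; one-off endpoint findings stay local.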

Zitz said DHS’ treatment of Einstein as just one piece of the cyber puzzle “is indicative of the maturation of…NPPD [as a place] where all of those pieces come together now.”

Still lacking manpower

Despite the rise in automated cybersecurity services, which can reduce the manpower needed for some security missions, complex programs such as Einstein and CDM require experts to carry them out. And DHS has at times struggled to hire and retain those experts. Cybersecurity professionals can earn significantly more money in the private sector than they can in government, and the work can require long and stressful hours on the job.

A recent front-page Washington Post article states that a high turnover rate among senior DHS cybersecurity officials has hampered the department’s work. From June 2011 to March 2012, five such officials left for the private sector, according to the Post. But Zitz rejected the notion of instability among the department’s cybersecurity leadership. He cited Ann Barron-DiCamillo, who has worked at US-CERT for two years and been its director since January 2013, as one example of continuity.



“I think you’ve got stability in the leadership,” Zitz said. “I think a continuing concern is more so the idea that the subject-matter experts and technical experts who are inside of government, who are performing cybersecurity duties — they are extremely valuable and sought after in the private sector as well.”

Here, again, Congress could help. On Sept. 18, the Senate approved a measure that would give DHS Secretary Jeh Johnson greater authority to hire cybersecurity professionals and pay them salaries commensurate with those of cybersecurity experts at the Defense Department. The average annual salary for cybersecurity professionals, public or private, is around $80,000, according to a recent Rand study, which cited 2012 data from the Office of Personnel Management. More than one-quarter of federal security employees earn $74,872 to $97,333, or somewhere near that industry average, according to the study.

Yet there is a roughly $155,500 ceiling for how much the government can pay cybersecurity professionals, while top private-sector jobs can offer several hundred thousand dollars in annual pay. As the Rand study notes, “Once professionals can command more than $250,000 a year, the competitiveness of the U.S. government as an employer suffers correspondingly.”

Regardless of any action Congress takes on cybersecurity hiring, private-sector IT experts will, in general, always earn more than their public-sector counterparts. But it is not always about the money. DHS recruiters hope their appeals to a sense of mission to protect federal networks in cyberspace will resonate as that mission grows clearer.


U.S. Pilots Say New Chinese Stealth Fighter Could Become Equal of F-22, F-35

By: Dave Majumdar

Published: November 5, 2014 12:53 PM • Updated: November 6, 2014 7:00 AM

China’s new Shenyang J-31 stealth fighter — making its debut next week at the Zhuhai international airshow — could eventually become more than a match for American stealth fighters in battle, several U.S. military and industry officials told USNI News.

The J-31 is China’s latest crack at developing a modern, so-called fifth-generation stealth fighter — equivalent in ability to Lockheed Martin’s F-22 Raptor or F-35 Lightning II Joint Strike Fighter.

“They’re still in the glossy brochure phase of development, so they still look ten feet tall and bulletproof,” one senior U.S. fighter pilot familiar with the F-35 program told USNI News.

“I think they’ll eventually be on par with our fifth gen jets — as they should be, because industrial espionage is alive and well.”

Many suspect the J-31 is designed using technology stolen from the Pentagon’s nearly $400 billion Lockheed Martin F-35 Joint Strike Fighter program.

“They sure look like F-35s and F-22s, don’t they?” one Air Force operational test pilot told USNI News.

The senior U.S. pilot familiar with the F-35 — who has extensive experience flying the Lockheed Martin F-16 Falcon — told USNI News the Chinese jet is now likely more than a match for existing fourth-generation, non-stealth American fighters like the Air Force’s Falcons, Boeing F-15 Eagles and the U.S. Navy’s F/A-18E/F Super Hornet.

“They’ll probably be a handful right off the bat for all of our fourth gen stuff,” the pilot said.

One former Air Force fighter pilot with extensive experience with foreign hardware told USNI News potential air battles might turn more on the sheer number of jets the Chinese can put into the air than on the superior training of U.S. pilots.

“I worry about numbers more than particular platforms,” the former fighter pilot said.

“I imagine their jets and their weapons are pretty good. Don’t know about the pilots or their capacity to employ.”

Further, the retired pilot noted, airshows are designed to show off weapons to potential buyers, but offer no real information about what a jet can really do during an actual combat mission.


“Just remember that airshows are exactly that — airshows,” the former pilot said. “Airshows provide no real clue to capabilities. As such, airshows generally rely on spectacular maneuvers to garner attention without providing substance. No different from the F-15C or the F-22.”

One current Air Force test pilot told USNI News that it would be difficult to gauge just how good the Chinese jet will be.

“Overall at this stage they’re not [operational], so it’s hard for anyone to truly make a reasonable assessment,” the test pilot said.

There are still many unanswered questions about how the Chinese will operate their aircraft and what kinds of weapons the jets will carry. Perhaps the most important is how good the Chinese radars and other sensors are compared with their American equivalents.

“How well will organic sensors work to support those weapons?” the test pilot asked rhetorically.

For U.S. industry officials, the J-31’s debut at the Zhuhai airshow signals that the Chinese are planning on selling the jet on the open market.

“I would characterize the J-31 flying at the Chinese airshow as ‘incremental and measured,’” a senior industry official told USNI News.

“There have been some reports that the J-31 may be exported. If so, then showing it off makes even more sense to attract prospective buyers,” the official said.

The Chinese are making a lot of progress in developing their aerospace industry.

However, jet engines remain a weak spot for China.

“They have yet to field many of their ‘new’ designs in any quantities,” the industry official said.

“Time will tell.”

Meanwhile, Russia’s advanced Sukhoi Su-35 Flanker-E is also making its debut at the Zhuhai airshow.

The Chinese are reportedly interested in the purchase of 24 examples of the modernized Russian-built jet.

There has been much media speculation that the Chinese intend to reverse engineer the aircraft as they have with previous Flanker variants.

“I view the Su-35 buy as a conservative stop-gap measure while they wait for the J-20 and J-31 to enter service,” the industry official said.

“Gotta have aircraft to have an air force.”



John McCain Poised to Control Senate’s Defense Policy

The move will give the hawk an influential avenue to merge his two favorite roles: administration critic and legislative deal maker.


By Jordain Carney

November 5, 2014

One of the president’s chief critics could soon lead the Senate’s main military committee.

With Republicans gaining the majority in the upper chamber in Tuesday’s midterm elections, Sen. John McCain is widely expected to become the next Armed Services Committee chairman in January.

McCain, the Republican Party’s presidential nominee in 2008, has decades of experience in foreign policy and defense issues in the Senate, where he was first elected in 1986. He also served in the Navy, and he spent more than five years as a prisoner of war after his plane was shot down during the Vietnam War.

As committee chairman, McCain would have an influential role in spearheading defense policy from Capitol Hill. That includes the Senate’s version of the National Defense Authorization Act, an annual bill that outlines defense policy and tells the Pentagon what it can and can’t spend money on. He’ll also gain a megaphone to voice his frequent opposition to the Obama administration on military and national security issues.

On Tuesday, Pentagon spokesman Rear Adm. John Kirby downplayed any concerns over McCain gaining the top spot, but, at least publicly, the senator’s relationship with the Pentagon has been rocky.

For example, during the past year, McCain put a hold on Bob Work’s nomination to be deputy Defense secretary, called the administration “cowardly” for not providing arms to Ukraine’s military, and frequently criticized the strategy to combat the Islamic State of Iraq and Syria.

And his blunt style isn’t likely to change once he becomes chairman.

“With Senator McCain, what you see is what you get. It’s part of his charm and persona, that he is a maverick—he is an outspoken maverick—that tells it like it is to anyone of any party,” said Mackenzie Eaglen, a fellow at the Washington-based American Enterprise Institute think tank. “He wasn’t best friends with the Defense Department when the last president was in office either.”

Despite McCain’s penchant for straight talk, he’s also known for a willingness to work with his opponents and find a solution—for example, on immigration-reform legislation with the “Gang of Eight.” And there are at least two areas where he and top Pentagon officials agree that something has to change: budget cuts under sequestration and how the Defense Department buys what it needs.

Without action from Congress, the budget caps would return in October 2015, the start of the 2016 fiscal year. Under Obama’s five-year budget, the Pentagon projects that it will need more than $535 billion in fiscal 2016. But under the budget caps, the Pentagon is expected to receive less than $500 billion, leaving a roughly $35 billion budget gap.

McCain has been pressing for years to reverse or replace the cuts. And Eaglen called a budget deal—similar to the Ryan-Murray agreement that eased budget cuts for fiscal 2014 and 2015—“a no-brainer.” But Eaglen also acknowledged that completely undoing the sequester for the Pentagon is “much easier said than done.”

McCain is also expected to be outspoken on reforming the Pentagon’s acquisitions process—or how the department buys the programs and technology it needs. He released a report outlining potential improvements to the system last month with the committee’s current chairman, Sen. Carl Levin.


Turning UAVs into chemical, biological weapon detectors

Nov. 5, 2014


As chemical and biological weapons continue to pose a risk to military personnel and civilians alike, the Pentagon is seeking new ways to detect them. A new demonstration RFI aims to develop UAVs that can detect the weapons.

The Thunderstorm Project seeks a battery-powered vertical-takeoff UAV that can fly or hover at altitudes up to 1,000 feet above ground level for at least 30 minutes. The desired UAV also:

Can be carried in a backpack or in a Humvee.

Can detect chemical and biological weapons and transmit the data to a receiver at least 1 kilometer away.

Can be operated by regular Chemical Biological Radiological and Nuclear (CBRN) personnel with minimal training.

Includes a ground station able to provide visual displays of the sensor readings.

Offers autonomous operations based on satellite imagery that is pre-loaded or downloaded from cell phones or Wi-Fi networks.

Can carry modular payloads capable of detecting G-, H- and V-type chemical agents and biological agents, and can conduct surveillance of natural flora.

Is able to collect and return samples for testing.
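The bullet points above amount to a spec sheet. As an illustration only — the field names and check function below are hypothetical, not from the RFI document — the headline figures could be captured as a simple checklist:

```python
from dataclasses import dataclass

@dataclass
class UavCandidate:
    # Hypothetical fields; threshold values taken from the RFI summary above.
    max_altitude_ft: float      # must fly/hover up to 1,000 ft AGL
    endurance_min: float        # at least 30 minutes on battery
    telemetry_range_km: float   # transmit data at least 1 km
    backpack_portable: bool     # fits in a backpack or Humvee
    collects_samples: bool      # can return samples for testing

def meets_thunderstorm_goals(uav: UavCandidate) -> bool:
    """Check a candidate airframe against the RFI's headline figures."""
    return (uav.max_altitude_ft >= 1000
            and uav.endurance_min >= 30
            and uav.telemetry_range_km >= 1
            and uav.backpack_portable
            and uav.collects_samples)

print(meets_thunderstorm_goals(
    UavCandidate(max_altitude_ft=1200, endurance_min=35,
                 telemetry_range_km=1.5, backpack_portable=True,
                 collects_samples=True)))  # → True
```

A demonstrator failing any one line of the checklist would fall short of the stated goals, though the RFI is soliciting demonstrations rather than finished products.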


At this stage, the Deputy Assistant Secretary of Defense for Emerging Capability and Prototyping’s Rapid Reaction Technology Office only wants to see technology demonstrations; the document is not a request for proposals.

On the flip side, the RRTO is also looking for technologies to counter small, man-portable, off-the-shelf UAVs carrying chemical or biological warheads. The Pentagon wants sensors that can detect and identify these small WMD carriers, plus kinetic and non-kinetic weapons to destroy them.

Thunderstorm, which is a technology demonstration program, will hold an industry day in the second quarter of 2015 at Camp Shelby, Mississippi.


Children’s Hospital, AFRL explore fatigue in airmen

Posted: 4:39 p.m. Wednesday, Nov. 5, 2014    

By Barrie Barber

DDN Staff Writer


Researchers will peer deep inside the brain to explore how electrical stimulation eases fatigue in airmen.

Dayton Children’s Hospital and the Air Force Research Laboratory began collaborating Wednesday, using a 3T magnetic resonance imaging machine to start finding answers for the first time.

AFRL has experimented with attaching electrodes to the brain of volunteers to find out how well the stimulation improves the alertness and performance of fatigued airmen.

The nearly $2 million MRI machine at Dayton Children’s has a stronger magnetic field to get detailed images of how the brain responds when specific areas are stimulated, said Dr. Elizabeth Ey, a pediatric radiologist at the hospital. The images are twice as detailed as other scanners, the hospital said.

“It provides the most amazing view of brains,” said Vicki Giambrone, Dayton Children’s vice president of strategic partnerships.

Aaron Miller, director of the Alliance for Human Effectiveness and Advancement (AHEAD) at Wright State University, brought the two sides together. The Wright State Research Institute is also collaborating in the research.

Miller said the Air Force has previously had to travel outside the region for access to equipment that can provide the same data.

The stimulation could make imagery analysts and cyber and unmanned-vehicle operators more alert during long hours of staring at computer screens, and research findings suggest students could train up to twice as fast or more, Air Force researchers have said.


GAO: U.S. Air Force Needs More Info Before Committing to Disaggregation

By Mike Gruss | Nov. 5, 2014

WASHINGTON — A new report from the U.S. Government Accountability Office says that distributed satellite constellations, a concept currently in vogue with Air Force space leaders, may solve some problems for the Defense Department, but would also create new difficulties.

A transition to smaller satellites could lead to increased costs at a time when budgets are tight, would require a reimagination of launch possibilities and would face cultural obstacles within the Defense Department, where such a shift is not viewed as urgent, the GAO said.

The report, “Additional Knowledge Would Better Support Decisions about Disaggregating Large Satellites,” provides a more nuanced look than a white paper the Air Force published last year arguing disaggregation would help keep the United States ahead of adversaries in a quickly evolving space environment.

The GAO identified several benefits, including the potential for improved acquisition process, but said the Defense Department needs more information before the Air Force makes decisions about its future satellite architecture.

“Until more knowledge is gained, disaggregation will not only remain inconclusive, but poorly informed decisions could be made in the interim,” the report said.

Air Force Space Command is currently leading a series of studies on the service’s next-generation technologies. Specifically, the Air Force is expected to wrap up studies on its missile warning and protected communication satellites in the next several months. Disaggregated constellations have been part of every study on future satellite programs, the report said.

But the GAO said the Air Force needs more information specifically on how the new approach would work from an operational perspective.

“While technology demonstrations are providing an avenue for gaining knowledge about disaggregation, they have been limited, concentrating more on technical than operational feasibility,” the report said. “Focusing more on operational feasibility would help to empirically quantify the effects of disaggregation and address implementation barriers.”


Air Force leaders have pointed to improved resilience as a major advantage of disaggregation. But when the GAO first started studying the topic, the Defense Department did not have a definition for resilience or any metric to gauge improvement.

“DoD does not have common measures for resilience — a key space system consideration — which may limit the effectiveness of these assessments,” the report said, referring to the Air Force’s analyses of alternative space architectures.

The call for metrics has been a point of frustration for Air Force Gen. John Hyten, the commander of Space Command. At a speech in Huntsville, Alabama, in August, Hyten complained about Pentagon pressure to quantify resilience.

“The whole concept of resiliency is a very frustrating concept to me,” he said. “Folks in the Pentagon keep coming up to me and saying, ‘General Hyten, what is the number for an AEHF resiliency constellation? Is it 95 percent? 90 percent?’ I don’t even know what that means, but they’re trying to put a number on it I guess because we’re space people and we’re geeks and we put numbers on everything, but I don’t care what the number is. I want a resilient warfighting construct.”

The transition to a disaggregated architecture may also prove expensive, the report said. While smaller satellites might spur more new entrants, lowering costs through competition, and could save money by launching on smaller, less expensive rockets, a disaggregated portfolio could also lead to a more complex and more expensive ground system, more frequent launches and therefore higher launch costs, and higher overall costs for nonrecurring engineering.

In addition, the use of smaller satellites would require a new emphasis on different launch capabilities.

“Disaggregation can enable DoD to reduce launch costs because satellites would no longer require the heaviest, most expensive launch vehicles to get to orbit,” the report said. “However, the vehicles DoD primarily relies on to deliver its satellites to orbit are designed to carry heavier satellites. Without changes to the current architecture, DoD may well find itself having to rely on launch vehicles that are more capable and expensive than needed.”

In addition, the GAO said the Pentagon may be slow to adopt such a drastic departure from current standards.

“DOD’s culture has generally been resistant to changes in space acquisition approaches and that fragmented responsibilities have made it very difficult to coordinate and deliver interdependent systems,” the report said.

In the white paper, the Air Force argued that disaggregation is “an innovative opportunity to stay ahead of our adversaries, to change their targeting calculus, and to mitigate the effects of a widespread attack on our space assets,” the report said. “In addition, resilience serves as a deterrent, which may be the best way to preserve our capability by avoiding an attack.”

The GAO report also questioned whether adversaries would be more willing to attack smaller satellites that may be viewed as less important and whose loss would therefore invite less retribution.

Is a “Missile Truck” the Solution to One of the Scariest Wargames Ever?

By Dustin Walker

November 5, 2014

In August 2008, the RAND Corporation joined military leaders at Hickam Air Force Base in Hawaii for a wargame entitled “Pacific Vision.” The exercise was meant to identify the capabilities U.S. Pacific Air Forces (PACAF) would need to prevail against potential threats in the Asia-Pacific region through 2016. At least one of the scenarios examined in the wargame was truly frightening.

As Paul Scharre of the Center for a New American Security recently summarized in an op-ed for The National Interest:

[The RAND study] analyzed a U.S.-China air war over Taiwan [and] made the bold assumption that every air-to-air missile fired from a U.S. F-22 hit a Chinese fighter (100 percent kill rate) and that every Chinese missile missed the U.S. F-22s (0 percent kill rate). In their simulation, the United States still lost the fight. The F-22s ran out of missiles and the Chinese fighters were able to go after vulnerable tankers and command and control aircraft. A far more detailed simulation the following year showed the same results. Even though U.S. F-22s were pegged with a 27-to-1 qualitative advantage over Chinese fighters, their diminished numbers and the fact that they had to fight from long range meant the Chinese had vastly superior numbers and won the fight.

The RAND study emphasized that improvements in forward basing infrastructure were necessary for U.S. airpower to achieve its objective effectively.

But in a new report from CNAS, “Robotics on the Battlefield Part II: The Coming Swarm,” Scharre offers another solution to mitigate China’s numerical advantage – an unmanned “missile truck” fighter:

An uninhabited “missile truck” that brought additional air-to-air missiles to the fight to supplement human-inhabited F-22s could tip the scales back in the United States’ favor. Such an aircraft need not have the full performance characteristics of a 5th or 6th generation fighter aircraft. It would only need to have sufficient stealth to get close enough to launch its missiles against Chinese fighters. If it then perished in the engagement, that would be acceptable provided it took a sufficient number of enemy fighters with it. It would still have accomplished the mission. The uninhabited aircraft would not need advanced autonomy, merely enough to fly in a straight line under a human’s control and sufficiently robust communications links for the human-inhabited F-22s to pass targeting data. All targeting and firing decisions would be made by the F-22 pilots. If such an aircraft could be built at relatively low costs, this uninhabited “loyal wing-man” could be a tremendous force multiplier for U.S. human-inhabited fighters.

In general, Scharre has argued that uninhabited and autonomous systems will be essential to maintaining U.S. military dominance. As he wrote recently for RealClearDefense:

Humans will still fight wars, but new technology will give combatants, as it always has, greater standoff from the enemy, survivability or lethality. Exploiting those advantages will depend principally on the ability to uncover the most innovative applications of robotic swarms, which will require not only increased resources but also an aggressive campaign of experimentation and technology development. Many of the underlying technologies behind increased autonomy are driven by commercial sector innovation, and as a result will be available to a wide range of state and non-state actors. In a world where some of the most-game changing technologies will be available to everyone, uncovering the best uses of that technology – and doing so urgently – will be vital to sustaining American military dominance.

Dustin Walker is the Editor of RealClearDefense.


Harsh Winter Outlook Made a Bit More Dire by Siberia Snow

By Brian K. Sullivan

Nov 6, 2014 12:07 PM ET


Remember how evidence was mounting last month that early snowfall was accumulating across Siberia? And remember how there’s a theory that says this snowfall signals a cold winter?

So in the two and a half weeks since, the news for the winter-haters has, unfortunately, only gotten worse.

About 14.1 million square kilometers of snow blanketed Siberia at the end of October, the second most in records going back to 1967, according to Rutgers University’s Global Snow Lab. The record was in 1976, which broke a streak of mild winters in the eastern U.S. In addition, the speed at which snow has covered the region is the fastest since at least 1998.

Taken together, they signal greater chances for frigid air to spill out of the Arctic into more temperate regions of North America, Europe and Asia, said Judah Cohen, director of seasonal forecasting at Atmospheric and Environmental Research in Lexington, Massachusetts, who developed the theory linking Siberian snow with winter weather.

“A rapid advance of Eurasian snow cover during the month of October favors that the upcoming winter will be cold across the Northern Hemisphere,” Cohen said in an interview yesterday. “This past October the signal was quite robust.”

There are a few steps to get from the snows of Siberia to the chills in New York City.

Cold air builds over the expanse of snow, strengthening the pressure system known as a Siberian high. The high weakens the winds that circle the North Pole, allowing the cold air to leak into the lower latitudes. The term Polar Vortex actually refers to those winds, not the frigid weather.


Chance Discovery

Cohen said he first noticed the relationship between the Eurasian snow cover and larger weather patterns while doing post-doctoral work at the Massachusetts Institute of Technology in the 1990s.

It came about by chance because the original assignment was to look at the North American snow cover, Cohen said. He changed it to Eurasian and “when we investigated further it turned out it was Eurasian snow cover that was the dominant influence.”

Last year, 12.85 million square kilometers of snow covered Eurasia at the end of October. By January, waves of frigid air were pummeling the U.S. Prices for natural gas, a heating fuel used by half of American households, rose to a five-year high in February.

“The big early snowbuild will definitely set things up for a cold back half of the winter,” said Todd Crawford, a meteorologist at commercial forecaster WSI in Andover, Massachusetts.


October Snow

When the snow across Eurasia began piling up again this October, many forecasters and energy traders began to take note.

By Oct. 13, Cohen had calculated, 12.2 million square kilometers of Eurasia were covered by snow, compared with 10.8 million the same day last year.

Not everyone is convinced.

Mike Halpert, acting director of the U.S. Climate Prediction Center in College Park, Maryland, told reporters last month that there wasn’t enough historical information to make the Siberian snow rule a useful tool.

The climate center’s forecast calls for a greater chance for a mild winter along the West Coast and then across the northern U.S. states into New England.


Energy Clients

Matt Rogers, a commercial forecaster in Bethesda, Maryland, said that while the snow accumulation isn’t a perfect predictor, he does keep an eye on it to help make seasonal calls for energy clients.

“I believe it increases the chances of a cold winter, but it does not guarantee it,” said Rogers, president of Commodity Weather Group LLC in Bethesda, Maryland. “I’ve seen some failures of it before.”

Rogers said for the U.S. East Coast, it’s important to watch the North Atlantic Oscillation, which often acts in tandem with the Arctic one.

The oscillation is a shift of high and low pressure systems over the ocean that can influence storm tracks and the location of the jet stream, and affect the weather over the eastern U.S. and western Europe.


For Cohen, this year presents a good opportunity to test the theory. After seeing this much snow pile up, a balmy January and February in the eastern U.S. would undercut the thesis.

Cohen said he has started a blog to track the changes in the Arctic oscillation and what happens with temperatures this season.

“It is an important year,” Cohen said, “because it is a big number for the snow.”


The Military Wants New Technologies To Fight Drones

Patrick Tucker

November 6, 2014

At the end of October, the Pentagon put out a request for information, or RFI, for new technologies “countering” commercial drones that are armed with chemical, biological, or massively destructive weapons. Specifically, the request asks for ideas on “emerging technologies, technical applications and their potential to counter a low-cost, small/man portable, commercial off-the-shelf unmanned aerial system (UAS) carrying a chemical and/or biological WMD payload.” That includes “electronic systems that can interdict, defeat or deny hostile use of UAS” and “systems providing the capability to intercept and neutralize the UAS. Both kinetic and non-kinetic solutions are encouraged and should cover both” the United States and global applications. The project, called Thunderstorm, will feature a technology demonstration in the second fiscal quarter of 2015 at Mississippi’s Camp Shelby.

The first portion of the request asks for ideas on technology to outfit drones with chemical weapons detectors. It’s another area of intelligence, reconnaissance and surveillance where flying robots that don’t sleep or blink or breathe could easily outperform humans. The biggest technological hurdle here is simply shrinking the sensors to the point where they can fit on a relatively small drone but remain robust enough to pick up the most popular poisons.

The RFI is looking for the drone to be able to detect a wide variety of nerve agents like Sarin as well as industrial compound toxins and even flowers and wildlife, or “persistent and natural flora (providing biological surveillance on current and emerging flora).” It should be able to reach an altitude of 1000 feet, fly for at least 30 minutes without recharging and be able to actually collect samples and fly them back to defense department labs for analysis.

It’s the second portion of the RFI that speaks to a fast growing military worry: how to down the thousands of potentially dirty drones that could be swarming toward the United States in the decade ahead.

As many as 30,000 unmanned aerial vehicles will be darkening America’s skies by 2020, according to Todd Humphreys of the University of Texas at Austin. They could be engineered to carry dangerous payloads or simply used as weapons. It’s an area that has attracted increasing Pentagon focus. A December 2012 meeting at Oklahoma’s Fort Sill, for instance, brought together 120 representatives from the services to discuss methodologies for countering UAVs, with talks and discussions scheduled to continue. Around the same time, the Naval Postgraduate School disclosed the existence of a classified research effort called Project Jason to “characterize and understand the threats posed by swarms of UAV,” according to the school’s website.

In a request for proposals from February, the military asked for ideas on counter UAV technologies to detect drones that were both large and even “micro-sized.”

Some of the capabilities featured in the most recent request include “sensors, software, or computers to detect and classify Command, Control, Communications, Computers, Collaboration and Intelligence (C5I) and sensor systems that facilitate rapid detection, identification and classification of UAS targets.”

How easy is it to detect drones? Simple enough that you can do it at home, in some cases.

A company called Domestic Drone Countermeasures will sell you a kit consisting of “three boxes: a Primary Command and Control Module and two Detection Sensor Nodes. These three boxes create a mesh network that can triangulate moving transmitters.” You can increase the size of the mesh network to detect drone radio signals over a wider area, up to 200 feet.

But detecting the presence of a drone is different from tracking a drone’s location, movement, or classifying its type or payload.

Achieving that level of detection capability requires more advanced radar equipment that the military is looking to shrink down and make more mobile. Last September, the Air Force modified the Joint Surveillance Target Attack Radar System aboard one of its JSTARS test jets to perform a massive counter-UAV exercise. JSTARS is “an airborne system designed to find and track ground and maritime targets, as well as slow-moving fixed-wing aircraft and rotating antennas.”

Here’s how the Air Force described the test: “Once JSTARS…called ‘on-station,’ UAVs launched from the naval air station in intervals and flew patterns over water. The crew tested the new capabilities of the radar by attempting to detect and track the UAVs…They did not have advanced knowledge of the take-off times or patterns of flight of the target aircraft.”

Drone detection and tracking on the ground comes in various forms. One of the more interesting is a radar truck called Green Rock, developed by Israeli defense contractor Israel Aerospace Industries. The Green Rock tracks low-flying UAVs as well as rockets and mortars via Phased-Array Pulse Doppler Radar.

If you can detect the identity of the drone and find it, how do you get it down?

Today, the primary countermeasure for disrupting drones is communications and radar jamming, which degrades a drone’s ability to communicate with its operator. The military is paying Raytheon $10 billion to build what it calls the Next Generation Jammer.

But future counter-drone technologies include the various direct energy weapons making their way onto trucks and planes and particularly ships and boats. The Office of Naval Research, ONR, has made this a particular focus.

“We can expect that our adversaries will increasingly use UAVs and our expeditionary forces must deal with that rising threat,” Col. William Zamagni said in a June press release announcing ONR’s award in the Ground-Based Air Defense Directed Energy On-the-Move (GBAD) program.

If detection tech can be made small and cheap enough, the ultimate weapon against swarms of cheap drones will likely be… swarms of cheap drones. It’s a research project underway at the Naval Postgraduate School, where in 2012 professor Timothy Chung said he was looking to put together a massive swarm-on-swarm drone challenge. The effort was to culminate in two teams, each with 50 Harpy-style small drones, “duk[ing] it out over the skies of Camp Roberts” in California in 2015.

In the meantime, ideas for the Thunderstorm project are due in on Nov. 26.


Rasmussen Reports

They Told Us: Reviewing Last Week’s Key Polls

Saturday, November 08, 2014

Now America’s in the eye of a hurricane. The storm of Election 2014 is over, and we have a little calm until the new Republican Congress and our lame duck president face each other down beginning in January.

There were 36 Senate races this fall. We projected a GOP takeover of the Senate on Monday. Here’s a state-by-state post-election look at how we did in our Senate surveying.

The Louisiana Senate race is the only remaining question mark. It will be settled by a December 6 runoff because no candidate cleared the 50% mark. We’ll be surveying that race soon.

Where does the president go from here? The Republican takeover of the Senate is seen by many as a repudiation of his policies. There’s no doubt that voters don’t approve of much of what they see out of the White House these days.

Heading into the midterm elections, Obamacare, the president’s major achievement, remained unpopular, with most voters still convinced that it will worsen health care in America.

The president in a press conference the day after the elections signaled that he still intends to grant amnesty to countless illegal immigrants if Congress doesn’t act. Most voters oppose such an amnesty because they want to see stricter border control that ends illegal immigration first.

It doesn’t help the president’s cause that barely half of voters now believe most immigrants to this country come here to work hard, support their family and pursue the American Dream.

The president’s daily job approval rating remains low. His monthly job approval stayed at 47% in October, showing no change from September.

Just 27% of voters think the country is heading in the right direction, but that was before Election Day. It’ll be interesting to see if that number moves noticeably up or down because of the outcome of Tuesday’s voting.

On the eve of the midterm elections, Republicans moved to a 43% to 41% lead on the Generic Congressional Ballot, their highest level of support since November 2012. But the two parties have been separated by just two points or less for most weeks this past year.

With the ink scarcely dry on Tuesday’s ballots, most voters already expect the new Republican majority in Congress to let them down. After all, only four percent (4%) think most politicians keep their campaign promises.

There’s no doubt that Americans are a pessimistic lot these days. Just 40% of voters think U.S. elections are fair.

Americans also elected governors in 36 states on Tuesday. Here's our pre-Election Day assessment and our post-Election Day state-by-state look at how we did in our surveying.

The lopsided Republican gains in both the Senate and governor's races can be explained in large part by turnout. Republicans voted; many Democrats did not. We found out after the election that 56% of GOP voters felt “compelled” to vote in this year’s election, compared to 43% of Democrats.

This election cycle also appears to be, to some degree, a repudiation of negative politicking. Most voters rejected the so-called “war on women” and outlandish racial complaints, known more familiarly as “playing the race card,” viewing them as political ploys more than genuine problems.

Fifty-seven percent (57%) of Americans told us that negative campaign ads actually make them less likely to vote for the candidate who produced the ad.

The economy remains the number one issue on voters’ minds. As of yesterday, just 23% of consumers and 29% of investors rated the economy as good or excellent. Will the new Congress and the president be able to change those long-standing perceptions?

New job figures released by the federal government on Friday show the unemployment rate dropping to 5.8 percent. Our surveying earlier in the week found that 35% of Americans believe the job market is better than it was a year ago, and only 21% think unemployment will be higher in a year’s time. That’s more optimism than we’ve seen in several years.

Americans continue to believe, however, that private sector employees work harder and get paid less than government workers.

An overwhelming majority also say it’s more common for grown children to live with their parents these days, and most think that’s a bad thing for families and the economy.

Meet Rasmussen Reports’ new on-air spokesman, Leon Sculti. 

In other surveys last week:

— Americans say they would choose to be Superman if they could be any superhero, and if they could have one super power, it would be the ability to see into the future.

— Eighty percent (80%) of Americans trust their doctor, unchanged from past surveying. But 32% think most doctors overprescribe drugs to their patients.

— Even though voters still generally regard Medicare as a good deal for working Americans, only a third are willing to pay more in taxes to allow it to continue as is.

— Just over a third of Americans still say they drink alcohol at least once a week. 

— Nearly half of pet owners (49%) own a dog, while 23% own a cat.

— Daylight Saving Time ended last weekend for this year, and many Americans question whether it’s really needed anymore.
