
August 4, 2012

Newswire

 

Legislation to prevent government shutdown will wait until September

The Hill

By Erik Wasson – 07/30/12 10:50 AM ET

 

House Republicans have no plans to move forward this week on a catch-all spending bill to avert a government shutdown, sources confirmed Monday.

Congress will instead leave town for a five-week recess without voting on a continuing resolution or even introducing it, these sources said.

“No CR this week,” one GOP aide said, adding that there is “zero chance” of text appearing this week.

A new spending bill wasn’t included on House Majority Leader Eric Cantor’s (R-Va.) weekly schedule released late Friday and sources said that there are no plans to change that.

Without a continuing resolution, the government would shut down after September 30. Both sides have reason to avoid that, but they won’t get to a vote until closer to the deadline.

After the five-week recess, the House will have only eight legislative days in September in which to complete its work before the election.

Momentum began to build last week for a six-month continuing resolution at current spending levels. The bill would likely adhere roughly to the $1.043 trillion projected spending rate for 2012.

This is $4 billion less than the Budget Control Act top-line number for 2013 but $15 billion more than the level set in the House-passed budget resolution.

The House Appropriations Committee has been crafting its 12 annual bills based on the House budget number, while the Senate has been using the BCA figure. The CR would push a final resolution on the spending level differences and on dozens of policy riders to the spring of 2013.

Conservatives had been pushing for a longer-term CR in order to delay final spending decisions on the 12 separate annual appropriations bills until next year. They are betting on a GOP-controlled Senate and White House being able to make quick cuts to discretionary spending.

 

Senate Majority Leader Harry Reid (D-Nev.) also spoke favorably of a bill that lasts beyond the December lame duck session. Analysts see Reid’s move as a calculation that the CR is best left out of a fight over expiring tax rates and automatic sequester cuts, a fight where Democrats think they have an advantage.

Source:

http://thehill.com/blogs/on-the-money/appropriations/240981-no-action-on-major-spending-bil-before-recess

 

 

 

NSA chief asks hackers at Defcon for help securing cyberspace

NSA Director General Keith B. Alexander called the Defcon attendees the world’s best cybersecurity community


Network World

By Lucian Constantin, IDG News Service

July 28, 2012 01:05 PM ET

 

National Security Agency Director General Keith B. Alexander addressed the attendees of the Defcon hacker conference in Las Vegas on Friday and asked for their help to secure cyberspace.

“This is the world’s best cybersecurity community,” said Gen. Alexander, who also heads the U.S. Cyber Command. “In this room right here is the talent our nation needs to secure cyberspace.”

Hackers can and must be part of a collaborative approach, together with the government and private industry, to secure cyberspace, he said.

Hackers can help educate other people who don’t understand cybersecurity as well as they do, the NSA chief said. “You know that we can protect networks and have civil liberties and privacy; and you can help us get there.”

Gen. Alexander congratulated the organizers of Defcon Kids, an event dedicated to teaching kids how to be white-hat hackers, and described the initiative as superb. He called 11-year-old Defcon Kids co-founder CyFi to the stage and said that training young people like her in cybersecurity is what the U.S. needs.

The NSA director stressed the need for better information sharing between private industry and the government and noted that Congress is currently debating legislation to address this.

NSA’s and U.S. Cyber Command’s roles are to protect the nation from cyberattacks and foreign intelligence, Gen. Alexander said. The issue is that if you don’t see a cyberattack, you can’t defend against it, and at the moment the NSA has no insight into whether Wall Street is going to be attacked, for example, he said.

Gen. Alexander pointed out that if the industry could share some limited pieces of information from their intrusion detection systems in real time, the NSA could take it from there.

The next step from information sharing is jointly developing standards that would help secure critical infrastructure and other sensitive networks, he said.

He encouraged hackers to get involved in the process. “We can sit on the sidelines and let others who don’t understand this space tell us what they’re going to do, or we can help by educating and informing them” of the best ways to go forward.

“That’s the real reason why I came here. To solicit your support,” he said. “You have the talent. You have the expertise.”

At the Aspen Security Forum conference on Thursday, Gen. Alexander revealed that there’s been a 17-fold increase in cyberattacks against U.S. infrastructure between 2009 and 2011, the New York Times reported.

The hacker community has built many of the tools that are needed to protect cyberspace and should continue to build even better ones, he said during his keynote at Defcon. He gave the example of Metasploit and other penetration testing tools.

 

“Sometimes you guys get a bad rap,” he said. “From my perspective, what you’re doing to figure out vulnerabilities in our systems is great. We have to discover and fix those. You guys hold the line,” he said.

Gen. Alexander’s presence at Defcon was a rare event. Before introducing him to the stage, Defcon founder Jeff Moss, who is the chief security officer of ICANN and a member of the U.S. Homeland Security Advisory Council, revealed that he has tried for the past 20 years to get a high-ranking NSA official to speak at the conference.

“Like magic, on our 20th anniversary and NSA’s 60th anniversary it’s all come together,” Moss said. “For me it’s really eye-opening to see the world from their [NSA’s] view.”

 

DoD Sees 4 Sequestration Scenarios


1-Year Budget Deal Would Buy More Time

Defense News

Jul. 29, 2012 – 12:13PM

By MARCUS WEISGERBER

 

Increasingly concerned that time is running out for the U.S. Congress to avoid $500 billion in automatic defense cuts, the Pentagon is assessing all options, including the possible implications of a one-year, $100 billion government-wide, “mini-sequester” deficit-reduction deal, Defense Department and industry sources said.

Obama administration officials are adamant they are not planning for sequestration and continue to urge lawmakers to pass a long-term debt-reduction deal that would avoid automatic cuts entirely, but worries are mounting that lawmakers will be unable to strike a deal to cut spending while increasing revenues before the November elections.

During a closed-door July 23 meeting with two dozen defense CEOs and other executives, Defense Secretary Leon Panetta detailed four possible scenarios facing the department in the latest in a series of high-level meetings between DoD’s top official and industry leaders. He was joined by Frank Kendall, the Pentagon’s acquisition chief; Bob Hale, comptroller; and Brett Lambert, the industrial policy chief.

At the meeting, Panetta said he would rather see a more comprehensive, long-term deficit-reduction deal, but suggested the most politically practical solution may be a short-term agreement of one or two years, according to participants.

The four scenarios they’re considering are:

• Congress does not act and sequestration happens.

• During the lame-duck session of Congress after the November elections, a plan is constructed to thwart sequestration.

• Members of Congress come up with a $1.2 trillion cut to avert sequestration before the election.

 

• Congress inserts language into a continuing resolution that delays sequestration another year or two when there is a less-heated political environment, but the government implements the first and perhaps second year of cuts, which some refer to as the “mini-sequester.”

As in past meetings, Panetta and top executives discussed how sequestration could result in massive job cuts and how other Cabinet agencies should pressure Congress to modify the Budget Control Act of 2011 — the law that mandates sequestration to lower the U.S. deficit — should lawmakers not be able to agree on alternate ways to reduce the country’s debt.

“Panetta is right now the only Cabinet official that is doing anything to try to ward off what is going to be a total disaster for all industries, not just the defense industry,” said one official who attended the meeting.

“Secretary Panetta and industry are 100 percent on the same sheet of music on the fact that everybody’s got to do everything we can to basically put pressure on the Congress to fix the problem,” the official said.

Panetta has repeatedly said full sequestration would be devastating to DoD and the defense industry.

There is a “strong sense” in the Pentagon that a long-term deficit-reduction deal will not happen, another source said.

Republican and Democratic lawmakers are not expected to act on a debt compromise — specifically one that could include tax increases — prior to the elections.

“The department’s position is clear. We want a budget deal — one that’s balanced — so that we can avoid the devastating consequences of sequestration,” Pentagon Press Secretary George Little said. “The stakes are too high for a nation facing a serious fiscal crisis.”

Sequestration calls for $1.2 trillion in mandated cuts across the federal government over the next decade. The Pentagon’s share of those cuts is about $500 billion.

The 2013 share of the 10-year sequestration cuts totals about $109 billion across the entire federal government.

“Clearly it would have less impact than a trillion-dollar hit, [should] sequestration be fully implemented,” said Gen. Norton Schwartz, Air Force chief of staff, during a July 26 taping of “This Week in Defense News” when asked about the impact of a one-year deal.

“We’re not privy to those conversations and if our target is an additional $100 billion, we will make choices based on the priority of our capabilities, those that are in demand from our combatant commanders, and so on,” Schwartz said. “The only thing that is very important is maintaining balance — balance between readiness of a smaller force and making sure that we don’t completely mortgage the future.”

The federal government has faced numerous shutdown threats over the past year as lawmakers in both chambers of Congress have been unable to agree on a plan to lower the federal deficit.

Since November, Pentagon officials have been frustrated with the lack of urgency Congress has shown to come up with a plan to avert sequestration. Panetta has routinely called for Congress to put everything on the table to address the debt, including defense and domestic spending, entitlements and revenues.

For months, DoD officials have insisted they are not planning for sequestration and would not start planning until directed to do so by the White House Office of Management and Budget.

People close to Panetta said he is frustrated that the Pentagon has been forced to spend so much time thinking about sequestration, especially as U.S. military operations around the world have intensified in recent months.

 

The Pentagon has been closely monitoring fighting in Syria, Iran’s continued pursuit of nuclear weapons and North Korean militaristic threats.

 

Lawmaker estimates 89,000 Defense civilians would lose jobs under sequestration

 

By Amanda Palleschi

July 27, 2012

 

A House Republican fears the loss of more than 100,000 civilian Defense Department jobs if sequestration takes effect and if a Senate committee proposal becomes law.

Rep. Randy Forbes, R-Va., chairman of the House Armed Services subcommittee on readiness, estimated this week that 89,000 Defense civilian jobs could be eliminated if across-the-board budget cuts take effect in January. He based the estimate on the assumption of an 11.3 percent cut to the department’s accounts, excluding military personnel.

Forbes expressed concerns that the department is not sufficiently prepared for this large reduction to its civilian workforce and is not adequately planning for sequestration, the automatic cuts mandated by the 2011 Budget Control Act, if Congress cannot reach an agreement on a long-term plan to significantly reduce the deficit.

“Nobody wants it to happen, most especially not me,” Forbes said at a subcommittee hearing this week. “We have been talking about it for a while, but it appears there is little to no planning associated with this legislative mandate.”

Frederick Vollrath, principal deputy assistant secretary of defense for readiness and force management, told the panel that mass layoffs are not imminent. Congress would get at least 45 days’ notice before any cuts, and affected employees then would receive 60 days’ notice. Also, before reduction-in-force decisions are made, the Pentagon would have to review any cuts imposed by sequestration, and that could take several months, he said. No such analysis is currently under way. Vollrath told the panel he was not aware of whether the department had plans to conduct a full analysis of civilian force reductions in the event of sequestration, and said any decision to analyze cuts would have to start with the Defense secretary.

“We recognize that we operate in a dynamic and changing environment and therefore must retain the flexibility to adapt our workforces accordingly,” he said. “This includes incentivizing early retirements, strategic pauses in hiring and normal attrition.”

Forbes called the department’s lack of a pre-emptive analysis “baffling,” since the Government Accountability Office has suggested that step is crucial before cuts take place.

The Senate Armed Services Committee-passed version of the fiscal 2013 Defense authorization bill also recently included language that would cut funding for civilian personnel by approximately 5 percent. Combined with potential cuts under sequestration, Forbes said, more than 128,000 positions could be eliminated. Forbes also said Congress recently was notified that Defense extended its civilian personnel cap from its 2010 levels into fiscal 2018.

 

http://www.govexec.com/defense/2012/07/lawmaker-estimates-89000-defense-civilians-would-lose-jobs-under-sequestration/57066/

 

Feds to mobile users: Drop dead

CNET.com

by Larry Downes

July 30, 2012 11:43 AM PDT

WASHINGTON–The federal government is slinking away from a promise by President Obama to free up badly-needed radio spectrum for mobile users and the already over-taxed networks that serve them.

Just months after the publication of the National Broadband Plan in early 2010, the president issued a memorandum ordering the FCC and the Commerce Department’s National Telecommunications and Information Administration to “make available a total of 500 MHz of Federal and nonfederal spectrum over the next 10 years” for mobile users.

The goal was to clear unused or underutilized spectrum the FCC could then auction off for use by an exploding mobile ecosystem.

Yet two years later, despite frequent assurances that progress is being made to head off a “spectrum crisis,” almost no new frequencies have been offered for commercial use by either agency.

Now a report (PDF) published last week by the President’s Council of Advisors on Science and Technology urges the president to give up trying. PCAST, which describes itself as “an advisory group of the nation’s leading scientists and engineers” appointed by the president, recommends the White House issue a new memorandum that would dramatically reverse course, offering no new spectrum, now or ever, from government stockpiles.

Instead, PCAST is calling on the president to begin a long and technically-uncertain transition to a world in which usable radio frequencies would be shared by multiple users simultaneously. “The essential element of this new federal spectrum architecture is that the norm for spectrum use should be sharing, not exclusivity,” the report concludes.

Adoption of this radical new approach to spectrum management would utterly change a system that has been in place since the sinking of the Titanic, and which has in recent decades generated billions of dollars through license auctions. The new model would, if adopted, have a massive impact on technological development, requiring, according to the report, “a corresponding shift in the architecture of future radio systems that use it.”

As PCAST admits, implementation of large-scale shared spectrum would take years, add multiple layers of new federal bureaucracy, dramatically expand the role of the White House in on-going oversight of the airwaves, and require extensive new FCC regulations and enforcement for receiving devices such as smartphones.

At best, the PCAST plan is both technically and politically infeasible. At worst, it represents the latest step in a carefully choreographed campaign by federal authorities to shift attention away from their continued stonewalling on freeing up unused or underutilized spectrum.

Over the last two years, spectrum “sharing” has become code among federal authorities to stall for more time. The feds are playing a shell game, appearing to offer spectrum as ordered by the president but in fact offering nothing. And mobile users are already running out of both time and capacity.

A spectrum crisis looms
PCAST imagines a world in which government users would have primary access to spectrum currently assigned to federal applications, but where consumers could use those frequencies at times and in places where the government isn’t actively communicating. While PCAST’s vision of a shared “spectrum access system” is admirable, the kind of real-time spectrum reallocation it calls for is simply not cost-effective or in many cases not even possible given current technology.

The report itself acknowledges that PCAST’s approach “represents a major evolution of existing spectrum management practices,” and that “implementing it will not be easy and may take a long time.” Its most optimistic prediction, which seems hopelessly naive about the actual pace of government change even on minor policies, is that putting the plan into operation would take three to ten years.

That’s in part because there are no technologies in large-scale use, for example, that would allow mobile devices such as smartphones to dynamically change frequencies when users with higher priority need access, especially when the secondary user is in constant motion. According to Mary Brown, Cisco’s director of government affairs, “no one in the world has ever tried to have a mobile technology share with some kind of federal system.”

Meanwhile, “mobile networks in major cities are running close to capacity during peak periods already.” That’s according to Richard Bennett, a senior fellow with Information Technology and Innovation Foundation, a D.C.-based think tank. Barely a blip a few years ago, mobile broadband is growing at an astronomical pace. According to a report earlier this year (PDF) from the White House Council of Economic Advisers, mobile data traffic will increase another twenty-fold between 2010 and 2015.

Some of that capacity may come from more efficient use of existing network infrastructure and by increasing tower deployments and making use of smaller cell sites. “But much of it depends on more spectrum,” Bennett said.

The FCC agrees, emphasizing its longstanding commitment to clearing and reallocating underutilized frequencies and auctioning them to those who would put them to higher and better uses. In response to the PCAST report, FCC Chairman Julius Genachowski told the group (PDF) that while spectrum sharing was a promising tool to improve the system, “Historically, our basic strategy has been to clear spectrum and reallocate it. This is a strategy that has delivered tremendous benefits for America.”

Recently-confirmed Commissioners Jessica Rosenworcel and Ajit Pai have both called for more investment in sharing technologies, and quickly. According to Rosenworcel (PDF), “We must put American know-how to work and create incentives to invest in technologies–geographic, temporal, and cognitive–that multiply the capacity of our airwaves.”

Regardless of the solution, time is of the essence. The 2010 NBP now seems conservative in its estimate that the insatiable demand of consumers for mobile broadband applications would quickly overtax existing network capacity.

According to the plan, 300 MHz of added spectrum would be required by 2015, and 500 MHz by 2020. But different bands of spectrum have different properties, and not all bands are useful for mobile applications. Frequencies that are usable for mobile services are nearly all licensed for other uses. That means the only way to get more spectrum is to reallocate it from existing licensees.

The NBP identified two particularly promising candidates for reallocation. One is over-the-air television broadcast, in a steep decline for decades. Today, less than 20 percent of U.S. homes rely on broadcast, receiving programming instead from cable, satellite, and traditional telephone service providers. Increasingly, video is migrating to “over the top” Internet services, including Netflix, YouTube, and Hulu.

Earlier this year, Congress authorized the FCC to work with broadcasters who don’t need some or even all of their existing allocations to conduct a new round of auctions that would license that spectrum to mobile broadband providers, generating an estimated $30 billion in auction revenue (PDF) to be shared by the broadcasters and the federal government.

Assuming the broadcasters cooperate, these auctions could go far toward closing the spectrum deficit identified by the NBP. But given the complexity of corralling the TV stations, these auctions won’t result in new capacity for mobile users for several years at least — long after the FCC predicts mobile networks will hit the wall.

The federal promise revoked
That’s why so much hope is riding on the second potential source of reallocation — the federal government itself. Today, federal agencies are the largest holders of spectrum, much of it in bands that are extremely useful for mobile services. And a great deal of that spectrum lies fallow, or is used only occasionally, often for systems that no longer require dedicated government use.

That was the view of the White House, in any case, in 2010, when President Obama ordered federal agencies to identify as much as 500 MHz of federally-assigned frequencies that could be auctioned to mobile providers. The 2010 Memorandum ordered the NTIA to coordinate the federal efforts to release spectrum. (While the FCC manages spectrum licensed to commercial users, NTIA oversees spectrum assigned to the federal government.)

In March of this year, NTIA finally published its first detailed proposal for clearing unused or underutilized spectrum held by federal agencies.

The NTIA report began optimistically, noting that twenty federal agencies had identified almost 100 MHz of contiguous, usable frequencies that could be cleared over the next 10 years. Federal users who still need the spectrum they have would be relocated to other frequencies that have less value to commercial applications.

But reading between the lines, the report noted agency estimates of relocation costs exceeded $18 billion, more than the amount likely to be raised by FCC auctions. Under current law, the FCC cannot hold auctions for spectrum currently licensed to government users unless it can raise more money than the costs of relocation. So the frequencies won’t be cleared after all.

The lack of any details or analysis by NTIA, and the fact that the estimates were developed solely by the agencies themselves, led many here in Washington to suspect that bureaucrats were sandbagging, developing estimates in a way that ensured they would not have to relocate.

In no cases did the agencies identify systems that were or were likely to become obsolete in the near or long term, or which could be modified to require less spectrum. For example, multiple agencies operate separate video surveillance networks, many of them using outdated analog technologies. The Departments of Defense and Justice estimated that relocating their video surveillance, which by itself consumes 130 MHz of spectrum, would cost over $5 billion.

Nothing in the NTIA report suggested that the less-sensitive of these systems could be migrated to commercial providers, or updated to digital technology. The latter solution, according to ITIF’s Bennett, could free up as much as 75 percent of the currently-assigned spectrum.

Against this obvious stalling, the NTIA offered the consolation prize of sharing the spectrum between federal and commercial users during the transition, or in some cases on a permanent basis.

The proposed sharing arrangements, however, evaporated on closer reading. The agencies demanded, for example, that any commercial use be made secondary to unidentified federal requirements, with “clear regulatory mechanisms” to enforce their priority. According to the report, sharing would also require “acknowledgement by industry of its status with respect to potential interference from federal operations.” Federal users, in other words, would be free to define the limits of acceptable interference, including in ways that make cost-effective sharing impossible.

Realistic limits of sharing today
It is true that some sharing between government and commercial users may be possible in the short term. A number of defense systems, for example, use their frequencies to communicate with satellite systems on a regular schedule — perhaps only once a day or even once a month. So long as commercial networks stayed off those frequencies at specific times, these otherwise unused frequencies could be put into commercial use without relocating the federal systems.

In other cases, federal use is limited to certain locations, and commercial networks operating outside those areas could make full use of the frequencies without interfering.

This kind of geolocation sharing, as it is known, is already in use in many parts of the spectrum, and the FCC is now testing expanded forms of sharing in the white spaces between television channels.

The key to success for geolocation sharing is the creation and accurate maintenance of databases that specify where and when the primary user is operating. While building and maintaining such databases is already a significant obstacle for commercial sharing, as in the white spaces, many federal systems are classified. It is not clear if federal users will be able to provide the necessary information to make the databases usable.
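
To make the idea concrete, here is a minimal, hypothetical sketch (in Python) of the kind of lookup such a database supports. Every name, band, site and schedule below is invented for illustration; this is not any real FCC or NTIA system.

```python
# Hypothetical sketch of a geolocation-sharing database lookup (illustration only).
from dataclasses import dataclass
from datetime import datetime

@dataclass
class PrimaryUse:
    band_mhz: tuple       # (low, high) edges of the protected federal band
    center: tuple         # (latitude, longitude) of the protected site
    radius_km: float      # exclusion radius around the site
    active_hours: range   # hours of the day when the primary user transmits

def distance_km(a, b):
    # Crude flat-earth approximation; good enough for a toy example.
    return (((a[0] - b[0]) * 111.0) ** 2 + ((a[1] - b[1]) * 85.0) ** 2) ** 0.5

def may_transmit(db, band_mhz, where, when: datetime) -> bool:
    """True if a secondary (commercial) user may use band_mhz at this place and time."""
    for use in db:
        overlaps = not (band_mhz[1] <= use.band_mhz[0] or band_mhz[0] >= use.band_mhz[1])
        nearby = distance_km(where, use.center) < use.radius_km
        active = when.hour in use.active_hours
        if overlaps and nearby and active:
            return False  # the primary user has priority here and now
    return True

# Example: a made-up federal satellite uplink near Washington, D.C. that
# transmits only between 02:00 and 03:00 UTC each day.
db = [PrimaryUse((1755, 1780), (38.9, -77.0), radius_km=50.0, active_hours=range(2, 3))]
print(may_transmit(db, (1755, 1760), (38.9, -77.0), datetime(2012, 8, 1, 14, 0)))  # True
print(may_transmit(db, (1755, 1760), (38.9, -77.0), datetime(2012, 8, 1, 2, 30)))  # False
```

The point of the sketch is simply that a secondary user consults the database for the band, place and time before transmitting; the accuracy (and, for classified federal systems, the availability) of those entries is the crux of the approach.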

Reading between the lines of the NTIA report, PCAST correctly concluded that it was unlikely the federal government would actually cough up any significant amount of spectrum for mobile users in the timetable ordered by the White House. But instead of calling out the agencies for obvious dawdling, PCAST doubled down, proposing an even more radical reliance on sharing technologies that are either untested or, in many cases, non-existent.

The more promising forms of sharing emphasized in the PCAST report rely not just on databases but on intelligence built into transmitting and receiving devices. These include the use of “cognitive radios” that can sense when activity is already present on a given frequency, and “dynamic spectrum access,” which can rapidly switch between frequencies as interference reaches beyond pre-determined thresholds.

The report hopes such technologies “will likely mature and become viable during the implementation of these recommendations.” If so, their use could “multiply the effective capacity of spectrum by a factor of 1,000,” which the authors argue is a “conservative estimate.” But the report, here and elsewhere, offers no concrete data to support its wild claims. Nor could it.

While PCAST is to be commended for its vision of future technologies and their ability to make dramatic improvements in the efficiency of spectrum allocation, use, and minimization of interference, the kinds of technologies PCAST foresees simply don’t exist yet. We don’t know when they will come or how well they will work. There is far too much uncertainty in technologies that “will likely mature” in the next decade to justify a complete rewrite of spectrum policy.

A call for presidential leadership
PCAST admits that their proposal “represents a major evolution of existing spectrum management practice,” and that “implementing it will not be easy and may take a long time.” In the best-case scenario, all the authors can hope for is that “the long term direction outlined in this report can start to be operational in one to three years,” with actual implementation of the new sharing model taking three to ten years.

Even that estimate requires wishing away the practical realities of reorienting the federal government to scrap a spectrum management system that has been in place for 100 years. Given Washington’s profound inertia, even minor changes in policy can take years to implement.

Even if it were technically feasible, the certain delays in implementing this policy would stretch the PCAST time frame out by years if not decades. That’s time we don’t have. Mobile users will take little comfort from the grand ambitions of the president’s advisors when their mobile devices can’t perform even basic functions because the limited spectrum available to them has become saturated.

President Obama was quick to respond to the National Broadband Plan’s conservative predictions of a mobile doomsday. He ordered federal agencies to start immediately the process of releasing spectrum that could be put to far more valuable use by mobile consumers. But two years later, nothing has happened. Government bureaucrats and their supporters have brilliantly thwarted any real prospect for federal spectrum reform at every turn.

Meanwhile, PCAST and the NTIA are fiddling while mobile Rome burns. And the White House is simply watching from the sidelines.

For the sake of our vibrant mobile ecosystem — one of the few bright spots in our languishing economy — we need real leadership from the White House. We need a practical plan for retiring outdated federal systems, and honest estimates of the costs of relocating those that are still necessary. We need sharing arrangements that can work with today’s technology. And we need it now.

What we don’t need are more 200-page reports touting unrealistic solutions, new regulations, and still more layers of bureaucracy. The president’s 2010 memorandum promised to “unleash the wireless broadband revolution” by freeing up urgently needed spectrum for mobile users. But instead of innovation in spectrum management, all we’ve seen so far is the traditional genius of the federal civil service for resisting change. Mr. President, it’s well past time to kick some butt.

 

CNNMoney

 

Drought strains U.S. oil production

By Steve Hargreaves @CNNMoney July 31, 2012: 4:55 AM ET

NEW YORK (CNNMoney) — One of the worst droughts in U.S. history is hampering oil production, pitting farmers against oilmen and highlighting just how dependent on water modern U.S. energy development has become.

Over 60% of the nation is in some form of drought. Areas affected include West Texas, North Dakota, Kansas, Colorado and Pennsylvania, all of which are part of the recent boom in North American energy production.

That boom has been made possible partly by hydraulic fracturing. Known as fracking for short, the controversial practice gets oil and natural gas to flow by cracking shale rock with sand, chemicals, pressure and water.

Lots of water. Each shale well takes between 2 million and 12 million gallons of water to frack. That’s up to 18 Olympic-sized swimming pools’ worth of water per well.
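
As a rough check of that comparison (the pool volume is an assumption on my part: a standard Olympic pool holds about 2,500 cubic meters, or roughly 660,000 gallons), the high end of the range works out to about 18 pools:

\[
\frac{12{,}000{,}000\ \text{gallons per well}}{660{,}000\ \text{gallons per pool}} \approx 18\ \text{pools}
\]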

“We’re having difficulty acquiring water,” said Chris Faulkner, CEO of Breitling Oil and Gas, an oil company with operations in many of the new shale regions including Bakken in North Dakota and Marcellus in Pennsylvania.

Faulkner said officials in two Pennsylvania counties have stopped issuing permits for oil companies to draw water from rivers, forcing them to go further afield to obtain the crucial resource.

In Kansas, he said much of the industry’s water comes from wells owned by farmers. Farmers used to sell him water for 35 cents a barrel. Now, he said, they are turning down offers of 75 cents or more.

As a result, between 10% and 12% of the wells Breitling planned on fracking have been put on hold.

“As the drought continues, those numbers will rise,” said Faulkner.

Similar problems are happening at companies industry-wide, said Neal Dingmann, an analyst at SunTrust Robinson Humphrey in Houston who covers many of the smaller and mid-sized companies that operate in the new shale plays.

Dingmann said he expects to see maybe a 5% reduction in new wells by the companies he covers.

Those numbers aren’t expected to have a meaningful impact on oil or gasoline prices.

 

Oil from shale rock is just a small portion of overall U.S. oil production, which in turn contributes just a fraction to global oil supplies, the main determinant in prices.

Plus, the drought is not expected to continue forever.

But shale oil is playing an important part in new U.S. supply growth, and the drought illustrates how vulnerable that production is to disruptions in the availability of water.


Other segments of the energy chain are also being hit by the drought.

Much of the country’s natural gas is produced by fracking. Unlike oil, natural gas can’t be easily shipped around the world, so its price is more closely tied to local conditions.

Natural gas prices have surged some 70% over the last couple of months. The runup is mostly due to increased demand for air conditioning during the heat wave and a switch to natural gas from coal by many utilities. But at least one analyst puts part of the blame on drought-induced production problems.

“Another rally in natural gas as drought concerns may lead to a cessation of non-conventional shale production,” Stephen Schork, an energy trader and publisher of the industry newsletter the Schork Report, wrote in a note last week.

Corn-based ethanol prices have jumped roughly 30% since the start of June, in step with corn prices that have reached record highs. Ethanol makes up about 10% of a gallon of gas in most parts of the country.

The recent rise in gasoline prices has more to do with rising oil prices, which are being driven by the standoff with Iran and hopes for looser monetary policy rather than by drought conditions in the United States. But the higher ethanol prices are probably playing a small part, said Brian Milne, refined fuels editor at the information provider DTN.
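
A rough, back-of-the-envelope calculation shows why that part is small. Assuming (my assumption, not the article’s) that ethanol costs roughly the same per gallon as the rest of the blend, a 30 percent rise in a component that makes up about 10 percent of each gallon moves the pump price by only a few percent:

\[
\Delta P_{\text{pump}} \approx 0.10 \times 0.30 \times P_{\text{pump}} = 0.03\, P_{\text{pump}}
\]

That is, roughly a 3 percent increase attributable to ethanol, compared with the much larger swings driven by crude oil prices.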

Another water-dependent link in the nation’s energy supply chain is transport — specifically, barges on the nation’s canals and rivers.

Andrew Lebow, a broker at Jefferies Bache in New York, said people are concerned that some energy terminals will have a hard time getting supplies if low water levels make routes impassable.

“I don’t think the impact will be widespread,” said Lebow. “But you could see prices rise in some areas.”

 

 

Sense-And-Avoid System To Transition To Global Hawk

By Graham Warwick

Source: Aerospace Daily & Defense Report

July 30, 2012

 

An airborne sense-and-avoid (ABSAA) system enabling the Northrop Grumman RQ-4 Global Hawk to fly in civil airspace will transition to full-scale development at the end of September after completion of a final round of research test flights.

The multi-sensor system has been in development at the U.S. Air Force Research Laboratory (AFRL) since 2008 under the Multiple Intruder Autonomous Avoidance (MIAA) science and technology program.

Flights from Niagara Falls, N.Y., in August/September will mark the final phase of tests using a Calspan-operated Learjet equipped with the MIAA sensors and algorithms and acting as a surrogate for the unmanned Global Hawk.

The flights will evaluate collision-avoidance algorithms and a new electronically scanned sense-and-avoid radar, as well as a new technique to perform passive target ranging from the two-dimensional imagery provided by electro-optical sensors.

“When the flight tests conclude, MIAA will transition to EMD [engineering and manufacturing development] for the Global Hawk program,” says Capt. Matt Schneider, AFRL program manager for sense-and-avoid technology.

A system based on MIAA technology is expected to fly on the Global Hawk in 2015, with initial operating capability planned for 2017, says Paul Schaeffer, airborne sense-and-avoid program manager for the Global Hawk program office.

The Air Force is leading development, but the Navy and Army are both partners in the ABSAA effort. “The Navy’s desire is to put the system on their aircraft [the MQ-4C Triton], while the Army’s interest is in controls and displays commonality with its ground-based sense-and-avoid system,” Schaeffer says.

The goal of MIAA is to enable unmanned aircraft to autonomously detect and avoid both cooperative and noncooperative intruders, responding within minutes to longer-range threats to maintain safe separation, and within seconds to short-range threats to avoid collisions.

The system uses electro-optical sensors and radar to detect and track noncooperative targets such as general-aviation aircraft without transponders. The traffic collision avoidance system and ADS-B (automatic dependent surveillance-broadcast) are used to detect and track cooperative aircraft that carry transponders.

 

 

Experts: Cloud Brings Vulnerabilities

Defense News

Jul. 30, 2012 – 08:51AM

By ZACHARY FRYER-BIGGS

The cloud, a generic term for a wide range of hosted data systems, is hot.

Rarely does a shift in information technology, a realm largely left to technical experts working behind the scenes, become the subject of congressional debates, legislation and speeches from high-profile Pentagon officials.

The allure of the cloud, an architecture that could cut costs, improve security and allow greater data integration, all in an environment of fiscal austerity, is strong.

But experts say that in the rush to realize those potential advantages, insufficient thought is being given to the security risks.

“I think there’s such a rush to the cloud, that we’re overlooking a lot of details,” said Jeff Moulton, a researcher at the Georgia Tech Research Institute. “The devil’s in the details.”

Some of the very assumptions that have been the building blocks of cloud advocacy may be incorrect, most notably the assertion that the cloud will improve security.

 

Yet, even with the issues facing the basic structure of cloud computing, what scares experts most is the recent push toward the use of commercial cloud providers, a cost-effective move that gives the Defense Department even less control over how its data is handled.

“You’re never going to get the visibility you need negotiated into a contract somehow, because the commercial provider would never want the customer to have that level of scrutiny into their internal operations,” said Richard Bejtlich, chief security officer at the cyber incident response firm Mandiant. “It would just be way too disruptive.”

Pentagon spokesmen did not respond to multiple requests for comment on the experts’ concerns.

The argument in favor of private cloud systems rests on three assertions about how the architecture could improve DoD systems:

• The cloud is more secure than less consolidated data systems.

• The cloud will require fewer talented cyber experts to protect.

• The cloud can save the department large sums of money through fewer hardware requirements and more efficient operation.

The third argument, that the cloud would save money, is widely recognized and accepted by experts, although the magnitude is disputed. The other two, however, are the subject of heated debate.

“There are specific vulnerabilities associated with cloud architecture that, as far as I can tell, have not been fully and adequately addressed,” said Moulton, who previously served in the U.S. Air Force doing special operations communications.

The simplest and most frequently cited argument against the assertion that the cloud is more secure is the risk of centralization. DoD networks are still largely fragmented, which can make information sharing difficult. But that fragmentation means no individual breach would compromise the larger data mass.

“When there is no centralized control of all those systems, there is no central place to [get] access to everything else,” Bejtlich said. “Is it better to have everyone decide how to deploy their systems independently, or is it better to have one super-image that we believe contains the best security posture?

 

“With the former, the bad guy who gets onto the system or is trying to get onto the system doesn’t necessarily know what the victim is running. With the latter, he knows exactly what they’re running, and he can tailor his research efforts to that.”

The complaint about the fragmented approach has been that maintaining decent security at each individual outpost was both expensive and difficult. By consolidating systems, DoD could be more confident that its systems are properly designed.

But with cloud architecture, even if the protection is better, once an attacker is in, the loss is much worse.

“You’re putting all of your eggs in the same basket,” Moulton said.

Because of the added risk, the exterior defenses and network monitoring need to be even better to guard a more valuable system, probably meaning as many experts as are employed across networks now, Moulton said. And because of the lack of expertise in cloud architecture, building and protecting cloud systems could be far more expensive than has been predicted, he said.

“There’s the rush to this, and everyone thinks they’re going to save so much money and manpower,” he said. “I don’t agree with that broad assumption.”

 

Still, some of the assumptions that the experts questioned have gained enough traction to appear on Capitol Hill as well, with the added complication of including the commercial marketplace.

The 2012 National Defense Authorization Act included instructions that DoD develop a strategy that would allow the “migration of defense data and government-provided services from department-owned and operated data centers to cloud computing services generally available within the private sector that provide a better capability at a lower cost with the same or greater degree of security,” a recent DoD report noted.

The use of private clouds, controlled by DoD, faces largely technical problems surrounding the implementation of effective security measures. Commercial cloud offerings face the added issue of trust.

“You have to have a ton of trust because you’re basically turning over your crown jewels to a service provider that may not be in business next week,” Moulton said.

To make matters worse, the security requirements for commercial cloud providers have been “extremely watered down,” said Tom Conway, director of federal business development at McAfee.

McAfee has also found that attacks on commercial cloud systems are growing, particularly in the area of criminal activity surrounding banking. In a recently released report about an effort the company called Operation High Roller, McAfee found evidence of increasingly sophisticated efforts to access banking data stored remotely.

Attacks such as those found by Operation High Roller are an indication of things to come, Conway said.

“Everything that happens in commercial eventually seems to make its way into the government space,” he said.

 

Attacks on the cloud are following an arc similar to those targeting Apple computer systems. For decades, enthusiasts insisted that Apple products were immune to viruses. Many users failed to use adequate security measures based on this assumption.

In reality, experts say, the systems weren’t attacked as frequently not because the systems were impervious but because the percentage of computer users who owned Apple systems was very small. As a simple economic argument, designing a bug for a Mac wasn’t as cost-effective as designing one for the much larger PC market. But as Apple computers grew in popularity, the economics shifted.

In recent years, a slew of attacks have targeted Apple products, including the Flashback Trojan that infected more than half a million systems. The lack of security awareness among users likely makes the rise in attacks even more dangerous.

For the same reasons, experts said, few organized attacks have been noted against cloud systems. The logic remains that, while cloud implementation remains small and important data remains locally stored, the value proposition is not correct for attacking the cloud. The lack of previous attacks is likely creating a false sense of security, they said. But with the movement of bank data to cloud systems, McAfee’s research shows that attackers will adapt.

And not only will attacks likely make their way into government space; the use of commercial providers by DoD is also likely to catch on. In its cloud computing strategy released in July, DoD Chief Information Officer Teresa Takai cited the planned use of commercial cloud services as a major component of the updated strategy.

“The DoD Cloud Computing Strategy has been expanded to address the use of commercial cloud services in the department’s multi-provider enterprise cloud environment,” she wrote in a letter included in the report.

 

The report itself emphasizes the use of commercial providers.

“The department will leverage commercially offered cloud services that offer the same or a greater level of protection necessary for DoD mission and information assets,” it said.

While the move to the cloud may cause problems, the in-between steps where a hybrid of cloud systems and local systems are used could be even worse, Conway said.

“I’m worried about the interim,” he said. “Right now, as bad as it is, at least it’s sort of understood how bad it is. We’re not really sure what cloud is going to be, and in between, we’re going to have a hybrid environment for an extended period of time where we’re going to the worst of both worlds.”

 

Microsoft 10-K confirms Surface will ship on Oct. 26

Company stays mum on pricing plans for Microsoft’s first tablet, which will run Windows 8 or Windows RT

Computerworld

By Matt Hamblen

July 30, 2012 10:22 AM ET

 

Computerworld – Microsoft Surface tablets will go on sale on Oct. 26, the same day that Windows 8 becomes available, the company disclosed in a 10-K report filed on July 26 with the Securities and Exchange Commission.

Pricing for the Surface tablets, which run Windows 8 or Windows RT, wasn’t disclosed.

Microsoft didn’t respond to a request for comment on the 10-K report or its tablet pricing plans on Monday.

In the 10-K filing, Microsoft said, “The next version of our operating system, Windows 8, will be generally available on October 26, 2012. At that time, we will begin selling the Surface, a series of Microsoft-designed and manufactured hardware devices.”

Microsoft had previously announced via a blog post that Oct. 26 would be the launch date for Windows 8. The brief blog post, however, didn’t specifically refer to the company’s shipping plan for the Surface tablets.

In June, when Microsoft showed the 10.6-in. Surface tablets for the first time at a Hollywood press event, company officials said that ARM-based versions of Surface for Windows RT would come out first — at about the time of the general release of Windows 8.

The Surface Windows 8 Pro tablet, which runs an Intel chip, would launch about three months after the ARM tablets, Microsoft said during the unveiling.

At the time, Microsoft said the Windows RT-based Surface tablets would be priced about the same as comparable tablets, while the Surface Windows 8 Pro would be priced about the same as ultrabooks.

Microsoft on Monday separately announced new keyboards and mice that are optimized for the Windows 8 OS.

In the 10-K filing, Microsoft also acknowledged that it risks alienating its computer manufacturing partners with the launch of its Surface devices.

 

Windows 8 Security Garners Praise at Black Hat

eWeek.com

By: Sean M. Kerner

2012-07-26

For years, hackers have had an easy time exploiting the Windows heap memory manager. However, all that is going to change with Windows 8, according to new research released at Black Hat.

LAS VEGAS — Microsoft’s Windows 8 is not yet generally available—the operating system officially lands on store shelves Oct. 26—but that’s not stopping security researchers from trying to find flaws in the OS. However, hackers who have had an easy time with Windows might find some new security features hard to beat.

For years, Windows has come under frequent attacks, thanks to hackers exploiting the operating system’s heap memory manager. However, according to a new report released at the Black Hat conference here, Microsoft’s engineers have done an admirable job of defending memory in Windows 8, making it more difficult than ever before for attackers to exploit.

The report is called “Windows 8 Heap Internals.”

Chris Valasek, a senior security researcher at Coverity and co-author of the research, explained the memory heap is a critical component of Windows. The memory manager is what tells applications that memory is or isn’t available for use.

“As a security researcher, you want to look into how the memory is managed and see if there is the potential for a buffer overflow or some kind of exploit that could alter what the heap memory manager is supposed to do,” said Valasek.

Heap memory exploitation attacks have been around for over a decade, and Valasek wanted to see what had changed in Windows 8. As it turns out, Valasek and his research partner Tarjei Mandt were not able to find as many deficiencies in Windows 8 memory. The two researchers believe that Microsoft has made a giant leap forward in heap memory security with Windows 8.

“While they have really stopped all the present exploitation techniques out there, they did have to introduce new code and data structures,” said Valasek. “That new code, under certain conditions, is susceptible to attack, as previous versions of Windows were.”

Those conditions assume that an attacker has access to a software vulnerability that leads to some kind of writing outside of the bounds of the piece of memory that the application is supposed to use.

“The deficiencies aren’t quite corner cases, but they rely on a lengthy set of preconditions to occur,” said Valasek. “That being said, things like this aren’t impossible; it just so happens that you need to have a certain few things in place for your overflow for an attack to work.”

In Windows 8, Microsoft engineers added a number of new prerequisites to the heap memory manager, making it more difficult to exploit.

“In previous versions of Windows you could, say, allocate a bunch of memory, get a buffer overflow, and the basic attack technique would work,” said Valasek.

With Windows 8, in contrast, Valasek said you would have to allocate some memory, but not too much, then ensure that a few certain things don’t occur, and then run the overflow.

“Windows 8 definitely raises the bar and makes things harder to exploit,” said Valasek. “At the same time, it’s not entirely impossible.”

In previous versions of Windows, Microsoft attempted to mitigate memory overflows with Data Execution Prevention (DEP) and Address Space Layout Randomization (ASLR). While those techniques are helpful, Windows 8 goes a step further.

“The nice thing about the Windows 8 heap mitigations is they don’t have to be enabled by the developer and they are on by default,” said Valasek. “With ASLR and DEP, you have to compile your code to make sure they are enabled for your application.”

Valasek and Mandt looked at preview versions of Windows 8 for their research. Valasek said it’s possible that additional mitigation could make Windows 8 heap memory even more secure.

“I’ve written a lot of heap exploits in my day and I wouldn’t want to be tasked with writing one for Windows 8 right now as there are a lot of hurdles you have to cross,” said Valasek. “It will take people a lot of time with a lot of skill to exploit.”

 

Microsoft’s new Outlook mail is a welcome Hotmail replacement

MSN.com

Wilson Rothman

If you’re one of the millions of people who are embarrassed to admit that you still have a Hotmail address, your day has come. Seriously, you can migrate to Microsoft’s impressive new Outlook email service, and start using its smart services to merge your contacts with Facebook and clean out all the newsletter spam that has built up over the last 1,000 years.

OK, OK, yes, you’re saying … “Outlook? But I already have Outlook. It’s an app. On my computer.” Well, you can blame Microsoft for confusing things by using the name of their popular email and calendar client as the name of their new service that offers … email and calendar. But don’t let it stress you out. At least you can put your Outlook email service into your Outlook app!

As a Gmail, Hotmail and erstwhile Yahoo mail user, I am sufficiently impressed by what Microsoft has cooked up here, a mail service that will soon mercifully replace the aging Hotmail as the default webmail for Microsoft’s Live service. It’s called Outlook, and that means you can get an @outlook.com address (today, in fact). You can keep your old Hotmail address too, and even add other aliases — completely separate email addresses that route to certain folders — to hand out to retailers and other would-be spammers.

Some of the top features are available in the current Hotmail and in comparable form on Gmail and other email services, but what makes this whole thing so nice is its interface, and how so much of the smart stuff happens automatically, or at least semi-automatically.


Wilson Rothman / NBC News

Mail is automatically categorized, but you can add other incoming notes to categories, too.

Right after you migrate your Hotmail account to Outlook, all those machine-generated newsletters, shipping updates and social updates get readily categorized, for your filing or deletion. Anything with attached documents or attached photos is also called out in a special “quick view” category.

You can very easily apply rules by individual address, a selection of them, or a whole folder or “quick view” category. The main filters, under the Sweep menu, are:

  • Move all from… – Puts every received email from sender into a designated folder, and offers to do the same with future emails.
  • Delete all from… – Deletes every received email from sender, and offers to do the same with future emails.
  • Schedule cleanup – Lets you choose how long you want emails from a given sender to sit in your inbox before they’re automatically moved or deleted. (As someone who is bad at cleaning out the inbox, I love this option!)


Wilson Rothman / NBC News

Scheduling cleanups is the best way to deal with all that semi-relevant spam, like weekly notes from your favorite retailers.

It may come as no surprise to people who have been watching the development of Windows 8 and Windows Phone that your former Hotmail contacts, along with contacts from Facebook, Twitter, LinkedIn, even Gmail, can be merged into a big ole database labeled People. This is the same People database that is accessible throughout Microsoft’s products, so users of Windows Phone, for instance, are basically already set up for Outlook mail — they just have to log in.


    Wilson Rothman / NBC News

    In the People contacts page, you can write on people’s Facebook walls, tweet at them, and even send direct messages.

    The social integration doesn’t stop there. You can quickly jump on a live chat with someone using Facebook (alternatively, of course, you can use Windows Live Messenger). And when you click on an email from somebody you are Facebook friends with, or who you follow on Twitter, their status appears to the right of the note. And if you’re not a friend or follower of them? You can quickly remedy that shortcoming with a click.

    The interface itself is clean and butter smooth, with very nice contextual menus that pop up when you need them, one-click read/unread buttons that show up on rollover, drag-and-drop contact info and more. I have yet to see an ad, but when they do appear, they will be (for the time being) shoved off to the far right, in a gutter, which alternately becomes the social update display when you are viewing emails of friends, and a messaging window when you’re actively chatting with someone.

    Microsoft tells me that the data mined by the Outlook mail service won’t go as deep as that of other providers, so while the ads served will be contextual, they will be based only on the subjects and (name-brand) senders of email, rather than on the actual contents of messages, which may get more personal.

    At a technical level, smartphone power users may be delighted to know that this new email service is based on Exchange ActiveSync, which means that when you set it up as a mail account on your iPhone, BlackBerry or other device, you can get push updates and, as more features roll out, treat it as a bona fide Exchange account (but one that’s free).


    Wilson Rothman / NBC News

    The new SkyDrive interface is well worth a look all by itself, especially since SkyDrive offers integration across many operating systems.

    The Outlook mail service is the heart of a major revamp of all of the Live services, and when you sign up for a new account or migrate your Hotmail account to it (by clicking on the Options button), you will see a new People contacts manager and a fresh new SkyDrive interface as well. The Calendar portion of the service is not yet updated, so don’t get too mad when it feels a bit crusty by comparison.

    Yes, Outlook mail is a work in progress. I’ve experienced lots of hiccups, particularly surrounding the integration with Facebook and Twitter. It is, after all, in beta and likely to be there for a while. Once Microsoft gets the kinks ironed out, though, and gets a healthy population using it, the plan is to phase out the painfully antiquated Hotmail for good.

    As someone who only touches his Hotmail with the proverbial 10-foot pole, but who has spent an unnatural amount of time over the past few days inside Outlook mail, running smart email filters, watching with glee as a ridiculous amount of old mail gets flushed away, I welcome the change, and applaud Microsoft for doing some real Web innovation here.

    Where do you get this new email? Just head over to outlook.com, or log into your Hotmail and click on the Options button.

    Wilson Rothman is the Technology & Science section editor at NBCNews.com. Catch up with him on Twitter at @wjrothman, and join our conversation on Facebook.

     


     

    F-22 Fighter Loses $79 Billion Advantage in Dogfights: Report

    By Lee Ferran | ABC News

    July 30, 2012

     

    The United States has spent nearly $80 billion to develop the most advanced stealth fighter jet in history, the F-22 Raptor, but the Air Force recently found out firsthand that while the plane owns the skies in modern long-range air combat, it is “evenly matched” with cheaper, foreign jets when it comes to old-school dogfighting.

    The F-22 made its debut at the international Red Flag Alaska training exercise this June where the planes “cleared the skies of simulated enemy forces and provided security for Australian, German, Japanese, Polish and [NATO] aircraft,” according to an after-action public report by the Air Force. The F-22 took part in the exercise while under strict flying restrictions imposed by Defense Secretary Leon Panetta in light of mysterious, potentially deadly oxygen problems with the planes – problems that the Pentagon believes it has since solved.

    The Air Force said the planes flew 80 missions during the event “with a very high mission success rate.” However, a new report from Combat Aircraft Monthly revealed that in a handful of missions designed to test the F-22 in a very specific situation – close-range, one-on-one combat – the jet appeared to lose its pricey advantages over a friendly rival, the Eurofighter Typhoon, flown in this case by German airmen.

    “We expected to perform less with the Eurofighter but we didn’t,” German air officer Marc Grune said, according to Combat Aircraft Monthly. “We were evenly matched. They didn’t expect us to turn so aggressively.”

    Two other German officers, Col. Andreas Pfeiffer and Maj. Marco Gumbrecht, noted in the same report that the F-22’s capabilities are “overwhelming” when it comes to modern, long-range combat, as the stealth fighter is designed to engage multiple enemies well beyond the pilot’s natural field of vision – mostly while the F-22 is still out of the other plane’s range. Gumbrecht said that even if his planes did everything right, they weren’t able to get within 20 miles of the next-generation jets before being targeted.

    “But as soon as you get to the merge…” Pfeiffer said, referring to the point at which fighters engage in close-up dog fighting, “in that area, at least, the Typhoon doesn’t necessarily have to fear the F-22 in all aspects… In the dogfight the Eurofighter is at least as capable as the F-22, with advantages in some aspects.”

    In response to the report, a spokesperson for the Air Force, Lt. Col. Tadd Sholtis, told ABC News that one-on-one combat is only one way to evaluate an aircraft’s capabilities and said it’s not “necessarily the most relevant to every scenario.”

    “The F-22 is conceived and employed as part of an integrated force that provides offensive capabilities that make close engagements far less likely while retaining the ability to handle close engagements in tandem with other fighters,” he said.

    Air Force Gen. John Jumper, one of the few airmen to have flown both aircraft, said in 2005 that it is difficult to compare the F-22 and the Eurofighter.

    “They are different kinds of airplanes to start with,” he said, according to an Air Force Print News report. “It’s like asking us to compare a NASCAR car with a Formula 1 car. They are both exciting in different ways, but they are designed for different levels of performance.”

    The F-22 “can maneuver with the best of them if it has to, but what you want to be able to do is get into contested airspace no matter where it is,” Jumper said, referring to the F-22’s stealth and supercruise capabilities that are meant to allow the plane to sneak in to hostile territory undetected – an ability the non-stealth Eurofighter lacks.

    As for where that contested airspace may be, the Air Force hasn’t said. But in April 2011 an executive for Lockheed Martin, the primary manufacturer of the F-22, told ABC News that the plane could “absolutely” find a home in quick strike missions against countries like Iran or North Korea. Over the weekend, the Air Force deployed a squadron of F-22s to Kadena Air Base in southern Japan just over 800 miles south of the North Korean border – a move that comes three months after an undisclosed number of the stealth jets were deployed to an allied base in the United Arab Emirates, some 200 miles from the Iranian mainland.

    The F-22 is the single most expensive fighter jet in history at a total acquisition cost of an estimated $79 billion for 187 planes, meaning each plane costs approximately $420 million. Estimates for the Eurofighter Typhoon – the premier fighter for several allied countries including the U.K., Germany and Italy – put that plane at just under $200 million each, according to an April 2011 report by Britain’s Public Accounts Committee.

    “[Red Flag was] a mission to get to know each other, the first contact by German Eurofighters in the continental U.S.,” Grune said of mock-fighting the F-22s. “We are not planning on facing each other in combat. We want to work together but it was a starter for us to work together. They were impressed, as we were impressed by them.”

    This story appeared on Network World at

    http://www.networkworld.com/news/2012/080112-google-amazon-cloud-261330.html

     

    Google cloud vs. Amazon cloud: How they stack up

    Google recently got into the IaaS cloud market to go up against Amazon, but a researcher says the battle is in its early days

    By Brandon Butler, Network World

    August 01, 2012 02:44 PM ET

     

    Google’s new IaaS cloud boasts strong compute performance but lacks the breadth of features in Amazon Web Services’ 4-year-old Elastic Compute Cloud, according to one industry analyst’s side-by-side comparison of the services.

    Neither company provides details of the silicon chips within its servers, but analyst Chris Gaun from Ideas International (recently acquired by Gartner) has used information in public statements to determine the hardware behind each vendor’s cloud. Google has said it uses Intel Sandy Bridge processors and that each unit of its Compute Engine delivers performance matching that of at least a 1.0- to 1.2-GHz 2007 Opteron chip. Other media have reported that Google uses 2.6-GHz processors, which leads Gaun to believe the company has Xeon E5-2670 chips, the only ones on the market at the time of Google’s announcement that deliver that level of raw compute power.

    Google’s new IaaS cloud boasts strong compute performance but lacks the breadth of features in Amazon Web Services’ 4-year-old Elastic Compute Cloud, according to one industry analyst’s side-by-side comparison of the services.

     

    Neither company provides details of the silicon chips within its servers, but analyst Chris Gaun from Ideas International (recently acquired by Gartner) has used information in public statements to determine the hardware behind each vendor’s cloud. Google has said it uses Intel Sandy Bridge processors and that each unit of its Compute Engine delivers performance matching that of at least a 1.0- to 1.2-GHz 2007 Opteron chip. Other media have reported that Google uses 2.6-GHz processors, which leads Gaun to believe the company has Xeon E5-2670 chips, the only ones on the market at the time of Google’s announcement that deliver that level of raw compute power.

    Gaun believes Google is running the high-capacity chip across its cloud infrastructure, while Amazon makes it available in certain instance types for Elastic Compute Cloud customers, including in its recently announced high I/O extra large cluster compute offering. “Google seems to be running only the latest and greatest chips on the market, while Amazon has a wide variety of chips for customers to use,” Gaun says.

    Amazon isn’t standing pat either. AWS on Wednesday, for example, announced the ability to set the input/output operations per second (IOPS) in Elastic Block Storage.
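
    For readers who script against AWS, the new setting surfaces at volume-creation time. Here is a minimal, illustrative Python sketch using the open-source boto library; it assumes a boto release recent enough to expose the provisioned-IOPS volume type and its iops keyword, and the region, availability zone and numbers are placeholders rather than recommendations.

        import boto.ec2

        # Connect to a region (placeholder region; AWS credentials are read from
        # the environment or the boto config file).
        conn = boto.ec2.connect_to_region("us-east-1")

        # Request a 100 GB volume with 1,000 provisioned IOPS. volume_type="io1"
        # selects the provisioned-IOPS class; leaving out both keywords creates a
        # standard volume, as before.
        volume = conn.create_volume(100, "us-east-1a", volume_type="io1", iops=1000)

        print(volume.id)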

    There are other differences between Google Compute Engine, which is still in limited preview mode, and Amazon cloud services. AWS has 11 different sizes of compute instances, ranging from small virtual machines with 1.7GB of memory, to extra-large compute clusters with 60.5GB of memory, whereas Google has only four. Google also makes the fiber-optic links between its own data centers available to cloud customers. AWS has a variety of accommodating features in its cloud though, such as the EBS volumes, relational database services, load balancers and others.

    The two companies are appealing to different customers, Gaun says. While AWS is targeting technology-reliant businesses that are turning to the cloud to host their websites, databases and storage, Google is focused initially on research and development teams that may have a need for high-performance computing to complete a project, for example. The strategy is seen in the pricing models: AWS offers reserved instance pricing discounts, in which customers agree to use a compute instance for months or even years. Google’s cloud is priced by smaller time chunks and therefore aimed at shorter-lived projects.
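
    To make the difference in pricing models concrete, here is a purely illustrative back-of-the-envelope comparison in Python. Every number below is a hypothetical placeholder, not a published rate from Amazon or Google; the only point is that an up-front reservation fee pays for itself on long-running workloads and is wasted on short ones.

        # Hypothetical rates, for illustration only -- not real AWS or Google pricing.
        ON_DEMAND_RATE = 0.10     # dollars per hour, pay as you go
        RESERVED_UPFRONT = 300.0  # one-time reservation fee, in dollars
        RESERVED_RATE = 0.04      # dollars per hour once reserved

        def on_demand_cost(hours):
            return ON_DEMAND_RATE * hours

        def reserved_cost(hours):
            return RESERVED_UPFRONT + RESERVED_RATE * hours

        # A short research run versus a server left on for roughly a year.
        for hours in (100, 8760):
            print(hours, on_demand_cost(hours), reserved_cost(hours))
        # 100 hours:   $10 on demand vs. $304.00 reserved -- pay as you go wins
        # 8,760 hours: $876 on demand vs. $650.40 reserved -- the reservation wins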

    Gaun says if Google wants to compete in a broader market with Amazon, it will likely have to offer a discounted pricing option for long-term use. That may come in time, Gaun predicts, given that the company’s cloud computing offering isn’t even generally available yet.

     

     

    DoD Outlines Impact of ‘Irrational’ Budget Cuts

    Defense News

    Aug. 1, 2012 – 02:48PM |

    By RICK MAZE | Comments

    An Obama administration decision to exempt military personnel programs from potential across-the-board budget cuts in January does not mean troops and their families would feel no impact from the budget process known as sequestration.

    Testifying before the House Armed Services Committee, Deputy Defense Secretary Ashton Carter warned of potential widespread effects.

    For service members, retirees and families, Carter warned, reductions in health care funding would result in “delays in payments to service providers and, potentially, some denial of service” under the Tricare health care program.

     

    Reductions in operating funds also would result in reduced operating hours for commissaries, and Carter warned that cuts in construction and maintenance funds would lead to delays in new construction and in repairs for existing buildings, including schools and medical centers.

    Training would be cut for all of the services. Officials would try to minimize impact on deploying units by cutting more deeply into training for those not scheduled to deploy. “Some late-deploying units, including some deploying to Afghanistan, could receive less training, especially in the Army and Marine Corps. Under some circumstances, this reduced training could impact their ability to respond to a new contingency, should one occur,” Carter said.

    Air Force flying hours “could be reduced by several hours a month, and Navy steaming days could decline by several days a quarter,” he said.

    Lest anyone not fully grasp the implications, Carter emphasized: “The result will be reduced training and lower readiness.”

    Funds for Defense Department civilian personnel are not protected from sequestration, he added.

    “Although it is premature to describe in detail how sequester would impact the DoD work force, it might be necessary to impose a partial hiring freeze or unpaid furloughs,” he said.

    The White House budget director, also appearing at the hearing, said sequestration would result in a 10 percent reduction in defense spending and an 8 percent cut in nondefense spending, reductions that “would indiscriminately impact all programs without regard to priorities or function.”

    Jeff Zients, acting director of the White House’s Office of Management and Budget, said “no amount of planning will mitigate the effects.”

    Carter had a similar message. “We cannot devise a plan that eliminates or even substantially mitigates” the harmful impact of cuts.

    “Sequester denies rational planning,” he said. “It was designed to be irrational.”

     

    Math and science fields battle persistent gender gap

    By Mary Beth Marklein, USA TODAY

    August 1, 2012

    During a mock intestinal surgery exercise in a lab at the C.S. Mott Engineering and Science Center, 16-year-old Riley McGarry, left, of Oxford, Michigan, and 17-year-old Tori Royale, of Charlotte, Michigan, practice a procedure inside a box that simulates real-world laparoscopic surgery.

    On this July afternoon, Lauryn sewed up a 1-inch gash in a hot dog that she could see only by watching a video camera, an exercise that simulated robotically assisted surgery and tested her hand-eye coordination.

    “I used to like building things when I was little, but I never expected robotics, something so stereotypically male, to be exciting,” says Lauryn, 16, of Grand Blanc, Mich.

    That’s encouraging news for people who are concerned about a persistent gender gap in college degrees in science, technology, engineering and math — STEM, for short. The notion that it might have to do with aptitude has long been dismissed. Yet research shows that girls who enjoy — and excel at — math and science in high school are less likely than boys to pursue a college major in those fields.

    And even if they start college majoring in a STEM field, women are more likely than men to change majors, federal data show. Women hold 24% of STEM jobs, which offer some of the most lucrative careers, a Commerce Department report says. More than half have degrees in the physical and life sciences.

     

    The Obama administration has stressed the importance of plugging that leaky pipeline, arguing that expertise in science, technology, engineering and math is crucial to the nation’s global competitiveness.

    But some critics of the focus on gender parity say the concern is overblown. “There may be something about (certain math- and science-related) fields that are just less interesting to most women,” says American Enterprise Institute scholar Christina Hoff Sommers, editor of The Science on Women and Science, in which several contributors argue that biology is a factor. “No one is saying they’re not as talented, but they have different aspirations.”

    A number of colleges that specialize in these fields have recently taken steps to increase female enrollment and are seeing results:

    This spring, Worcester Polytechnic Institute in Worcester, Mass., graduated 247 women, its largest number of women ever, representing 30% of its graduating class. The percentage of women at Harvey Mudd College in Claremont, Calif., who graduate with a computer science degree (a field where national numbers for women are especially low), jumped from an average 12% in recent years to more than 26% in 2010, 43% in 2011 and 38% this year. The percentage of women earning engineering degrees at the historically black North Carolina A&T State University in Greensboro has climbed from 20% in 2004 to 45% in 2011.

    The schools’ strategies vary, but a key component is to create opportunities for women to apply their talents and skills to matters that appeal to them. Harvey Mudd College, for example, overhauled a required introductory computer science course to allow students more choice in how they apply principles being covered.

    “Many of the guys would be just as happy in a class that focused exclusively on video games and robots,” says department chair Ran Libeskind-Hadas. For women, “seeing the applications of computing to society is really important, and seeing applications of computing to things they use in their day-to-day lives is important. That includes things like, how does Google Maps work, or how does a music recommendation system work.”

    Worcester Polytechnic has put more emphasis on team-based projects that solve global problems such as irrigation and health care. “The issue was values-based,” says Kristin Tichenor, a senior vice president of enrollment. “They did not see engineering and the hard sciences as a mechanism for helping make the world a better place.”

    Not to be ignored is the school’s decision in 2007 to make SAT scores optional in admissions. Tichenor says math SAT scores were not accurately predicting the success of its female students. Historically, average math SAT scores for women have been lower than those for men.

    Celina Dopart, who graduated this spring from Worcester Polytechnic with a degree in aerospace engineering and is headed to the Massachusetts Institute of Technology this fall for graduate work, says she submitted her scores, but liked the message sent by the test-optional policy.

    “It showed that they were looking for the people themselves rather than the numbers,” she says.

     

    U.S. model for a future war fans tensions with China and inside Pentagon

    Washington Post

    By Greg Jaffe, Published: August 1

    When President Obama called on the U.S. military to shift its focus to Asia earlier this year, Andrew Marshall, a 91-year-old futurist, had a vision of what to do.

    Marshall’s small office in the Pentagon has spent the past two decades planning for a war against an angry, aggressive and heavily armed China.

    No one had any idea how the war would start. But the American response, laid out in a concept that one of Marshall’s longtime proteges dubbed “Air-Sea Battle,” was clear.

    Stealthy American bombers and submarines would knock out China’s long-range surveillance radar and precision missile systems located deep inside the country. The initial “blinding campaign” would be followed by a larger air and naval assault.

    The concept, the details of which are classified, has angered the Chinese military and has been pilloried by some Army and Marine Corps officers as excessively expensive. Some Asia analysts worry that conventional strikes aimed at China could spark a nuclear war.

    Air-Sea Battle drew little attention when U.S. troops were fighting and dying in large numbers in Iraq and Afghanistan. Now the military’s decade of battling insurgencies is ending, defense budgets are being cut, and top military officials, ordered to pivot toward Asia, are looking to Marshall’s office for ideas.

    In recent months, the Air Force and Navy have come up with more than 200 initiatives they say they need to realize Air-Sea Battle. The list emerged, in part, from war games conducted by Marshall’s office and includes new weaponry and proposals to deepen cooperation between the Navy and the Air Force.

    A former nuclear strategist, Marshall has spent the past 40 years running the Pentagon’s Office of Net Assessment, searching for potential threats to American dominance. In the process, he has built a network of allies in Congress, in the defense industry, at think tanks and at the Pentagon that amounts to a permanent Washington bureaucracy.

    While Marshall’s backers praise his office as a place where officials take the long view, ignoring passing Pentagon fads, critics see a dangerous tendency toward alarmism that is exaggerating the China threat to drive up defense spending.

    “The old joke about the Office of Net Assessment is that it should be called the Office of Threat Inflation,” said Barry Posen, director of the MIT Security Studies Program. “They go well beyond exploring the worst cases. . . . They convince others to act as if the worst cases are inevitable.”

    Marshall dismisses criticism that his office focuses too much on China as a future enemy, saying it is the Pentagon’s job to ponder worst-case scenarios.

    “We tend to look at not very happy futures,” he said in a recent interview.

     

    China tensions

    Even as it has embraced Air-Sea Battle, the Pentagon has struggled to explain it without inflaming already tense relations with China. The result has been an information vacuum that has sown confusion and controversy.

    Senior Chinese military officials warn that the Pentagon’s new effort could spark an arms race.

    “If the U.S. military develops Air-Sea Battle to deal with the [People’s Liberation Army], the PLA will be forced to develop anti-Air-Sea Battle,” one officer, Col. Gaoyue Fan, said last year in a debate sponsored by the Center for Strategic and International Studies, a defense think tank.

    Pentagon officials counter that the concept is focused solely on defeating precision missile systems.

     

    “It’s not about a specific actor,” a senior defense official told reporters last year. “It is not about a specific regime.”

    The heads of the Air Force and Navy, meanwhile, have maintained that Air-Sea Battle has applications even beyond combat. The concept could help the military reach melting ice caps in the Arctic Circle or a melted-down nuclear reactor in Japan, Adm. Jonathan Greenert, the U.S. chief of naval operations, said in May at the Brookings Institution.

    At the same event, Gen. Norton Schwartz, the Air Force chief, upbraided a retired Marine colonel who asked how Air-Sea Battle might be employed in a war with China.

    “This inclination to narrow down on a particular scenario is unhelpful,” Schwartz said.

    Privately, senior Pentagon officials concede that Air-Sea Battle’s goal is to help U.S. forces weather an initial Chinese assault and counterattack to destroy sophisticated radar and missile systems built to keep U.S. ships away from China’s coastline.

    Their concern is fueled by the steady growth in China’s defense spending, which has increased to as much as $180 billion a year, or about one-third of the Pentagon’s budget, and China’s increasingly aggressive behavior in the South China Sea.

    “We want to put enough uncertainty in the minds of Chinese military planners that they would not want to take us on,” said a senior Navy official overseeing the service’s modernization efforts. “Air-Sea Battle is all about convincing the Chinese that we will win this competition.”

    Like others quoted in this article, the official spoke on the condition of anonymity because of the sensitivity of the subject.

     

    A military tech ‘revolution’

    Air-Sea Battle grew out of Marshall’s fervent belief, dating to the 1980s, that technological advancements were on the verge of ushering in a new epoch of war.

    New information technology allowed militaries to fire within seconds of finding the enemy. Better precision bombs guaranteed that the Americans could hit their targets almost every time. Together these advances could give conventional bombs almost the same power as small nuclear weapons, Marshall surmised.

    Marshall asked his military assistant, a bright officer with a Harvard doctorate, to draft a series of papers on the coming “revolution in military affairs.” The work captured the interest of dozens of generals and several defense secretaries.

    Eventually, senior military leaders, consumed by bloody, low-tech wars in Iraq and Afghanistan, seemed to forget about Marshall’s revolution. Marshall, meanwhile, zeroed in on China as the country most likely to exploit the revolution in military affairs and supplant the United States’ position as the world’s sole superpower.

    In recent years, as the growth of China’s military has outpaced most U.S. intelligence projections, interest in China as a potential rival to the United States has soared.

    “In the blink of an eye, people have come to take very seriously the China threat,” said Andrew Hoehn, a senior vice president at Rand Corp. “They’ve made very rapid progress.”

    Most of Marshall’s writings over the past four decades are classified. He almost never speaks in public and even in private meetings is known for his long stretches of silence.

     

    His influence grows largely out of his study budget, which in recent years has floated between $13 million and $19 million and is frequently allocated to think tanks, defense consultants and academics with close ties to his office. More than half the money typically goes to six firms.

    Among the largest recipients is the Center for Strategic and Budgetary Assessments, a defense think tank run by retired Lt. Col. Andrew Krepinevich, the Harvard graduate who wrote the first papers for Marshall on the revolution in military affairs.

    In the past 15 years, CSBA has run more than two dozen China war games for Marshall’s office and written dozens of studies. The think tank typically collects about $2.75 million to $3 million a year, about 40 percent of its annual revenue, from Marshall’s office, according to Pentagon statistics and CSBA’s most recent financial filings.

    Krepinevich makes about $865,000 in salary and benefits, or almost double the compensation paid out to the heads of other nonpartisan think tanks such as the Center for Strategic and International Studies and the Brookings Institution. CSBA said its board sets executive compensation based on a review of salaries at other organizations doing similar work.

    The war games run by CSBA are set 20 years in the future and cast China as a hegemonic and aggressive enemy. Guided anti-ship missiles sink U.S. aircraft carriers and other surface ships. Simultaneous Chinese strikes destroy American air bases, making it impossible for the U.S. military to launch its fighter jets. The outnumbered American force fights back with conventional strikes on China’s mainland, knocking out long-range precision missiles and radar.

    “The fundamental problem is the same one that the Soviets identified 30 years ago,” Krepinevich said in an interview. “If you can see deep and shoot deep with a high degree of accuracy, our large bases are not sanctuaries. They are targets.”

    Some critics doubt that China, which owns $1.6 trillion in U.S. debt and depends heavily on the American economy, would strike U.S. forces out of the blue.

    “It is absolutely fraudulent,” said Jonathan D. Pollack, a senior fellow at Brookings. “What is the imaginable context or scenario for this attack?”

    Other defense analysts warn that an assault on the Chinese mainland carries potentially catastrophic risks and could quickly escalate to nuclear armageddon.

    The war games elided these concerns. Instead they focused on how U.S. forces would weather the initial Chinese missile salvo and attack.

    To survive, allied commanders dispersed their planes to austere airfields on the Pacific islands of Tinian and Palau. They built bomb-resistant aircraft shelters and brought in rapid runway repair kits to fix damaged airstrips.

    Stealthy bombers and quiet submarines waged a counterattack. The allied approach became the basis for Air-Sea Battle.

     

    Think tank’s paper

    Although the Pentagon has struggled to talk publicly about Air-Sea Battle, CSBA has not been similarly restrained. In 2010, it published a 125-page paper outlining how the concept could be used to fight a war with China.

    The paper contains less detail than the classified Pentagon version. Shortly after its publication, U.S. allies in Asia, frustrated by the Pentagon’s silence on the subject, began looking to CSBA for answers.

     

    “We started to get a parade of senior people, particularly from Japan, though also Taiwan and to a lesser extent China, saying, ‘So, this is what Air-Sea Battle is,’ ” Krepinevich said this year at an event at another think tank.

    Soon, U.S. officials began to hear complaints.

    “The PLA went nuts,” said a U.S. official who recently returned from Beijing.

    Told that Air-Sea Battle was not aimed at China, one PLA general replied that the CSBA report mentioned the PLA 190 times, the official said. (The actual count is closer to 400.)

    Inside the Pentagon, the Army and Marine Corps have mounted offensives against the concept, which could lead to less spending on ground combat.

    An internal assessment, prepared for the Marine Corps commandant and obtained by The Washington Post, warns that “an Air-Sea Battle-focused Navy and Air Force would be preposterously expensive to build in peace time” and would result in “incalculable human and economic destruction” if ever used in a major war with China.

    The concept, however, aligns with Obama’s broader effort to shift the U.S. military’s focus toward Asia and provides a framework for preserving some of the Pentagon’s most sophisticated weapons programs, many of which have strong backing in Congress.

    Sens. Joseph I. Lieberman (I-Conn.) and John Cornyn (R-Tex.) inserted language into the 2012 Defense Authorization bill requiring the Pentagon to issue a report this year detailing its plans for implementing the concept. The legislation orders the Pentagon to explain what weapons systems it will need to carry out Air-Sea Battle, its timeline for implementing the concept and an estimate of the costs associated with it.

    Lieberman and Cornyn’s staff turned to an unsurprising source when drafting the questions.

    “We asked CSBA for help,” one of the staffers said. “In a lot of ways, they created it.”

    http://www.washingtonpost.com/world/national-security/us-model-for-a-future-war-fans-tensions-with-china-and-inside-pentagon/2012/08/01/gJQAC6F8PX_print.html

     

     

    Senate confirms General Welsh as next Air Force chief of staff

     

    Posted 8/2/2012

    8/2/2012 – WASHINGTON (AFNS) — Gen. Mark A. Welsh III, a 36-year veteran, will be the twentieth Air Force chief of staff, according to a Senate confirmation vote Aug. 2.

    Welsh provided testimony to the Senate Armed Services Committee July 19, before the full Senate voted on his confirmation. Now the president must appoint the general before he can assume his new position.

    “I fully accept the responsibility to stand beside Secretary of the Air Force Michael Donley and lead the 690,000 active, Guard, Reserve and civilian Airmen who selflessly serve our nation as part of an unbeatable joint team,” Welsh said.

    Pending the president’s appointment, Welsh, who served as the U.S. Air Forces in Europe commander, will assume the position of Air Force chief of staff from Gen. Norton Schwartz in a ceremony Aug. 10.

     

    In his previous position, Welsh’s command was responsible for Air Force activities, conducted through 3rd Air Force, in an area of operations covering almost one-fifth of the globe. This area includes 51 countries in Europe, Asia and the Middle East, and the Arctic and Atlantic oceans with a total population reaching nearly one billion people speaking more than 80 languages. He also had administrative control of U.S. Air Forces Africa, providing support, logistics and resources to U.S. Africa Command.

    Welsh entered the Air Force in June 1976 as a graduate of the U.S. Air Force Academy. He has been assigned to numerous operational, command and staff positions during his career.

     

    We now have our smallest government in 45 years

    By Jordan Weissmann

    3:29 PM ET

    Since the official end of the Great Recession, America’s public sector has shrunk. And shrunk. And shrunk some more. We’ve said goodbye to about 600,000 government jobs, handing the economy a nasty self-inflicted wound in the process.

    But how small has our public sector really become? Here’s one way to think about it: Compared to our population, it hasn’t been this size since 1968. Your dreams are coming true, Baby Boomers. We’re almost all the way back to the Summer of Love!

    First, credit where it’s due. The Hamilton Project has produced a beautiful graph illustrating the government employment-to-population ratio. As it shows, there are now fewer public-sector employees per American than at any time dating back to the Carter administration (to be clear, we’re talking state, federal and local).

     

     

     


    Read more of this story at The Atlantic.

    http://www.govexec.com/management/2012/08/we-now-have-our-smallest-government-50-years/57221/

     

    Federal travelers could have fewer lodging options

    NextGov

    By Kellie Lunney

    2:07 PM ET

    Per diem rates for federal travelers could take a dive in fiscal 2013.

    The General Services Administration reportedly is considering changes to the methodology it uses to calculate lodging rates for government travelers. The review is part of a governmentwide effort to comply with a May memorandum from the Office of Management and Budget that directs agencies, for the next three fiscal years, to spend at least 30 percent less on travel than they did in fiscal 2010.

    GSA has not made a decision yet on the new per diem rates, said GSA Deputy Press Secretary Dan Cruz, adding that GSA will implement the May OMB directive on agency travel and conferences, including decreasing spending by 30 percent compared to fiscal 2010. “We will continue to engage our industry partners as we undertake this review of per diem rates,” Cruz said by email.

    GSA will announce the fiscal 2013 per diem rates by Sept. 1, and they will take effect Oct. 1.

    Hotel and lodging industry officials have said lower per diem rates will mean fewer choices for federal travelers, particularly in more expensive urban areas. GSA is considering whether it will ditch pricier hotels from its roster, lowering the overall per diem average, according to a July 5 report in Federal Times.

    GSA establishes per diem rates for lodging and meal expenses in the continental United States. A standard per diem is applied in locations less commonly traveled by federal workers, while nonstandard areas frequently visited are granted individual rates based on the average daily industry rate. The standard lodging rate, which covers hotels in 2,600 counties nationwide, is $77 a night. Lodging per diem rates vary according to region and time of year, however. For example, 2012 rates for Washington range from $183 to $226; in New York City, the lodging per diem ranges from $204 to $295.

    GSA has been under fire during the past several months for lavish spending on conferences and other travel billed to taxpayers. Earlier this week, House lawmakers blasted the agency during a congressional hearing over charges that it awarded excessive bonuses to employees.

    http://www.govexec.com/pay-benefits/2012/08/federal-travelers-could-have-fewer-lodging-options/57215/

     

    FAA urged to keep UAV timetable for test sites

    Dayton Daily News

    August 3, 2012    

    By John Nolan

    Staff Writer

     

    Concerned about a delay that has already occurred, Congress is pressing the Federal Aviation Administration to stay on schedule for designating six test sites in December. The sites will allow remotely piloted aircraft to operate in airspace now used by manned planes, so officials can determine how the flying can be safely integrated.

    Reps. Steve Austria and Mike Turner, both Republicans from the Dayton area, were among nine House members who signed the letter sent on Wednesday to the FAA’s interim administrator, Michael Huerta.

    The legislators noted that the FAA’s own timetable for responding to a law Congress passed to mandate the airspace integration of manned and unmanned aircraft had called for the agency to issue a request for proposals in July for management of the test sites. It is early August and congressional staff have heard from contacts within the FAA that no such request for proposals is imminent, leading legislators to worry that there could be further delays and the FAA won’t be ready to decide the locations of the test sites in December, Austria said Thursday.

    “We want assurance that they will stick to that timetable,” said Austria, R-Beavercreek, who is a member of the Unmanned Systems Congressional Caucus. The lawmakers are monitoring whether the FAA complies with requirements of legislation that requires the integration of unmanned aerial vehicles into manned airspace. Under the schedule, the first of the six test sites would begin operating in January 2013.

    “The agency is working to complete the proposal process for the six test sites as required by the 2012 FAA Reauthorization Act,” the FAA, without elaborating, said in a written response to the Dayton Daily News.

    At least 30 states are considered likely competitors for the test sites, which could boost job creation by attracting more companies with a stake in research and development of UAVs and the sensor packages they carry. Beyond military uses, the UAVs are in demand for civilian markets including law enforcement, homeland security, emergency response and agriculture.

    The Dayton Development Coalition and Ohio officials are ready to respond whenever the FAA issues its request for proposals, said Joe Zeis, the coalition’s executive vice president.

    The Dayton region is home to Wright-Patterson Air Force Base, UAV and sensors research and development, a UAV mission at Springfield’s Ohio Air National Guard base, and aerospace, propulsion and advanced manufacturing expertise, along with courses designed to train UAV pilots. Regional leaders have declined to project how many jobs could result if an FAA-designated test site were nearby, but have said it would be a major boost for the local industry.

     

    A Day in the Life of a Drone Operator

    A Day Job Waiting for a Kill Shot a World Away

    NYTimes

    July 29, 2012

    By ELISABETH BUMILLER

    HANCOCK FIELD AIR NATIONAL GUARD BASE, N.Y. — From his computer console here in the Syracuse suburbs, Col. D. Scott Brenton remotely flies a Reaper drone that beams back hundreds of hours of live video of insurgents, his intended targets, going about their daily lives 7,000 miles away in Afghanistan. Sometimes he and his team watch the same family compound for weeks.

    “I see mothers with children, I see fathers with children, I see fathers with mothers, I see kids playing soccer,” Colonel Brenton said.

    When the call comes for him to fire a missile and kill a militant — and only, Colonel Brenton said, when the women and children are not around — the hair on the back of his neck stands up, just as it did when he used to line up targets in his F-16 fighter jet.

    Afterward, just like the old days, he compartmentalizes. “I feel no emotional attachment to the enemy,” he said. “I have a duty, and I execute the duty.”

    Drones are not only revolutionizing American warfare but are also changing in profound ways the lives of the people who fly them.

    Colonel Brenton acknowledges the peculiar new disconnect of fighting a telewar with a joystick and a throttle from his padded seat in American suburbia.

    When he was deployed in Iraq, “you land and there’s no more weapons on your F-16, people have an idea of what you were just involved with.” Now he steps out of a dark room of video screens, his adrenaline still surging after squeezing the trigger, and commutes home past fast-food restaurants and convenience stores to help with homework — but always alone with what he has done.

    “It’s a strange feeling,” he said. “No one in my immediate environment is aware of anything that occurred.”

    Routinely thought of as robots that turn wars into sanitized video games, the drones have powerful cameras that bring war straight into a pilot’s face.

    Although pilots speak glowingly of the good days, when they can look at a video feed and warn a ground patrol in Afghanistan about an ambush ahead, the Air Force is also moving chaplains and medics just outside drone operation centers to help pilots deal with the bad days — images of a child killed in error or a close-up of a Marine shot in a raid gone wrong.

     

    Among the toughest psychological tasks is the close surveillance for aerial sniper missions, reminiscent of the East German Stasi officer absorbed by the people he spies on in the movie “The Lives of Others.” A drone pilot and his partner, a sensor operator who manipulates the aircraft’s camera, observe the habits of a militant as he plays with his children, talks to his wife and visits his neighbors. They then try to time their strike when, for example, his family is out at the market.

    “They watch this guy do bad things and then his regular old life things,” said Col. Hernando Ortega, the chief of aerospace medicine for the Air Education Training Command, who helped conduct a study last year on the stresses on drone pilots. “At some point, some of the stuff might remind you of stuff you did yourself. You might gain a level of familiarity that makes it a little difficult to pull the trigger.”

    Of a dozen pilots, sensor operators and supporting intelligence analysts recently interviewed from three American military bases, none acknowledged the kind of personal feelings for Afghans that would keep them awake at night after seeing the bloodshed left by missiles and bombs. But all spoke of a certain intimacy with Afghan family life that traditional pilots never see from 20,000 feet, and that even ground troops seldom experience.

    “You see them wake up in the morning, do their work, go to sleep at night,” said Dave, an Air Force major who flew drones from 2007 to 2009 at Creech Air Force Base in Nevada and now trains drone pilots at Holloman Air Force Base in New Mexico. (The Air Force, citing what it says are credible threats, forbids pilots to disclose their last names. Senior commanders who speak to the news media and community groups about the base’s mission, like Colonel Brenton in Syracuse, use their full names.)

    Some pilots spoke of the roiling emotions after they fire a missile. (Only pilots, all of them officers, employ weapons for strikes.)

    “There was good reason for killing the people that I did, and I go through it in my head over and over and over,” said Will, an Air Force officer who was a pilot at Creech and now trains others at Holloman. “But you never forget about it. It never just fades away, I don’t think — not for me.”

    The complexities will only grow as the military struggles to keep up with a near insatiable demand for drones. The Air Force now has more than 1,300 drone pilots, about 300 fewer than it needs, stationed at 13 or more bases across the United States. They fly the unmanned aircraft mostly in Afghanistan. (The numbers do not include the classified program of the C.I.A., which conducts drone strikes in Pakistan, Somalia and Yemen.) Although the Afghan war is winding down, the military expects drones to help compensate for fewer troops on the ground.

    By 2015, the Pentagon projects that the Air Force will need more than 2,000 drone pilots for combat air patrols operating 24 hours a day worldwide. The Air Force is already training more drone pilots — 350 last year — than fighter and bomber pilots combined. Until this year, drone pilots went through traditional flight training before learning how to operate Predators, Reapers and unarmed Global Hawks. Now the pilots are on a fast track and spend only 40 hours in a basic Cessna-type plane before starting their drone training.

    Gen. Norton A. Schwartz, the Air Force chief of staff, said it was “conceivable” that drone pilots in the Air Force would outnumber those in cockpits in the foreseeable future, although he predicted that the Air Force would have traditional pilots for at least 30 more years.

    Many drone pilots once flew in the air themselves but switched to drones out of a sense of the inevitable — or if they flew cargo planes, to feel closer to the war. “You definitely feel more connected to the guys, the battle,” said Dave, the Air Force major, who flew C-130 transport planes in Iraq and Afghanistan.

    Now more and more Air National Guard bases are abandoning traditional aircraft and switching to drones to meet demand, among them Hancock Field, which retired its F-16s and switched to Reapers in 2010. Colonel Brenton, who by then had logged more than 4,000 hours flying F-16s in 15 years of active duty and a decade in Syracuse deploying to war zones with the Guard, said he learned to fly drones to stay connected to combat. True, drones cannot engage in air-to-air combat, but Colonel Brenton said that “the amount of time I’ve engaged the enemy in air-to-ground combat has been significant” in both Reapers and F-16s.

    “I feel like I’m doing the same thing I’ve always done, I just don’t deploy to do it,” he said. Now he works full time commanding a force of about 220 Reaper pilots, sensor operators and intelligence analysts at the base.

    Pilots say the best days are when ground troops thank them for keeping them safe. Ted, an Air Force major and an F-16 pilot who flew Reapers from Creech, recalled how troops on an extended patrol away from their base in Afghanistan were grateful when he flew a Reaper above them for five hours so they could get some sleep one night. They told him, “We’re keeping one guy awake to talk to you, but if you can, just watch over and make sure nobody’s sneaking up on us,” he recalled.

    All the operators dismiss the notion that they are playing a video game. (They also reject the word “drone” because they say it describes an aircraft that flies on its own. They call their planes remotely piloted aircraft.)

    “I don’t have any video games that ask me to sit in one seat for six hours and look at the same target,” said Joshua, a sensor operator who worked at Creech for a decade and is now a trainer at Holloman. “One of the things we try to beat into our crews is that this is a real aircraft with a real human component, and whatever decisions you make, good or bad, there’s going to be actual consequences.”

    In his 10 years at Creech, he said without elaborating, “I’ve seen some pretty disturbing things.”

    All of the pilots who once flew in cockpits say they do miss the sensation of flight, which for Colonel Brenton extends to the F-16 flybys he did for the Syracuse Memorial Day parade downtown. To make up for it, he sometimes heads out on weekends in a small propeller plane, which he calls a bug smasher.

    “It’s nice to be up in the air,” he said.

     

    Pay Raises and COLAs

    GovExec

    By Tammy Flanagan

    August 3, 2012

    12:00 AM ET

    Last week we looked back on 25 years of laws that have changed the way federal retirement benefits are administered. This week, with a potential third year of a federal pay freeze looming, let’s take a look at salaries and how they compare with cost-of-living adjustments for federal retirees.

    In addition to weighing legislation directly affecting retirement and insurance benefits, each year Congress also considers annual pay adjustments for current employees. COLAs for retirees are governed by a formula set in law. They are determined in reference to the Consumer Price Index for Urban Wage Earners and Clerical Workers (CPI-W), as calculated by the Bureau of Labor Statistics.

    As of the end of June, the CPI-W had increased 1.6 percent in the previous 12 months. The 2013 COLA for retirees will be determined based on the results posted at the end of September 2012. That means price increases and decreases during July, August and September will affect the COLA that retirees receive in their Civil Service Retirement System and Federal Employees Retirement System benefit checks payable on Jan. 1, 2013. The announcement of the 2013 COLA for retirees will be made in mid-October.

    To constrain retirement costs, Congress has placed restrictions on COLAs to FERS retirees. FERS provides COLAs to retirees younger than age 62 only if they are subject to special provisions (covering, for example, law enforcement officers and firefighters), are disabled or are survivor annuitants. FERS retirees 62 and older receive a full COLA only if the CPI increases by 2 percent or less. FERS retirees receive a 2 percent COLA if the CPI increase is between 2 percent and 3 percent. If the CPI increases by 3 percent or more, then the FERS COLA is 1 percentage point less than the change in the CPI.
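
    Because those tiers are easy to misread, here is a short Python sketch that encodes the FERS COLA calculation exactly as described above. It computes only the size of the adjustment; the eligibility rules (the age-62 cutoff, special provisions, disability and survivor annuities) are a separate question, and CSRS retirees simply receive the full CPI-W increase.

        def fers_cola(cpi_increase_pct):
            """FERS cost-of-living adjustment, in percent, for a given CPI-W change.

            Rules as stated above:
              * CPI-W up 2 percent or less        -> the full CPI-W increase
              * CPI-W up between 2 and 3 percent  -> a flat 2 percent
              * CPI-W up 3 percent or more        -> the CPI-W increase minus 1 point
            (COLAs are never negative, so a falling CPI-W produces no adjustment.)
            """
            if cpi_increase_pct <= 0:
                return 0.0
            if cpi_increase_pct <= 2.0:
                return cpi_increase_pct
            if cpi_increase_pct < 3.0:
                return 2.0
            return cpi_increase_pct - 1.0

        # If the 1.6 percent increase noted above held through September:
        print(fers_cola(1.6))  # 1.6 -- full COLA
        print(fers_cola(2.5))  # 2.0 -- capped at 2 percent
        print(fers_cola(3.6))  # 2.6 -- CPI-W minus one point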

    Increases in pay for federal employees, on the other hand, are based on different economic (and political) variables. Pay raises for civilian federal workers are nominally indexed to wage and salary increases in the private sector, as measured by the Employment Cost Index. But as a practical matter, civilian pay tends to be based on legislative agreements worked out in Congress.

    So how have raises and retiree COLAs compared over the years? In the chart accompanying the original column (linked below), you can see the annual employee pay adjustment every year since 1970 compared with the annual retiree cost-of-living adjustment. As you can see, the two can differ greatly. Sometimes, like this year, retirees come out ahead. At other times, the across-the-board employee pay raise is higher.

    http://www.govexec.com/pay-benefits/retirement-planning/2012/08/pay-raises-and-colas/57196/
