
August 17, 2013

Newswire

 

DoD weighs major COCOM realignment

Aug. 11, 2013 – 09:49AM |

http://www.armytimes.com/article/20130811/NEWS05/308110004/DoD-weighs-major-COCOM-realignment

By Marcus Weisgerber

Staff writer

 

WASHINGTON — The Pentagon is considering a major overhaul of its geographical combatant commands, possibly realigning oversight within hot-button areas of the world and eliminating thousands of military and civilian positions, according to defense sources.

While the plans for combatant command (COCOM) realignment and consolidation are still notional, sources say some options include:

■ Combining Northern Command and Southern Command to form what some are calling “Americas Command” or “Western Command.”

■ Dissolving Africa Command and splitting it up among European Command and Central Command.

■ Expanding Pacific Command to include Afghanistan and Pakistan, which are part of Central Command.

In all, the realignments could shutter two COCOMs and eight service-supporting commands, eliminating more than 5,000 positions, both uniformed and civilian.

Defense Secretary Chuck Hagel for the first time hinted at the consolidations of the COCOMs during a July 31 press conference when he announced significant budget-cutting options the Defense Department would have to make should federal spending cuts remain in place across the decade.

Defense officials would not comment on specific consolidation plans being considered.

The sequester is forcing the Pentagon to look for ways to cut spending quickly. Shuttering a COCOM would impact U.S. relations abroad, and underscores the need to eliminate the budget caps, a defense official said.

“Combining combatant commands is certainly not something that we want to do, but something that we have to consider because all cuts have to be on the table,” the official said.

Members of the Joint Staff and other defense officials have been exploring options for COCOM realignment since last year, according to sources.

Regional experts agree the Pentagon could reorganize its combatant commands to better align the headquarters with long-term strategic goals.

Combining Northern and Southern commands could lead to greater resources for activities in South and Central America, which experts say has long been DoD’s most neglected region.

 

Combining the regions could better address cross border issues — particularly drug trafficking — between Mexico, South America and the United States, said Bob Killebrew, a retired Army colonel and senior fellow at the Center for a New American Security.

Mexico is part of Northern Command, which also includes the contiguous United States, Alaska and Canada.

“[I]t makes … sense not to have a kind of artificial DoD boundary, not only between Mexico and Central America, but between Mexico and the American border as well,” Killebrew said.

Organizing oversight of Africa has been a topic of debate — mostly in the academic community — ever since Africa Command split from European Command and became a stand-alone COCOM in 2008. Before that, European Command oversaw much of the continent, with Central Command overseeing the Horn of Africa.

“The [oversight] that was diffused over multiple commands really wasn’t something that was in our best interest nor in the best interest of our partners on the continent,” said Kip Ward, a retired Army general who was the first commander of Africa Command.

Major changes to the existing Africa Command construct are not likely during a COCOM reorganization, experts say. U.S. military operations in Africa, ranging from the 2011 overthrow of Moammar Gadhafi in Libya to the recent campaign against terrorists in Mali, underscore the need for a dedicated COCOM, defense officials say.

Since its establishment, Africa Command has added value and has been well received on the continent, Ward said.

“I think that the focus that AFRICOM is able to bring to that vital, important part of the world is still important,” he said.

Meanwhile, experts agree that Afghanistan, Pakistan and India should fall under the same COCOM, regardless of whether it’s Pacific or Central. India falls under Pacific Command while Pakistan and Afghanistan are part of Central Command.

Since security, foreign policy, economic and trade issues with India predominantly involve Pakistan and vice versa, placing them under the same COCOM could better streamline U.S. military ties with each country, some experts say.

The same is true for security and policy issues involving Afghanistan, since much of the violence in the nation is along the Pakistani border.

“It’s better that the people who are dealing with India are the ones that are at least fully aware of, or completely in the picture, of what was discussed between two top dignitaries between the U.S. and Pakistan,” said Hassan Abbas, a former Pakistani government official who served in the administrations of Prime Minister Benazir Bhutto and President Pervez Musharraf.

“If they are the same team, or the same group of people that are in the same institution who are dealing with India, it actually provides more leverage for the United States and more opportunities to go for a coherent policy rather than dealing through two different commands,” said Abbas, a senior adviser and fellow at the Asia Society and a professor at National Defense University’s College of International Security Affairs. He stressed that this is his personal view and he was not speaking for the university or the U.S. government. “I think this makes sense.”

 

Budget consolidation

Federal budget caps would cut about $500 billion from planned defense spending over the next decade. The caps have already cut $37 billion from the Pentagon’s 2013 budget.

 

With that in mind, Hagel on July 31 announced the findings of the four-month-long Strategic Choices and Management Review, an effort that examined options the Pentagon could take to meet those spending targets, while trying to achieve the goals of its Pacific-focused military strategy.

If the caps remain in place across the decade, “additional consolidations and mission reductions,” such as “consolidations of regional combatant commands, defense agency mission cuts, and further IT consolidation” might be necessary, Hagel said.

“These changes would be far-reaching and require further analysis and consideration,” he said. “Though defense bureaucracies are often derided, the fact is that these offices perform functions needed to manage, administer and support a military of our size, complexity and global reach.”

The actual COCOM realignments would be laid out in the Unified Command Plan, a document that “establishes the missions, responsibilities and geographic areas of responsibility” for the COCOMs.

The Unified Command Plan — which originated in the late 1940s — is usually reviewed every two years and was last updated in April 2011.

The Pentagon last shuttered a major command in 2010 when it closed U.S. Joint Forces Command, in an attempt to trim overhead. But many of the positions associated with that command remained as the Joint Staff absorbed nearly 3,000 uniformed and civilian workers.

The growth in headquarters staff sizes at the COCOMs and across the entire military has been a major issue of late.

Between 2010 and 2012, staff sizes at the six geographical COCOMs increased more than 7 percent, adding nearly 1,000 civilian and military positions.

In addition to those numbers, each service operates its own commands subordinate to the individual COCOMs. Many positions at the service commands are redundant to positions at the COCOMs, according to the Government Accountability Office.

“If they were to streamline or combine [COCOMs], you certainly won’t have as many components,” said Arnold Punaro, a retired Marine Corps major general and former Senate staffer who chairs the Reserve Forces Policy Board. “It depends on how they do it.”

Hagel announced this summer a plan to cut COCOM headquarters and Pentagon staffs by 20 percent and reduce duplication.

Pentagon officials overseeing the reorganization should also consider changing the title of the four-star generals and admirals who oversee these regions from combatant commanders to something less invasive, Killebrew said.

“It’s a horrible [title] because what most of these commanders do is military assistance and military cooperation with other countries,” Killebrew said. “When you say you’re a combatant commander the first thing somebody in a small country says is, ‘But I don’t want to go to war. I just want to talk.’ ”

This is especially the case when dealing with some African and Central and South American nations.

A combatant commander used to hold the title commander in chief, or CINC, a title that former Defense Secretary Donald Rumsfeld abolished in 2002.

Killebrew said the title should be changed to “unified commander or something that implies working with other countries and not invading them.”

Paul McLeary contributed to this report.

 

Sequester May End Civilian Pensions for Military Retirees

GovExec

http://www.govexec.com/pay-benefits/pay-benefits-watch/2013/08/sequester-may-end-civilian-pensions-military-retirees/68282/

By Eric Katz

August 8, 2013

Civilians at the Defense Department have had a rough go of it lately, thanks in large part to sequestration.

They recently received some good news, however, as Secretary Chuck Hagel shaved the number of furlough days each worker must take from 11 down to six. The celebrations may be short-lived.

The Pentagon is hoping to avoid furloughs in fiscal 2014, but may resort to mandatory layoffs if sequestration remains in effect. The department would have to cut $52 billion from its budget next year if the spending caps stay, an increasingly likely prospect. In the long term, however — current law defines sequestration as a 10-year program — Defense will have to cut $250 billion over the next decade, and simple reductions in force will not suffice.

Hagel recently laid out various budget scenarios in the department’s Strategic Choices and Management Review, including proposals for dealing with sequestration. One suggestion: eliminating civilian pensions for retired military personnel working in the civil service.

Currently, the approximately 134,000 military retirees working for the Pentagon may forfeit their military retirement pay if they wish to apply their years of military service toward their civilian pension. Alternatively, they can keep their military retirement pay and start their service credit fresh when beginning their civilian careers. The new proposal would strip these options from military retirees, who would instead have to rely on their military retirement payments alone.

The plan — which, for now, remains very much in the hypothetical stage — would save $100 billion over ten years, Hagel said, when combined with ending subsidies for defense commissaries and restricting the availability of unemployment benefits.

Although they would have a “significant impact” on the Pentagon’s workforce, Hagel said “a sequester-level scenario would compel us to consider these changes, because there would be no realistic alternative that did not pose unacceptable risks to national security.”

Cuts to TRICARE Prime Proceed, Despite Protests

Over the last few months, Government Executive has tracked a Defense Department plan to phase out TRICARE Prime — the cheaper health care alternative to TRICARE Standard — for certain military retirees and their dependents. 

The move will affect more than 170,000 Prime enrollees who live more than 40 miles from a military clinic or hospital. The Pentagon will automatically transfer those beneficiaries to TRICARE Standard beginning Oct. 1, increasing their fees by about $240 a year on average. “Prolonged protests” had delayed the plan’s implementation, the Defense Department said.

Enrollees who live outside the acceptable radius for Prime, but within 100 miles of a military facility, may have a one-time opportunity to stay enrolled in the economy health care plan, depending on the availability of services.

Little Victories

While recent months have brought mostly disheartening headlines for former military personnel, at least one subset received positive news this week.

Both chambers of Congress have passed the Helping Heroes Fly Act, introduced in the House by Rep. Tulsi Gabbard, D-Hawaii. The bill will — once it receives President Obama’s signature — improve and make permanent the Wounded Warrior Screening Program, which requires the Transportation Security Administration to offer “sensitive screening of severely injured or disabled service members and veterans.”

“There is nothing more frustrating than to see these heroes returning home after defending our nation only to have to go through secondary screening in our airports. It’s offensive and insulting,” said Rep. Michael McCaul, R-Texas, House Homeland Security Committee chairman. “The Helping Heroes Fly Act will put an end to this and treat our wounded warriors with the dignity they deserve.”

 

MSPB Indefinitely Delays Processing of Defense Furlough Appeals

 

By Kellie Lunney

http://www.govexec.com/pay-benefits/2013/08/mspb-indefinitely-delays-processing-defense-furlough-appeals/68558/

August 12, 2013

The Merit Systems Protection Board has delayed processing and adjudication of furlough appeals from Defense Department employees until it can get a better handle on them.

Approximately 30,400 furlough challenges have poured into the small federal agency to date this fiscal year; of those, about 96 percent so far are from Defense employees forced to take unpaid leave because of sequestration. MSPB, which has a little more than 200 employees in Washington and in eight regional offices nationwide, called the number of appeals “unprecedented” and “overwhelming.” The agency will continue to process as usual furlough appeals from employees of non-Defense agencies.

The delay will allow MSPB to glean “a better understanding of the claims being raised, the work locations of the appellants, the MSPB offices in which the appeals have been filed, and whether or not the appellants are represented,” the agency said in a statement on its website. The board encouraged employees to file appeals electronically if they can, and to check online periodically for updates.

Among its other responsibilities, MSPB adjudicates appeals of “adverse personnel actions” from federal employees who’ve been fired, suspended for more than 14 days, furloughed for 30 days or less, demoted or had their pay cut. Agencies must give furloughed employees 30 days’ advance notice; once on furlough, employees have 30 days to file an appeal with MSPB. The agency’s regional offices received about 6,000 appeals during fiscal 2012; as of Monday, the regional offices had received approximately 35,000 appeals, the bulk of which are furlough appeals. And there’s still more than a month left in fiscal 2013.

“As you can imagine, our regional offices have been overwhelmed, but our employees are working hard and doing the best they can do under difficult circumstances,” said Bryan Polisuk, MSPB general counsel, in an email. MSPB administrative judges have issued 16 decisions so far in furlough appeal cases, all from employees of the Federal Aviation Administration and the Environmental Protection Agency. The judges agreed with the agency’s decision to furlough the employees in each of those cases.

 

MSPB is docketing Defense furlough appeals as they come in, and then notifying employees that it will contact them “at a later date” with information on the status of their appeals. “MSPB will also be unable to respond quickly to inquiries regarding these furlough appeals,” the website statement said. “We ask therefore that parties to DoD furlough appeals refrain from contacting MSPB’s regional and field offices until we inform you that processing of your appeal has begun.” The board also sent letters in July to the general counsels of the Navy, Air Force and Army informing them of the processing delay related to Defense furlough appeals.

The agency’s regional offices have docketed 10,700 furlough appeals to date, Polisuk said. “The docketing process itself takes some time,” he said. “We’re hoping to be in a position soon to make decisions on how to move forward and adjudicate these cases in the most efficient manner possible, but again, it will depend on the volume of appeals.”

Defense furloughed 650,000 civilian employees this fiscal year because of sequestration. The department originally told employees that they would have to take 22 days of unpaid leave through Sept. 30; that number now is down to six days.

MSPB officials have worried since before the mandatory budget cuts took effect in March that the tiny, quasi-judicial agency could be flooded with appeals from furloughed employees across government. The only other time something like this happened, according to officials, was in the 1980s when the agency had to process 12,000 personnel-related appeals during the air traffic controller strike. That took two years, and at the time, MSPB had about double the number of employees it has now.

The average appeal processing time in fiscal 2012 was 93 days for an initial decision from the agency. The losing party can then file a petition of review with the agency’s three-member board in Washington. It took an average of 245 days to process those petitions in fiscal 2012.

 

 

Dalai Lama’s Chinese website infecting visitors, expert warns

By Jim Finkle

http://news.yahoo.com/dalai-lamas-chinese-website-infecting-visitors-expert-warns-192049529.html

BOSTON (Reuters) – A prominent computer security firm has warned that the Dalai Lama’s Chinese-language website has been compromised, and is infecting visitors’ computers with malicious software that could be used to spy on them.

Kaspersky Lab researcher Kurt Baumgartner told Reuters that he is advising web surfers to stay away from the Chinese-language site of the Central Tibetan Administration until the organization fixes the bug.

He said he believes the group behind the campaign was also behind previous breaches on the site that have gone unreported as well as attacks on websites belonging to groups that focus on human rights in Asia.

Officials with the Office of Tibet in New York could not be reached for comment. That office houses the Dalai Lama’s official representative to the United States.

Baumgartner said that the Chinese-language site of the Central Tibetan Administration, which is the official organ of the Dalai Lama’s government in exile, has been under constant attack from one group of hackers since 2011, though breaches have been quietly identified and repaired before garnering public attention.

 

“They have been trying repeatedly to find vulnerabilities in the site,” he said.

He said that it is safe to visit the group’s English and Tibetan sites.

He said he believes the same group of attackers has repeatedly infected the site with malicious software that automatically drops viruses on computers running Microsoft Corp’s Windows and Apple Inc’s Mac operating systems. They infect machines by exploiting security bugs in Oracle Corp’s Java software.

That gives them “back doors” into those computers. “This is the initial foothold. From there they can download arbitrary files and execute them on the system,” Baumgartner said.

An Oracle spokeswoman had no immediate comment.

The Dalai Lama, Tibet’s 78-year-old exiled spiritual leader, fled China to India in 1959 after an abortive uprising against Chinese rule.

Beijing considers the globetrotting monk and author a violent separatist and Chinese state media routinely vilify him. The Dalai Lama, who is based in India, says he is merely seeking greater autonomy for his Himalayan homeland.

 

 

U.S. Air Force to shut down ‘Space Fence’ surveillance system

August 12, 2013

WASHINGTON (Reuters) – The U.S. Air Force will shut down its space surveillance system that tracks satellites and other orbiting objects by October 1 due to budget constraints caused by the automatic federal budget cuts known as sequestration, it announced Monday.

Deactivating the system by October 1 would save the Air Force Space Command $14 million annually starting in fiscal year 2014.

The surveillance system got the nickname “Space Fence” because it transmits a “fence” of radar energy vertically into space that can detect any object or debris that crosses it without being cued to do so.

Commander of the Air Force Space Command, General William Shelton, said the system – which has been in operation since 1961 – was outmoded and that newer technology will provide more accurate observations.

Shelton said a new Space Fence is being planned now, which will provide more precise positional data on orbiting objects and would become the most accurate radar in the Air Force’s space surveillance network.

The system that will be discontinued on October 1 is a series of three transmitters and six receivers located across different points in the southern United States. It is operated by Five Rivers Services in Colorado.

“When combined with the new Joint Space Operations Center’s high performance computing environment, the new fence will truly represent a quantum leap forward in space situational awareness for the nation,” Shelton said in a statement Monday.

(Reporting by Valerie Volcovici; Editing by Lisa Shumaker)

 

How Congressional staff can keep their healthcare coverage

The Hill

By David Farber

08/13/13 02:30 PM ET

http://thehill.com/blogs/congress-blog/healthcare/316839-how-congressional-staff-can-keep-their-healthcare-coverage

 

On August 6, the federal Office of Personnel Management (OPM) released its proposed rule to fund 72-75 percent of the health care premium costs for Members of Congress and their staff. Recent media reports have suggested that, because of the proposed rule, the Congressional staff health care crisis would be averted. However, the proposed rule and its preamble only address half the issue. In fact, there are two major questions that OPM needed to address – only one of which the agency has signaled it understands. OPM and the White House have the opportunity to do right by Congressional staff in the upcoming rulemaking, but to do so, they will need to read the Patient Protection and Affordable Care Act (PPACA) – all of it.

In fact, while both the OPM rulemaking and media reports have focused on premium payments, there are really two questions facing Congressional staff: (1) will staff be able to remain in their existing Federal Employee Health Benefit Plan (FEHBP) coverage, or will they be forced out of that coverage into an exchange plan; and (2) if staff must get exchange coverage, can the federal government actually pay exchange premiums, or will those premiums have to be paid by staff out of pocket?

The rulemaking addresses the second question in a way that will allow the government to pay staff premiums. But the more important question is whether staff even need to switch out of existing coverage into an exchange. Indeed, as demonstrated by IRS official Danny Werfel’s candid testimony before the House Ways and Means Committee on August 1, federal employees, including Hill staff, should not want to leave the robust FEHBP coverage they enjoy today for uncertain exchange coverage in the future.

What is the argument for Hill staff keeping FEHBP coverage? While section 1312(d)(3)(D)(i) of the Affordable Care Act suggests that Congressional staff may only have access to coverage “created under this Act” or through an “Exchange,” there is another provision in the same law – the grandfathering clause – that may be more relevant.

That provision, found in section 1251(a), states: “nothing in this Act (or an amendment made by this Act) shall be construed to require that an individual terminate coverage under a group health plan or health insurance coverage in which such individual was enrolled on the date of enactment of [the Affordable Care] Act.” In other words, if you are in FEHBP today, nothing in the Act, including section 1312(d)(3)(D), should be allowed to terminate your coverage come January 1, 2014. While it might be argued that the two clauses conflict, that is where an OPM rulemaking could do the right thing, and apply Agency discretion to have section 1251 trump section 1312 – at least for all staff who were on the payroll on the date of enactment, and, in OPM discretion, also to staff employed on December 31, 2013. Things may be a bit more ambiguous for staff hired next year, but that can be fixed later.

The above analysis is not new – the Congressional Research Service laid out the arguments in an April 2, 2010 Congressional Distribution Memorandum, on page 11. While CRS does a good job explaining the pros and cons, the roadmap to maintain coverage is more than there. OPM was aware of the CRS analysis and used it for other reasons. Yet, on the more significant question of whether Congressional staff can remain in their FEHBP coverage, OPM has failed to pick up the argument, much less run with it. It is not too late.

Every Congressional staffer who wants to stay in his or her FEHBP coverage should submit a comment to OPM and should urge the agency to apply the Affordable Care Act. Comments should urge OPM to read every section of the Act – not just section 1312, and to give meaning to section 1251 – the grandfather clause. The agency has the ability to fix this.

Where there is an agency will, there is a regulatory way. OPM has the chance to get this right, and not only answer who can pay premiums, but answer the more important question of whether Congressional staff can keep their FEHBP coverage. Every other American who has enjoyed employer-based coverage will get to keep their existing plan come January 1. If Congress intended to subject itself and staff to the same rules as every other American, then Congressional staff should be able to retain their existing FEHBP coverage as well. The agency has clear authority to make this happen. For all those staff who serve above and beyond, OPM should give them the right answer.


 

What if everyone plugs in their cars at once?

One day, when electric cars rule the road, owners might crash the power grid if they all were to plug their cars in at once. A smart charger developed by Northwest scientists would prevent that calamity.

Seattle Times

Originally published Saturday, August 10, 2013 at 8:07 PM

By Sandi Doughton

Seattle Times science reporter

http://seattletimes.com/html/localnews/2021586394_electriccarxml.html

 

RICHLAND — Electric cars account for fewer than 0.05 percent of passenger vehicles in the United States today, but Michael Kintner-Meyer envisions a future where plug-ins rule the roads.

The proliferation of electric cars will bring benefits — like lower tailpipe emissions — ­but could also create unique headaches, says Kintner-Meyer, who leads a project at Pacific Northwest National Laboratory to improve the vehicles and tackle the problems.

Now, he and his colleagues have crafted a solution to the scenario that gives power-grid operators nightmares: The prospect that millions of Americans will get home from work and plug in their cars at the same time.

“It would create havoc,” said Kintner-Meyer. “You could have the lights go out. You could have rolling brownouts.”

The way to avoid widespread overloads is to spread out the demand — which is what PNNL’s “grid-friendly” charger does. The device, which is about the size of a shoebox, monitors the status of the grid and adjusts accordingly, switching off when demand is high and switching on when power is plentiful.

Adaptive charging could lower car owners’ electricity bills by allowing them to draw power when rates are lowest. And if enough cars use the systems, they could also collectively provide a valuable service to the power grid by dampening swings in electrical generation from the growing number of wind farms and solar arrays.

Drivers could save up to $150 a year, the grid would be protected from crashing, and the overall power system would run more smoothly, Kintner-Meyer said.
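The control idea described above — charge when the grid has slack, pause when it is stressed, and let the owner override — can be sketched in a few lines. This is a hypothetical illustration, not PNNL's or AeroVironment's actual design: the frequency thresholds and function names are assumptions, chosen because grid frequency sagging below nominal is one common signal of high demand.

```python
# Illustrative "grid-friendly" charging decision, assuming a simple
# frequency-threshold heuristic. All names and thresholds are hypothetical.

NOMINAL_HZ = 60.0        # U.S. grid nominal frequency
PAUSE_BELOW_HZ = 59.95   # assumed sign of grid stress: stop drawing power
RESUME_ABOVE_HZ = 59.98  # hysteresis gap so the charger doesn't flip rapidly

def next_state(charging: bool, grid_hz: float, override: bool = False) -> bool:
    """Return True if the charger should draw power during this interval."""
    if override:
        # An owner in a hurry can bypass the grid-friendly feature entirely.
        return True
    if charging and grid_hz < PAUSE_BELOW_HZ:
        return False     # demand is high: switch off and wait
    if not charging and grid_hz > RESUME_ABOVE_HZ:
        return True      # power is plentiful again: resume charging
    return charging      # between the thresholds: hold the current state
```

The hysteresis band (pause at 59.95 Hz, resume only above 59.98 Hz) mirrors why a real device would not toggle on every small fluctuation; the charge simply takes longer overall, which is acceptable as long as the car is full by morning.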

 

California-based AeroVironment, Inc. licensed the technology from PNNL and is integrating it into beta versions of a charging station. Alec Brooks, the company’s chief technology officer for efficient energy systems, has been using one to charge his Nissan Leaf.

At PNNL, Kintner-Meyer runs the system on a 2009 Prius hybrid that he and his team converted to a plug-in.

Because the grid-friendly system switches off and on, it takes longer than a conventional charging system, Brooks said. But it doesn’t matter — as long as the car is fully charged in the morning.

“There’s usually plenty of slack time,” he said. “I can’t think of the last time I was waiting for my vehicle to charge.”

Owners in a hurry can simply bypass the grid-friendly feature.

With fewer than 150,000 electric cars in the nation, power-grid operators don’t have much to worry about yet. But plug-in cars are the fastest-growing sector of the automotive industry, according to the advocacy group Plug In America.

The Green Car Report estimates 2013 sales will approach 100,000, nearly double last year’s total.

The Northwest is helping lead the charge, with 5,400 electric cars in Washington and about 3,000 in Oregon. The nation’s highest-selling Leaf dealership is in Bellevue, said Redmond resident Chad Schwitters, an electric-car enthusiast and vice president of Plug In America.

AeroVironment operates what’s called the West Coast Electric Highway — a network of fast-charging stations that will soon number 55, situated along I-5 and other highways in Washington and Oregon.

The Northwest is also ahead of the rest of the nation in experiencing growing pains caused by the proliferation of wind farms. In 2011, the Bonneville Power Administration ordered some turbines to shut down because the region had more power than it knew what to do with.

Now, grid operators usually compensate for swings in power by adjusting the spill at hydropower dams or having power plants increase or decrease their power output, Brooks explained. “Power plants get paid to do that.”

But if large numbers of electric cars were hooked into grid-friendly chargers, they could soak up excess electricity, then power down when the grid is running low, he explained.

There’s no billing arrangement yet that would allow electric-car owners to be compensated for helping even out the grid. But Brooks predicts that could change, which would provide a powerful incentive to switch to grid-friendly charging stations.

“I think it can be a very large market once there’s a recognition that you’re providing a useful service to the grid,” he said.

 

 

Could Elon Musk’s Hyperloop Derail High-Speed Rail?

Two years of background work by the Tesla Motors and SpaceX founder reveals an affordable, open-source alternative to California’s proposed high-speed rail.

BY COLIN WOOD AUGUST 13, 2013

http://www.govtech.com/state/Could-Elon-Musks-Hyperloop-Derail-High-Speed-Rail.html

Getting from California to China could become a trip that takes less than two hours within Elon Musk’s lifetime. On August 12, Musk, founder of SpaceX and co-founder of PayPal and Tesla Motors, revealed his plan for a new transport system that is far cheaper and faster than high-speed rail, outlining tentative details for a possible San Francisco to Los Angeles route that reaches top speeds of 760 mph. Musk has previously described Hyperloop as a cross between a “Concorde, a railgun, and an air hockey table.”

Musk released a 57-page document outlining an alpha-stage plan for Hyperloop. The South African entrepreneur also fielded questions from reporters around the world on his vision. The document contains pictures of what the pneumatic-tube system and vehicles might look like, possible approaches to various engineering problems, an itemized budget for a California route totaling just $6 billion, maps of proposed routes and network expansions, and technical diagrams and explanations of various technologies that would be integrated into the system.


A Hyperloop connecting San Francisco and Los Angeles could be constructed within seven to 10 years for $6 billion, Musk said, adding that California’s proposed $70 billion high-speed rail system is a mistake. Musk’s financial model would price a one-way ticket from San Francisco to Los Angeles at about $20 per person. The trip would last just 35 minutes.

Musk explained that Hyperloop is a “low priority” for him now as he is busy with SpaceX and Tesla Motors, and that is partially why the project relies on an open-source model. The project is also open-source, Musk said, to attract interest and leverage ideas from as many people as possible. During the press conference, he repeatedly emphasized the importance of someone developing a prototype of the technology to work out engineering problems — a task he said he will probably take on himself.

While the released document shares many ideas developed by SpaceX and Tesla Motors engineers, nothing is set in stone yet, Musk said, and that’s why it’s an alpha. Government Technology participated in Musk’s telephone media question and answer session as Musk explained the project as it now stands. Here is an abridged transcript of the call:

What will the experience of being on Hyperloop feel like?

It would actually feel a lot like being in an aeroplane. There would be initial acceleration, and once you’re traveling at speed, you wouldn’t really notice the speed at all. It would be extremely smooth, like you’re riding on a cushion of air. You wouldn’t feel lateral acceleration because the pod would bank against the side of the tube, so the g-force would always point down. The maximum g-force would be around half a G of increase, so that’s also comparable to what you would see on an aeroplane and far less than what you would see on, say, a rollercoaster. It should just feel real super smooth and quiet. And obviously there’d never be any turbulence or anything.
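As a rough back-of-envelope illustration of the banking physics Musk describes (my own numbers, not figures from the Hyperloop alpha document), a lateral-acceleration budget of about half a g fixes both the bank angle and the minimum curve radius at top speed:

```python
import math

# Back-of-envelope check (illustrative, not from the alpha document):
# if the pod banks so the net g-force points at the floor, the bank
# angle and minimum curve radius follow from a ~0.5 g lateral budget.

G = 9.81                  # gravitational acceleration, m/s^2
V_TOP = 760 * 0.44704     # 760 mph converted to m/s
A_LATERAL = 0.5 * G       # ~0.5 g comfort limit

bank_angle_deg = math.degrees(math.atan(A_LATERAL / G))
min_radius_km = V_TOP ** 2 / A_LATERAL / 1000

# A full-speed curve needs a radius on the order of 20+ km, which is
# why the route has to be kept so straight.
```

This works out to a bank angle of roughly 27 degrees and a minimum curve radius of a little over 23 km at full speed, consistent with the proposal’s emphasis on a nearly straight alignment.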

I heard you say this could never crash. How could that be in earthquake country?

Well, obviously “never” is a very strong word. It’s extremely difficult, I suppose. Unlike an aeroplane, it’s not moving in three dimensions. It’s not going to fall out of the sky, nor can it really be derailed as a train can. The thought I had was in the pylons upon which the tube is mounted to have earthquake dampeners sort of similar to those sort of things you have in buildings in California. They’re like basically shock absorbers and they have two laterally mounted and one vertically mounted in the post. Now, there’s going to be some earthquakes that are gigantic that can overcome the dampeners, but then we have that same problem in buildings, so if LA falls down, I guess Hyperloop will, too. But relative to say, a train, it should be quite a bit safer.

What is the likelihood of this actually being built?

I’ve been thinking about that and I’m somewhat tempted to make at least a demonstration prototype, and I’ve come around a bit in my thinking here to create a sub-scale version that’s operating and then hand it over to somebody else. I think some of the more difficult things are just ironing out the details at a sub-scale level. I think I’ll probably end up doing that. It just won’t be immediate because in the short term I’m focused on SpaceX and Tesla.

If somebody else goes and does a demo, that would be really awesome. And I hope somebody does, but if it doesn’t look like that’s happening or it looks like that’s not happening in the right way, then I would. I don’t really care much one way or the other if I have any economic action here, but it would be cool to see a new form of transport happen.

While planning this project with SpaceX and Tesla engineers, did you talk a lot about power consumption?

Quite a fundamental question is, ‘Can you contain enough energy in a battery pack in a pod to pump the air from front to rear?’ And we can. In fact, if we just use some version of the [Tesla] Model S motor, maybe a few of them in series, and the Model S battery pack, assuming today’s current technology, we can make it work.

[A more technical outline of Hyperloop’s power system can be found in the Hyperloop Alpha pdf.]

There has been talk of using solar power for Hyperloop as sustainability is one of its core features. Is this a viable option?

There’s actually way more surface area on the top of the tube than you really need. If you did actually put solar panels on the whole thing, you would have to dump the power somewhere, because you would have more than you can consume.

Why can Hyperloop be supported by pylons whereas high-speed rail requires a much stronger foundation?

It’s a weight thing. This was designed to be super-light and trains are just amazingly heavy. They don’t try very hard to make trains light. Yeah, [laughing] trains are heavy. This is designed more like an aircraft.

How many people and for how long did they work on coming up with this project?

There were probably in total a little over a dozen people working on it, but it was very much a background task. This was not something that was anybody’s full-time job. I started thinking about it maybe two years ago and then started involving others about ten months ago. And we’ve just been batting it around in the background, and in the last basically few weeks we did allocate some full-time days to it.

What do you think of California’s proposed high-speed rail project?

Um, I don’t think we should do the high-speed rail thing, because it’s currently slated to be roughly $70 billion, but if one ratio is the cost at approval time versus the cost at completion time… you know, most large projects escalate quickly… I think it’s going to be north of $100 billion. And then it seems it’s going to be less desirable to take that than to take a plane, so California taxpayers aren’t just going to have to pay $100 billion, they’re also going to have to maintain and subsidize the ongoing operation of this train for a super long time as kind of California’s Amtrak. That just doesn’t seem wise for a state that was facing bankruptcy not that long ago.

 

 

Microsoft’s Windows 8.1 update to launch in October

Nancy Blair, USA TODAY 1:16 p.m. EDT August 14, 2013

 

Microsoft today said it will start rolling out its much-anticipated Windows 8.1 update in October.

Windows 8.1 will start rolling out as a free update worldwide on Oct. 17 at 7 a.m. ET (or midnight on Oct. 18 in New Zealand, Microsoft notes in a blog post).

Consumers can get the update through the Windows Store. It will also be available at retailers and on new devices on Oct. 18.

Microsoft has taken some critical knocks for the Windows 8 operating system, which it built to work equally well on traditional computers and touchscreen devices. Sales have been less than stellar for machines running Windows 8, which represented a dramatic overhaul of the Windows OS.

Windows 8.1 has been available as a preview since earlier this summer. Among other things, it brings the ability to boot up your PC in the traditional-looking desktop rather than the colorful tiles that are a hallmark of the new OS.

 

XP’s retirement will be hacker heaven

Hackers will bank bugs until after Microsoft retires Windows XP in April 2014; expect attacks, say security experts

Gregg Keizer

August 12, 2013 (Computerworld)

 

Cyber criminals will bank their Windows XP zero-day vulnerabilities until after Microsoft stops patching the aged operating system next April, a security expert argued today.

Jason Fossen, a trainer for SANS since 1998 and an expert on Microsoft security, said it’s simply economics at work.

“The average price on the black market for a Windows XP exploit is $50,000 to $150,000, a relatively low price that reflects Microsoft’s response,” said Fossen. When a new vulnerability — dubbed a “zero-day” — is spotted in the wild, Microsoft investigates, pulls together a patch and releases it to XP users.

If the bug is critical and being widely used by hackers, Microsoft will go “out-of-cycle,” meaning it will issue a security update outside its usual monthly Patch Tuesday schedule.

But after April 8, 2014, Microsoft has said it will retire Windows XP and stop serving security updates. The only exceptions: Companies and other organizations, such as government agencies, that pay exorbitant fees for custom support, which provides critical security updates for an operating system that’s officially been declared dead.

Because Microsoft will stop patching XP, hackers will hold the zero-days they uncover between now and April, then sell them to criminals or unleash them themselves on unprotected PCs after the deadline.

“When someone discovers a very reliable, remotely executable XP vulnerability, and publishes it today, Microsoft will patch it in a few weeks,” said Fossen. “But if they sit on a vulnerability, the price for it could very well double.”

Minus any official patching from Microsoft, XP zero-days and their associated exploits could remain effective for months, maybe even years, depending on how well security software detects and quarantines such attacks.

If Fossen’s thesis is correct, there should be signs of bug banking, most notably a sharp reduction in the number of publicly-disclosed or used-in-the-wild XP vulnerabilities during the fourth quarter of 2013 and the first quarter of 2014.

“[Hackers] will be motivated to sit on them,” Fossen stressed.

There really aren’t precedents to back up Fossen’s speculation, he acknowledged, because the last time Microsoft pulled the plug on an edition was July 2010, when it retired Windows 2000. But according to metrics firm Net Applications, at the time Windows 2000 powered just four-tenths of one percent of all PCs.

Windows XP will have a much larger share when it’s retired next year: Based on XP’s current rate of decline, Computerworld has projected that the old OS will still run between 33% and 34% of the world’s personal computers at the end of April 2014.

 

That would be 80 times the share of Windows 2000 when it retired.
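The “80 times” figure follows directly from the numbers in the story; a quick sanity check:

```python
# Figures taken from the article itself: Windows 2000 held ~0.4% of PCs
# at retirement, while XP is projected at 33-34% in April 2014.
win2000_share = 0.4
xp_share_low, xp_share_high = 33.0, 34.0

ratio_low = xp_share_low / win2000_share    # low end of the projection
ratio_high = xp_share_high / win2000_share  # high end of the projection
```

Either endpoint of Computerworld’s projection gives a ratio in the low-to-mid 80s, so “80 times” is, if anything, slightly conservative.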

But even with Windows 2000’s minuscule share when it left support, there were reports that an edition-specific zero-day was created and sold.

“I heard rumors of a new zero-day being found and sold after the support period expired [for Windows 2000],” said HD Moore, creator of the popular Metasploit penetration testing toolkit and the chief security officer of security company Rapid7. “But there were few if any examples that ended up in the public eye.”

 

Moore agreed with Fossen that XP bugs would be more valuable after April 2014, but contended that all Windows vulnerabilities would jump in value.

“Something more common [three years ago] was backporting new security advisories into functional exploits on Windows 2000,” said Moore in an email. “Every time a server-side vulnerability was found in Windows XP or 2003 Server, quite a few folks looked at whether this would also work against Windows 2000. My guess is that the retirement of Windows XP will result in all Windows vulnerabilities being of slightly higher value, especially given the difference in exploit mitigations between XP and newer platforms.”

It’s far easier to exploit flaws in Windows XP than in newer editions, such as Windows 7 and Windows 8, noted Moore, because of the additional security measures that Microsoft has baked into the newer operating systems.

Microsoft has said the same. In the second half of 2012, XP’s infection rate was 11.3 machines per 1,000 scanned by the company’s security software, more than double the 4.5 per 1,000 for Windows 7 SP1 32-bit and triple the 3.3 per 1,000 for Windows 7 SP1 64-bit.

“Windows XP vulnerabilities will be valuable as long as enterprises utilize that version of the operating system,” said Brian Gorenc, manager of HP Security Research’s Zero Day Initiative, the preeminent bug bounty program. But Gorenc also argued that any XP zero-days would be outweighed by higher-priority hacker work.

“Researchers are primarily focused on the critical applications being deployed on top of the operating system,” said Gorenc in an email reply to questions today. “Attackers and exploit kit authors seem to rely on the fact that the update process and tempo for applications are not as well defined as those for operating systems.”

Fossen, convinced that XP would be a big fat target after April 8, wondered whether Microsoft might find itself in a tough spot, and back away from the line in the sand it’s drawn for XP’s retirement.

“If hackers sit on zero-days, then after April use several of them in a short time, that could create a pain threshold [so severe] that people organize and demand patches,” said Fossen.

The consensus among analysts and security experts is that Microsoft will not back down from its decision to retire XP, come hell or high water, because it would not only set an unwelcome precedent but also remove any leverage the company and its partners have in convincing laggards to upgrade to a newer edition of Windows.

 

But a few have held out hope.

“Suppose we get to a date post the end of Extended support, and a security problem with XP suddenly causes massive problems on the Internet, such as a massive [denial-of-service] problem?” asked Michael Cherry, an analyst with Directions on Microsoft, in an interview last December. “It is not just harming Windows XP users, it is bringing the entire Internet to its knees. At this time, there are still significant numbers of Windows XP in use, and the problem is definitely due to a problem in Windows XP. In this scenario, I believe Microsoft would have to do the right thing and issue a fix.”

Jason Miller, manager of research and development at VMware, had some of the same thoughts at the time. “What if XP turns out to be a huge virus hotbed after support ends? It would be a major blow to Microsoft’s security image,” Miller said.

 

Another option for Microsoft, said Fossen, would be to take advantage of a post-retirement disaster to do what it’s been doing for years: push customers to upgrade.

“They might also respond with a temporary deal on an upgrade to Windows 8,” said Fossen, by discounting the current $120 price for Windows 8 or the $200 for Windows 8 Pro. “Then they could say, ‘We’re aware of these vulnerabilities, but you should upgrade.'”

 

China’s Emerging C4ISR Revolution

http://thediplomat.com/2013/08/13/chinas-emerging-c4isr-revolution/?all=true

August 13, 2013

By Shane Bilsborough, ATIC

 

China’s military modernization has given rise to an enormous Western literature dissecting its scope and progress. Despite this boom, many analysts have paid relatively little attention to recent advances in the People’s Liberation Army’s (PLA) command, control, communication, computer, intelligence, surveillance, and reconnaissance (C4ISR) capabilities.

The PLA’s growing complement of manned and unmanned aircraft, reconnaissance satellites, and sophisticated ground-based infrastructure comprises the operational foundation of China’s emerging network-centric military. It is also the means by which better-known systems, such as the DF-21D “carrier-killer” anti-ship ballistic missile or the J-20 stealth fighter, could actually fulfill their intended roles during a major regional contingency.

From recent developments in China’s C4ISR infrastructure, it is clear that the PLA is well on its way to becoming a sophisticated global military possessing many of the same C4ISR capabilities enjoyed by U.S. forces, although it remains to be seen whether organizational barriers will short-circuit this trend.

 

Airborne C4ISR

Much if not most Chinese thinking on C4ISR and military modernization stems from analysis of the United States’ military performance in recent conflicts. For example, learning from the United States’ successful employment of specialized flying C4ISR systems, such as the E-3 Sentry and the E-8 JSTARS, the PLA has identified Airborne Early Warning Command and Control (AEWC&C) aircraft as central to waging war against intervening naval and air forces. According to multiple Chinese analyses, a single airborne AEWC&C aircraft is the operational equivalent of roughly ten ground-based systems of comparable sophistication. In addition to facilitating real-time intelligence gathering, border surveillance, and command and control, these systems are expected to make PLAAF and PLAN fighter aircraft less susceptible to detection by affording them enhanced situational awareness without using their own radar systems. Historically, this capability has afforded the U.S. Air Force significant advantages in beyond-visual-range engagements, an edge that may now be lost.

In keeping with the Chinese analyses of their significance, the PLAAF is already fielding advanced systems of this type. The PLAAF’s current top-of-the-line AEWC&C system, the KJ-2000, is believed to be one full generation ahead of U.S. E-3 AWACS and E-2 Hawkeye aircraft. Among other advancements, the KJ-2000 boasts an indigenously produced phased-array radar capable of tracking sixty to one hundred aerial targets simultaneously at a distance of up to four hundred and seventy kilometers. Although somewhat less technologically sophisticated, the PLAN’s Y-8J AEW system affords China’s naval air forces a similar upgrade in situational awareness and is reportedly capable of detecting objects as minuscule as a submarine periscope within its effective range of up to one hundred eighty-five kilometers.

The United States’ unmanned C4ISR capabilities are also being replicated by the PLA. While information beyond mock-ups displayed at China’s annual Zhuhai airshow is sparse, recent disclosures by Chinese official sources suggest unmanned aerial vehicles (UAVs) will play a major role in China’s emerging C4ISR architecture. According to a PLA statement posted online in July 2011, a ground operator controlled a UAV called the Silver Eagle that participated in South China Sea naval exercises. The UAV reportedly disrupted communications and responded to red-team countermeasures while acting as a node for a PLA communications network.

Other modern Chinese UAVs, such as the Guizhou Aircraft Industry Corporation’s Xianlong long-range UAV and the Beijing University of Aeronautics and Astronautics’ BZK-005, are believed to be capable of loitering over a combat zone for roughly forty hours, much like the U.S. Global Hawk. The Chengdu Aircraft Design Institute also appears to be developing its own indigenous Global Hawk, the Long Haul Eagle, which was first revealed in 2008. These systems will greatly enhance the PLA’s Broad Area Maritime Surveillance (BAMS) while adding new capabilities.

 

Space-based C4ISR

China has made still greater strides in its space program and is emerging as a leading space power. Senior PLA and Chinese Communist Party (CCP) leaders have identified space technology as a national priority and allocated significant resources to improving China’s space-related research, development, and launch infrastructure. As part of the PLA’s integrated civil-military space program, counter-space technologies and systems have been a parallel area of focus following China’s landmark 2007 anti-satellite test.

Recent years have seen a number of major advancements in China’s C4ISR-related space development programs. The Beidou-2 satellite series, China’s indigenous GPS alternative, has already achieved full regional coverage and is on schedule to achieve global coverage by 2020. With at least 13 successful launches since April 2006, the Yaogan series of electro-optical, synthetic aperture radar, and electronic intelligence satellites has also proven a major success. Apart from these and other satellite programs, such as the somewhat more mysterious Shijian series, China has also successfully modernized and expanded its space launch infrastructure. Under the executive authority of its General Armaments Department, the PLA and its civilian partners now operate three satellite launch centers at Taiyuan, Xichang, and Jiuquan, with a fourth large complex under construction at Wenchang on Hainan Island.

China’s great leap forward in space and airborne C4ISR capabilities has already impacted the Asia-Pacific military balance. If current trends in technological development, procurement, and satellite launch capacity hold, the next 15-20 years will see the PLA benefit from vastly improved geolocation and precision strike capabilities, persistent global satellite surveillance, and a survivable military communications and data-link architecture. Concurrent improvements in counter-space capabilities will also put U.S. and allied space, air, and sea-based assets at risk, seriously complicating air and naval access to the region.

 

Organizational Issues

Despite its ongoing technological transformation, the PLA still faces serious obstacles before it can take full advantage of its modern C4ISR systems and capabilities. Of these challenges, cultural and organizational problems have proven particularly stubborn.

 

Many PLA units have proven reluctant to adopt cutting-edge communication and ISR systems due to endemic interoperability problems and lack of experience with modern military technology. Communication and information sharing problems continue to arise in part due to a growing technological mismatch between mainline PLA units, which still employ outdated equipment, and their far less numerous but more sophisticated counterparts. On a broader level, the PLA’s constituent services, and even operational units within the same service, use different and incompatible models and generations of equipment that severely diminish their overall military effectiveness.

In part due to deep-seated inter-service rivalry, PLA joint training still leaves much to be desired. Often, exercises are only joint for certain segments rather than their entirety. Worse, those joint training efforts that do take place are often rudimentary or unrealistic. Consider that one Chinese article praises a joint exercise in which Navy units practiced ship loading and unloading while ground forces practiced loading aircraft onto railcars for the PLAAF. Another document touts the “jointness” of an exercise in which top service leaders communicated via teleconferencing. In addition, as evidenced in these and other exercise reports, no standard metric exists for evaluating joint performance either in C4ISR or other military spheres.

 

Conclusion

Without breaking down the technological and organizational barriers between its constituent services, the PLA will not be able to implement the “system-of-systems” approach to anti-access operations and C4ISR that its leadership envisions. The PLA has taken some halting actions towards promoting joint operations and information sharing, such as the relatively decisive step of appointing Xu Qiliang, a PLAAF general, to command the deployed force during joint exercises in 2007. This was possibly the first time a PLAAF general was given such a high-profile command, although it hardly constitutes a breakthrough.

Assuming its Central Military Commission successfully limits the tendency of the ground forces to assert control over military doctrine and planning during the next ten to fifteen years, China’s leadership will find that the PLA already possesses most if not all of the C4ISR systems and integrative technologies necessary to complete the PLA’s transformation into a 21st century force.

 

Shane Bilsborough is an intelligence analyst at the Advanced Technical Intelligence Center (ATIC).

 

 

Springfield UAS center names director

Ohio and Indiana efforts will be coordinated by former Dayton Development Coalition official.

Dayton Daily News

Posted: 4:30 p.m. Wednesday, Aug. 14, 2013

By Andrew McGinn

Staff Writer

SPRINGFIELD —

 

The new Springfield-based office that will act as the official hub of unmanned aerial systems testing in Ohio and Indiana has named its first director.

 

Dick Honneywell, a retired Air Force Reserve colonel, will lead the Ohio/Indiana UAS Center and Test Complex at Springfield’s Nextedge Applied Research and Technology Park, Ohio Gov. John Kasich announced this week.

Honneywell, who most recently served as vice president of aerospace at the Dayton Development Coalition, will oversee the center that’s vying to become one of six national test centers for unmanned aircraft that the Federal Aviation Administration is expected to name later this year.

The FAA will use those six centers to study how best to integrate so-called drones into manned airspace.

An industry report earlier this year predicted that Ohio stands to gain more than 2,700 new jobs by 2025 in the blossoming UAS industry. Drones also are expected to one day play a big role in precision agriculture.

The Ohio Department of Transportation on July 1 began leasing 2,060 square feet of office space for the center at Nextedge, located along U.S. 40 just east of Springfield. The two-year lease will cost a total of $70,000.

It’s hoped the center led by Honneywell will act as a magnet for new federal, military and commercial research and testing of UAS.

“Dick brings a tremendous combination of technical and management expertise, as well as a broad knowledge of aerospace systems from his work in the Air Force and private sector,” Kasich said in a statement.

Ohio and Indiana have pooled their efforts to become an FAA test site, with Ohio offering such research powerhouses as the Air Force Research Laboratory at Wright-Patterson Air Force Base and NASA’s Glenn Research Center in Cleveland.

Indiana, on the other hand, has valuable airspace, including the 55,265-acre Jefferson Proving Ground in the southeast part of the state, where from 1941 to 1995, the Army tested 25 million rounds of munitions.

Springfield was deemed a central location to manage the combined effort.

 

DHS Awards $6 Billion Cybersecurity Contract To 17 Vendors

By: Judi Hasson

08/14/2013 ( 8:00am)

http://www.hstoday.us/industry-news/general/single-article/dhs-awards-6-billion-cybersecurity-contract-to-17-vendors/df76d41031a95bb4f3c67f3adbb084d2.html

 

The Department of Homeland Security on Monday awarded a $6 billion contract to 17 companies to protect the government against cybersecurity threats.

The contractors included big names such as IBM, Lockheed Martin Corporation, General Dynamics Corporation, Computer Sciences Corporation and Northrop Grumman Corporation. DHS tapped the companies for the Continuous Diagnostics and Mitigation (CDM) Program.

The CDM Program will provide specialized information technology tools and services to combat cyber threats in the civilian “.gov” networks and various network endpoints, including mobile devices. The CDM program, also known as Continuous Monitoring as a Service (CMaaS), shifts the government’s cybersecurity posture from a compliance reporting model to a real-time approach to combating threats, according to the General Services Administration (GSA).

Deputy Under Secretary for the National Protection and Programs Directorate (NPPD) Suzanne Spaulding said in a statement released Aug. 13 that federal departments and agencies will be able to enhance their cybersecurity assessments by implementing automated network sensor capacity and prioritizing risk alerts.

“Results will feed into agency-level dashboards that produce customized reports that alert information technology managers to the most critical cyber risks, enabling them to readily identify which network security issues to address first, thus enhancing the overall security posture of agency networks,” Spaulding said. “Summary information from participating agencies will feed into a central Federal-level dashboard, managed by DHS’ National Cybersecurity Communication and Integration Center, to inform and prioritize cyber risk assessments across the Federal enterprise and support common operational pictures that provide cybersecurity situational awareness to our stakeholders,” she said.

“This significant contract award is designed to support federal civilian networks and the extensive number of cybersecurity requirements for any federal custom and cloud application over the life of the contract, and will be funded through each participating department and agency, not solely by DHS,” Spaulding said.

“The CDM program is a huge step forward for government security,” said Kenneth Kartsen, vice president and head of federal business at McAfee. “The necessary but limited and largely manual check-the-box approach of FISMA (Federal Information Security Management Act) was like looking through a rear-view mirror. By contrast, the CDM program illustrates the real progress DHS and the government are making in cyber security.”

The first phase of the $6 billion program focuses on four functional capabilities: hardware asset management, software asset management, configuration settings management, and vulnerability management, “which are baseline capabilities to protect data,” according to a description of the program by the US Computer Emergency Readiness Team (US-CERT).

So far, DHS has committed $185 million to kick-start the program. GSA will manage the contract on behalf of DHS and will charge agencies a 2 percent fee for using the blanket purchase agreement.

“Our success with DHS demonstrates that IBM’s $4B annual investment in cyber security and security analytics research puts IBM in a unique position to help government agencies meet evolving cybersecurity threats,” said Anne Altman, General Manager of IBM’s US Federal business.

“IBM will draw from decades of experience working with federal agencies and worldwide clients and our own internal experience in securing the worldwide networks used by our 400,000 plus employees.”

A spokesman for Lockheed Martin said the company will draw upon its years of investment in continuous monitoring technology and partnerships with members of the Lockheed Martin Cyber Security Alliance for this service.

“Beginning in 2009, Lockheed Martin leveraged advances within its NexGen Cyber Innovation and Technology Center Lab and the Lockheed Martin Cyber Security Alliance to implement a comprehensive solution,” the company said in a statement.

 

The winning companies are:

Booz Allen Hamilton

CGI

CSC

DMI

DRC

GDIT

HP

IBM

KCG

Kratos

Lockheed Martin

ManTech

MicroTech

Northrop Grumman

SAIC

SRA

Technica

 

Why Feds Are Embracing Agile

http://www.informationweek.com/government/enterprise-applications/why-feds-are-embracing-agile/240159725

Roger Baker | August 14, 2013 09:06 AM

Numerous federal agencies are moving to Agile software development methods for some or all of their IT systems development projects. In an era of tightening federal budgets and increased demand on technology to help meet mission requirements, agencies are searching for ways to deliver critical mission functionality more quickly and with less risk. For a number of agencies, Agile has become the answer.

On its face, the case for Agile is straightforward: Break the software development process into a series of short “sprints,” each of which delivers on a small portion of the requirements of a system. This modular approach enables (and encourages) frequent delivery of new functionality to end users, and facilitates (even demands) user participation and feedback during system creation. In contrast, the “Waterfall” development approach used traditionally within government requires users to be able to fully describe what they want in a system up front and to wait years until the system is finished.

Agencies typically adopt Agile to avoid large-scale failures in systems development programs. The Department of Veterans Affairs (VA), an early adopter of Agile in the federal government, moved to Agile in 2009 for a critical new system (the New GI Bill) when the department was failing on much of the rest of its development portfolio. As a result, VA successfully delivered its first new large-scale system in years, and decided to adopt Agile for the development of a number of other critical systems.

Agencies are also moving to Agile to better ensure that the system being developed actually meets the needs of the mission. Programs using Agile development provide customers with early production versions of the product to use and critique, ensuring customer involvement and buy-in. More importantly, because change happens, Agile’s frequent releases provide the ability to rapidly respond to changing mission priorities, customer preferences, or even requirements imposed by new laws.

Critical to today’s federal environment, Agile also cuts system development costs. Frankly, this can be the hardest to justify. The initial estimates for the cost to develop a system using either Waterfall or Agile are likely to be the same. Logically, if both processes work as well in practice as they do in theory, either process should result in the same system for much the same price. In reality, metrics show that incremental programs (including Agile) successfully meet their delivery commitments at a rate nearly three times that of Waterfall. In my experience, this equated to on-time delivery jumping from under 30% to over 80% for a $1 billion systems development portfolio.

Using Agile for systems development frequently has an immediate positive impact on mission results. By delivering and then improving production versions of a system early in the development cycle, Agile programs allow the agency to begin realizing the benefits of the new system to their missions much earlier. And with system users intimately and continually involved in its design and development, the end solution better addresses their real-world requirements, allowing them to work more productively.

Finally, using Agile can help improve the position of the CIO and the IT organization in the agency. With daily active engagement between users and IT, and frequent on-time delivery of new, mission-prioritized system functionality, customers start to see IT as a full, essential and productive partner in accomplishing the agency’s mission. And that has substantial implications during the budget process, during resource discussions, and on the agency’s willingness to give more authorities to the CIO.

After all, IT is an investment in improved mission effectiveness. If agency leaders see that investment returning frequent, reliable, positive results, they’re going to look for more ways to invest.

Roger Baker is chief strategy officer for Agilex, a leading provider of mission and technology solutions to the federal government. He was previously CIO for the Department of Veterans Affairs from 2009-13 and served as CIO for the Department of Commerce from 1998-2001.

 

What the NSA’s Massive Org Chart (Probably) Looks Like

http://www.defenseone.com/ideas/2013/08/what-nsas-massive-org-chart-probably-looks/68642/

By Marc Ambinder

August 14, 2013

Want to understand how an organism really works?  Take a look at its plumbing. Figure out where the pipes fit together. That’s the approach I take to national security and that’s the spirit behind this look at the structure of one of the most important institutions in U.S. intelligence: the National Security Agency.

Some intelligence organizations, such as the National Reconnaissance Office and the National Geospatial Intelligence Agency, have declassified most of their organizational charts. The NRO develops, launches and controls spy satellites; the NGA analyzes and distributes imagery. For these agencies, the plumbing matters less than what flows through the pipes, which is highly classified.

But the NSA, with its triple mission — break codes, secure data, collect signals intelligence — has not made its structure public.  Even by the standards of U.S. intelligence agencies whose existence was declassified much later, the NSA’s organization chart is largely impermeable to outsiders. The best of its chroniclers, like Jeff Richelson, James Bamford, Bill Arkin and Matthew Aid, have managed to collect bits and pieces of open source data, but many senior intelligence officials who don’t work for NSA still have only a vague idea of what signals intelligence collection entails, and even fewer understand the NSA bureaucracy. The map to the NSA’s inner sanctum is generally given only to a select few members of Congress and their staff.

In the interests of transparency and in an effort to establish a basis for continued public exploration of the world of intelligence, I’ve cobbled together a rough and incomplete but still rather comprehensive organizational chart of the agency’s operational, analytical, research and technology directorates. With only a few exceptions, the information does not come from inside sources. It builds on the work of the researchers mentioned above and it represents the culmination of a lot of time spent cross-checking government documents and LinkedIn profiles, job postings and agency announcements.

The NSA prefers not to discuss how it works and what it does. Defense One offered NSA the opportunity to review the organization chart and address any national security concerns. “We won’t fact check a chart that appears to be based largely on speculation,” the agency replied through a spokesperson.

Undoubtedly, some of what follows is incomplete and I hope knowledgeable readers will set me straight. 

The NSA has five operational directorates, several administrative directorates and three large operational centers. Each is headed by an associate director, and each associate director has a technical director. They report to the executive director, who reports to the deputy director, who reports to the DIRNSA, which is NSA-speak for Director of the NSA, Gen. Keith Alexander. He’s also the commander of the Defense Department’s U.S. Cyber Command and the Central Security Service, the military signals and cyber intelligence units that contribute personnel to the NSA. The CSS is essentially the NSA.

The NSA’s Foreign Affairs Directorate interacts with foreign intelligence services, counterintelligence centers and the UK/USA and FIVE EYES exchanges, an alliance of intelligence operations between the U.S., UK, Canada, Australia and New Zealand that dates back to 1946. It also includes the Office of Export Control Policy. 

The Information Assurance Directorate is the center of NSA’s cyber warfare and defense program offices.  It’s also responsible for generating the codes that the U.S. uses.

The Signals Intelligence Directorate is the largest functional directorate. It has three subdivisions. One helps determine the requirements of what the NSA calls its customers — other agencies, the president, the military.  The agency’s main analytical centers live here, too.  The super-secret work of SIGINT collecting and offensive cyber warfare is the responsibility of S3, with its many bland sounding and compartmentalized branches.

The Research Directorate figures out how to break codes and how to best penetrate the telecom infrastructure of tomorrow. The Technical Directorate puts everything together. It’s responsible for the infrastructure for everything NSA does.

Two other directorates are responsible for training and human resources and for acquisition and procurement.

The NSA’s three operational centers are its main watch facility, the National Security Operations Center, or NSOC; the National Threat Operations Center, which is the U.S. government’s primary worldwide cybersecurity warning office; and the NSA/CSS Commercial Solutions Center, which interacts with private companies, uses commercial technologies for classified purposes and conducts research on cryptography that the government is willing to share with the public.

Another NSA office is the Special Collection Service, which is run jointly with the CIA and operates classified listening posts from embassies and other special facilities worldwide. The SCS is responsible for NSA listening posts that aren’t inside of U.S. or allied military facilities.

Inside the United States, the NSA has very large off-site campuses in Hawaii, Texas, Utah and Georgia.  In Maryland, it owns and hosts offices in Linthicum, Finksburg, Bowie and College Park, alongside Ft. Meade, its home, and adjacent properties.  There’s an NSA office inside the Cheyenne Mountain Operations Center, where NORAD and NORTHCOM have their backup command center. And NSA has a big presence at Site R, the site of the Alternate National Military Command Center, near Ft. Ritchie, Md.

[Related: The NSA’s New Spy Facilities are 7 Times Bigger Than the Pentagon]

All these sites are connected by an architecture called NSANet, which exists in parallel to the regular telephone switch system. Real-time feeds of SIGINT reports and time-sensitive cyber information can be sent to users anywhere in the world, such as those on Navy ships, using the NSA’s Integrated Broadcast Service.

The NSA uses a bewildering amount of technical tools and databases. You’ve now heard of PRISM, which was revealed in the Edward Snowden leaks. This system collects digital network information from U.S. content providers. 

The NSA also has several tools and databases, including metadata collection, a repository of malicious network signatures and an Air Force/Navy tool that tracks ships in real time.

Here are the other main NSA collection tools and databases:

ONEROOF: Main tactical SIGINT database  (Afghanistan), consisting of raw and unfiltered intercepts

NUCLEON:  Global telephone content database

XKEYSCORE: Collection tool for international metadata

AIRGAP: Priority missions tool used to determine SIGINT gaps

HOMEBASE: Tactical tasking tool for digital network identification

SNORT: Repository of computer network attack techniques/coding

WIRESHARK: Repository of malicious network signatures

TRAFFICTHIEF: Raw SIGINT viewer for data analysis

BANYAN: NSA tactical geospatial correlation database

OILSTOCK: Air Force/Navy tool to track ships in real time

MAINWAY: Telephony metadata collection database

ASSOCIATION: Tactical SIGINT social network database

MESSIAH/WHAMI: Electronic intelligence processing and analytical database

MARINA: Internet metadata collection database

PINWALE: Internet data content database

SURREY:  Main NSA requirements database, where targets and selectors are “validated” by NSA managers

PROTON: SIGINT database for time-sensitive targets/counterintelligence

OCTAVE/CONTRAOCTAVE: Collection mission tasking tool

WRANGLER: Electronic intelligence intercept raw database

ANCHORY: Main repository of finished NSA SIGINT reports going back three years.

AQUADOR: Merchant ship tracking tool

So how do you get NSA to spy on someone? You send in an IN. An IN is an Information Need. The INs go into the collection requirements databases, like SURREY, and are evaluated. Are they time-sensitive? Are they critical, meaning intel is needed within three days? Do they fit in with the National Intelligence Priority Framework, which lays out in broad terms the targets the NSA is working on in any given year?

To invert a phrase from biology, in the intelligence community, function follows form. To begin to understand the NSA from the outside, you need to understand what it looks like from the inside. 

 

By Marc Ambinder // Marc Ambinder is senior Defense One contributor. A Los Angeles-based writer who covers national security, Ambinder is also a contributing editor at The Atlantic and GQ, and writes The Compass blog for The Week. He is the author of “Deep State: Inside the Government Secrecy Industry,” and is working on a history of Cold War nuclear strategy.


The Atlantic

The Government Now Admits There’s an ‘Area 51’



PHILIP BUMP AUG 15, 2013

Newly declassified documents, obtained by George Washington University’s National Security Archive, appear to acknowledge the existence of Area 51 for the first time. Hundreds of pages describe the genesis of the Nevada site that was home to the government’s spy plane program for decades. The documents do not, however, mention aliens.

The project started humbly. In the pre-drone era about a decade after the end of World War II, President Eisenhower signed off on a project aimed at building a high-altitude, long-range, manned aircraft that could photograph remote targets. Working together, the Air Force and Lockheed developed a craft that could hold the high-resolution cameras required for the images, a craft that became the U-2. Why “U-2”?

They decided that they could not call the project aircraft a bomber, fighter, or transport plane, and they did not want anyone to know that the new plane was for reconnaissance, so [Air Force officers] Geary and Culbertson decided that it should come under the utility aircraft category. At the time, there were only two utility aircraft on the books, a U-1 and a U-3. Geary told Culbertson that the Lockheed CL-282 was going to be known officially as the U-2.

The next step was to find a place from which the top-secret aircraft could be flown.


On 12 April 1955 [CIA officer] Richard Bissell and Col. Osmund Ritland (the senior Air Force officer on the project staff) flew over Nevada with [Lockheed’s] Kelly Johnson in a small Beechcraft plane piloted by Lockheed’s chief test pilot, Tony LeVier. They spotted what appeared to be an airstrip by a salt flat known as Groom Lake, near the northeast corner of the Atomic Energy Commission’s (AEC) Nevada Proving Ground. After debating about landing on the old airstrip, LeVier set the plane down on the lakebed, and all four walked over to examine the strip. The facility had been used during World War II as an aerial gunnery range for Army Air Corps pilots. From the air the strip appeared to be paved, but on closer inspection it turned out to have originally been fashioned from compacted earth that had turned into ankle-deep dust after more than a decade of disuse. If LeVier had attempted to land on the airstrip, the plane would probably have nosed over when the wheels sank into the loose soil, killing or injuring all of the key figures in the U-2 project.

That’s the first acknowledged mention of the Groom Lake site, according to Chris Pocock, a British author who’s written extensively about the program and provided his thoughts to the GWU archive. Nor, it seems, had the low-contrast image accompanying that section of the document been published before.

 

Rasmussen Reports

What They Told Us: Reviewing Last Week’s Key Polls

Saturday, August 17, 2013

President Obama may be on vacation this week, but he and his administration still made plenty of news.

Just before the president left for vacation, he announced in a press conference tighter restrictions on the National Security Agency’s domestic surveillance program.  But most voters still don’t trust the government to protect their constitutional rights, and very few expect the program to cut back on monitoring the phone calls of innocent Americans. 

While voters’ views of the president’s leadership have returned from post-election highs to levels seen for much of his first term in office, positive ratings for his handling of national security are at their lowest level in over three years of weekly tracking. 

More voters than ever believe that the United States is not spending enough on the military and national security, but they are also less aware of how much the country actually spends in this area. 

On Tuesday, Obama addressed the public about the crisis in Egypt.  A month ago, 73% said the United States should leave the situation alone. Rasmussen Reports will release new data on the public’s reaction to the situation early next week.

U.S. Attorney General Eric Holder announced a plan to scale back the number of strict minimum prison sentences for non-violent low-level drug offenders, and just over half of Americans are on board with this proposal. One reason for this support is the fact that 55% believe there are too many Americans in prison today.

Americans are still divided on whether or not to legalize marijuana, but most think it should be up to the states to decide, not the federal government.

The federal Department of Housing and Urban Development has announced plans to make the nation’s neighborhoods more diverse, but Americans strongly oppose this idea.  A plurality (49%) believes it is important for most neighborhoods in America to be racially or ethnically diverse, but just 15% think most neighborhoods in this country that are not racially or ethnically diverse are that way primarily because of racism.

Some members of Congress spend their time home during the August recess reconnecting with their constituents, but more voters than ever now believe a group of people randomly selected from the phone book could do a better job addressing the nation’s problems than the current Congress. 

But Republicans are more dissatisfied with their party leaders in Congress than Democrats are.  Thirty-eight percent (38%) of Republican voters think the average Republican in Congress shares their views, while 45% of Democrats believe the average congressional Democrat is about the same as they are.

Republicans and Democrats run even on the Generic Congressional Ballot for the week ending August 11.  Belief that U.S. elections are fair continues to fall and has now reached its lowest level in nine years of surveys.

Meanwhile, consumer and investor confidence remain near their highest levels in several years. 

But just 29% of voters say the country is heading in the right direction.

In other news last week:

— Voters are closely divided over the Justice Department’s decision to contest the merger of US Airways and American Airlines, but nearly half predict airline fares will go up if the merger goes through.

— Forty-one percent (41%) of voters share an at least somewhat favorable view of the health care law, while 53% view it unfavorably.  But 53% also say the law has had no impact on their lives. 

— Voters remain concerned about global warming, but they still express more urgency about the economy and don’t feel selfish for doing so.

— Sixty-eight percent (68%) of voters believe that policies that encourage economic growth are Very Important, compared to 49% who feel that way about policies that encourage economic fairness.  

— Americans are becoming even less enthused about the Internet’s influence on American culture, politics and journalism.

— Some professional athletes just can’t seem to stay out of trouble, and most Americans are noticing.

— Most Americans (81%) believe that their fellow Americans pay too much attention to celebrity news, and 86% say the media covers those celebrities too much.

— Eighty-one percent (81%) of Americans believe that Amber Alerts are at least somewhat effective in abduction cases.
