November 5, 2012

Newswire

 

 

Scientists, engineers push back on travel cuts

Federaltimes.com

October 25, 2012    

By ANDY MEDICI

Research and engineering groups are decrying proposed legislation that would severely cut agency conference and travel spending.

In a Sept. 10 letter to lawmakers and Obama administration officials, four professional associations representing science, technology and engineering specialists said proposed bills to cut travel and conference spending will curtail the ability of agencies and scientists to meet and exchange research and ideas. The associations are: Institute of Electrical and Electronics Engineers; U.S. Public Policy Council of the Association for Computing Machinery; Computing Research Association; and Society for Industrial and Applied Mathematics.

“Government researchers — as well as researchers funded by the government, and the program managers who determine funding — need to be able to participate in these efforts,” the groups wrote.

The pending legislation — the GSA Act, the Data Act and the 21st Century Postal Reform Act — would all limit conference spending by agencies and put in place detailed reporting requirements for any conference spending.

The bills were introduced in response to an April 2 General Services Administration inspector general report detailing lavish and wasteful spending at an $823,000 conference in Las Vegas in 2010 that featured a mind reader and extravagant hotel room parties.

In May, the Office of Management and Budget directed agencies to cut travel spending by 30 percent in 2013 from 2010 levels.

But scientific and technical conferences already have review procedures in place to help control spending, according to the letter.

“We believe that existing oversight and review measures for scientific and technical conferences meet the spirit, if not the letter, of the proposed restrictions,” the organizations wrote.

 

Civilian force taking over gate security at Wright-Patterson

Dayton Daily News

Posted: 12:00 a.m. Monday, Oct. 29, 2012


By Barrie Barber

WRIGHT-PATTERSON AIR FORCE BASE —

A civilian Air Force police force will stand guard at entry gates around Wright-Patterson, replacing a private security contracting firm.

Military security forces have taken over the duties while the Air Force prepares to boost its civilian police officer ranks, according to base spokesman Daryl Mayer.

A final transition date is not set, but this will mark the first time in nearly two decades the base has had a civilian police force, he said.

“We have a few who have completed their training and are doing their duties,” Mayer said.

Unlike private security guards, the civilian force will have full police powers, including the authority to make arrests, conduct security patrols and issue traffic tickets, Mayer said.

“They’ll be able to do more things than simply work the gates,” he said.

In 2007, the Air Force hired Southeastern Protective Services, of Rock Hill, S.C., under a $5.1 million contract to guard base entrances until September. The contract covered between 80 and 100 private security guards, according to the Air Force. The civilian police force will have about the same number of officers.

“The goal is to have the cost be the same or less than previous contract costs,” Mayer said in an email.

Congress passed a law ordering an end, with some exceptions, to contracts that put private security contractors in charge of base security duties. Under the National Defense Authorization Act for Fiscal Year 2003, lawmakers had allowed the Defense Department to hire private security forces to meet increased security needs in the aftermath of the Sept. 11, 2001, terrorist attacks, according to Caitlin Dunn, a spokeswoman for U.S. Sen. Rob Portman, R-Ohio, a member of the Senate Armed Services Committee.

“The mentality at the Pentagon at the time was you wanted to use your warfighters to warfight and shift to contractors those other functions that you didn’t absolutely have to have somebody in uniform perform,” said Claude Chafin, a House Armed Services Committee spokesman. A recent trend at the Defense Department has been to return more roles to military jobs to save money, however, he added. The number of security contractors working at the Defense Department has been gradually reduced since 2010.

Private security companies bid on the base contracts with the understanding that once the contracts concluded, returning service members deployed overseas would fill those positions, said Marty Melton, president of Southeastern Protective Services.

But he said the Air Force has hired some of his former employees for the new jobs.

“They did not hire them all, they did hire some,” he said.

In response, Mayer said the gate security jobs were always slated to be filled over the long term with civilian personnel, not military security forces. Federal employment rules dictate hiring for government jobs, he added. Veterans may receive preference in some instances based on military experience.

“This can give them an advantage over non-veterans,” Mayer said in an email. “However, it does not mean that only veterans will be judged most qualified for a position.”

Under a $128 million, nearly five-year contract, Southeastern Protective Services provided private security guards to 19 bases in the eastern United States. Melton estimated roughly half of a workforce of about 900 found security jobs with the government. Others found work with commercial security businesses, which usually offer lower pay and, in some cases, no fringe benefits at all, he said.

The loss of work has had a “huge impact” for his company and other security contractors, he said.

“When the federal government takes work away from the private sector, that’s wrong … especially now in this economic time,” he said.

 

CQ WEEKLY – VANTAGE POINT
Oct. 27, 2012 – 2:05 p.m.

An Electoral Tie Could Bind the Senate

By Niels Lesniewski, CQ Staff

 
 

One of Washington’s favorite parlor games is conjecturing about the remote possibility of an Electoral College tie. Prognosticators have come up with various maps and scenarios under which the election would result in a 269-269 deadlock, which would vest the responsibility of choosing the country’s leaders squarely in what polls say is one of the least popular institutions in the country — Congress.

There’s little dispute about what would happen in the main event. Next year’s House would choose the president, with each state delegation casting one vote. Unless an unanticipated tidal wave arises Nov. 6 on behalf of Democrats in House races, Republican Mitt Romney stands to win, largely because the system would work to the advantage of smaller, rural states.

One question that has been debated in the past, whether some members would feel pressure to ignore their own party and vote for the winner of the popular vote in their state, seems to have faded as the result of gerrymandering and rising partisanship.

“That kind of extreme partisan polarization seems to rule over everything,” says Richard Arenberg, who worked on Capitol Hill for more than 34 years, including for former Senate Majority Leader George J. Mitchell, a Democrat from Maine. “So many of these members of the House have safe seats.”

“The Republicans have shown incredible discipline within their caucuses,” adds former Sen. Ted Kaufman, a Delaware Democrat who spent decades working in the Senate for Vice President Joseph R. Biden Jr.

The outcome might be less clear in the Senate, which in the event of a tie would be charged with picking the vice president, with each senator in the 113th Congress casting one vote.

One of the foremost experts on Senate rules says he sees no evidence of expedited procedures to avert a filibuster of that process.

“I have read the 12th Amendment to the Constitution, and I don’t see anything that requires the Senate to vote without debate on choosing a vice president,” says former Senate Parliamentarian Robert B. Dove. “Therefore, I don’t see what would stop senators from speaking about who is going to be the vice president and, in effect, forcing a cloture vote.”

While the parliamentarian advises the presiding officer on procedural questions, Dove says, the responsibility to rule rests with the occupant of the chair. In the event of an Electoral College tie, that would be Biden (in his capacity as president of the Senate, until Jan. 20). Dove notes that Democratic Vice President Hubert H. Humphrey disregarded the parliamentarian’s guidance with some regularity.

Kaufman says that while all these arcane procedural options exist, cooler heads should prevail.

“The mood of the American people would be very powerful,” Kaufman says. “I don’t think Congress is in a strong enough position” to go away from the electorate. Kaufman suggested that there might be pressure on the Senate to follow the popular vote, regardless of the Electoral College tally.

Another option would be to acquiesce and appoint the running mate of the president selected by the House. One precedent might be the 2000 election, when Democrat Al Gore accepted the ruling of the Supreme Court in the disputed Florida election results in the interest of national unity.

“In general, party unity within the White House makes sense. But there is no requirement, and we’ve had presidents of one party serve with [vice presidents] of the other,” says Vikram David Amar of the law school at the University of California, Davis. He has written about the history of the Electoral College.

Things could get rough if the Senate is tied 50-50. Based on the amendment’s requirement for a majority of senators, Dove says he thinks that the vice president could not break a tie.

The 12th Amendment was ratified four years after the infamous 1800 election, in which the House broke a tie between Thomas Jefferson and Aaron Burr, picking Jefferson on the 36th ballot. The House chose John Quincy Adams over Andrew Jackson in 1824, but John C. Calhoun won the Electoral College vote for vice president. The only time the Senate selected the vice president was after the 1836 election, when several state ballots had different running mates for Martin Van Buren. It chose Richard M. Johnson, a Kentucky Democrat.

 

 

Great Lakes boaters and shippers likely to face record-low water levels into 2013

Mlive.com

By Dave Alexander | dalexan1@mlive.com

updated October 31, 2012 at 4:07 PM

 

MUSKEGON, MI – Mariners – both commercial and recreational – are being warned that Lake Michigan, Lake Huron and Lake Superior water levels are dangerously close to all-time record lows.

The U.S. Army Corps of Engineers on Wednesday provided estimates that if the current trend of falling Great Lakes water levels continues, the three upper lakes – Michigan, Huron and Superior – will hit historic lows later this fall or in early 2013.

The biggest factor in Great Lakes water levels is the region’s climate, according to Keith Kompoltowicz, chief hydrologist for the Army Corps’ Detroit District Office. A combination of a mild winter with little snow last year and a hot summer with little rain this year has brought about the current lake conditions, he said.

“We are seeing much lower water levels than we had last year and that is the case all over the Great Lakes,” Kompoltowicz said.

Even above-average rainfall in the past month has not changed the direction of lake levels, Kompoltowicz said. The Great Lakes basin will need a typical winter with much more snow than last year to stem the likely record lows, he said.

Low lake levels have a huge negative effect on commercial and recreational boating in many of the ports along Michigan’s Great Lakes coastline. Water level experts from the Army Corps’ Detroit District office point out that local, state and international economies are hurt by low water, along with environmental health, habitat and species preservation in the Great Lakes basin.

As of this week, a water level update from the Army Corps in conjunction with Environment Canada shows the Lake Michigan-Huron basin – treated as the same body of water – just two inches above the lowest October average, recorded in 1964. Lake Superior was at its 1925 record-low October average, federal officials report.

 

Lakes Michigan and Huron dropped another two inches from September to October and are 11 inches lower than this time last year. The two lakes are two feet, four inches lower than October’s long-term monthly average. The Army Corps water level records date back to 1918.

The Great Lakes have historically cycled over the course of years with water levels rising and falling. The historic high-water level was recorded in 1986.

The current historically low water levels are due to several factors, Army Corps officials said. The main ones are natural: below-average precipitation, above-average evaporation rates and a below-average amount of water from recent winter snowpacks, especially in areas feeding Lake Superior.

Some have questioned whether past dredging projects in the St. Clair River have caused Lakes Michigan, Huron and Superior to drain more than they would naturally.

 

New Networked LED Bulb Provides Wireless Light Control

1:30 AM – November 1, 2012 by Wolfgang Gruener – source: Philips


http://www.tomshardware.com/news/light-bulb-led-smartphone-app,18814.html

 

Philips just announced its smart light bulb, which can be programmed via WiFi connectivity and smartphone/tablet apps. Users can adjust the brightness as well as the color.

According to the company, the LED bulbs can display a variety of white tones ranging from warm yellow white to blue white, in addition to virtually any other color desired by the user. The technology used in the bulbs is related to Philips’ now defunct AmbiLight TVs, which allowed a device to adjust the surrounding TV lighting to the screen content.

Called hue, the smart light bulb starter kit is priced at $199 and comes with three bulbs and a network bridge that can control up to 50 bulbs at a time. Each additional bulb is $59 apiece. Via a smartphone, a user can apply a programmable timer, remotely control the lights or, for more extravagant usage models, take a picture of an object and apply its color tone to a bulb.

According to Philips, there are several preset lighting modes that provide tones for specific environments, including calming tones for reading and relaxing.
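
For developers, the bridge exposes a local HTTP/JSON interface that the apps talk to. Below is a minimal sketch of the kind of request involved; the bridge address, the authorized username and the light number are placeholders that would come from your own bridge setup, so treat this as illustrative rather than official API documentation.

```python
# Sketch: set one hue bulb to a bright blue via the bridge's local
# HTTP/JSON interface. Bridge IP, username and light number are
# placeholders from a hypothetical setup.
import json
import urllib.request

BRIDGE = "192.168.1.10"   # hypothetical bridge address on the LAN
USER = "myapiusername"    # hypothetical authorized API user
LIGHT = 1                 # bulb number registered on the bridge

state = {"on": True, "bri": 200, "hue": 46920, "sat": 254}
req = urllib.request.Request(
    f"http://{BRIDGE}/api/{USER}/lights/{LIGHT}/state",
    data=json.dumps(state).encode(),
    method="PUT",
)
with urllib.request.urlopen(req) as resp:
    # The bridge answers with a JSON list of per-field success/error entries.
    print(resp.read().decode())
```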

 

Avoiding Financial Armageddon at the Post Office

The Atlantic
By Jeff Jordan

The USPS is an analog business being rapidly consumed by digitization. Here’s a plan to update and rescue it.

 


It’s election season. And since Candy Crowley didn’t use my question in the second debate, I thought I’d ask it here: “President Obama and Governor Romney, the United States Postal Service is forecast to lose $5.5 billion in 2012 and has also defaulted on scheduled payments of $11 billion. What would you do to fix it?”

Since the debates are now over, let me take a shot at answering it.

We have a thesis at Andreessen Horowitz that “software is eating the world,” such that analog businesses in a wide variety of sectors are being crushed by rapid digitization. In a recent piece, I discussed how this was playing out in retail, where market share gains by online retailers at the expense of offline retailers are threatening the long-term viability of many offline merchants due to their high operating leverage. This same dynamic is playing out in communications. The USPS is an analog business being rapidly consumed by digitization.

According to the Postmaster General, “The core function of the Postal Service is the physical delivery of mail and packages…to every address in America.” Physical = analog. Here is the USPS product mix in 2008, just four years ago:

[Chart: USPS product mix by revenue, 2008]
It was an analog cornucopia. Half of their revenue (and even more of their profits) came from first class mail, but this personal and business correspondence is being rapidly replaced by digital email, texts, social networks, and online statements and bill-pay. Similarly, catalogs, magazines and newspapers are being replaced by commerce and content websites, supported by email marketing. I had a front row seat at eBay as money orders started getting replaced by PayPal. The only growth business that USPS has is packages due to the explosion of e-commerce. But unfortunately, this business has been relatively small for them (representing only about 14% of 2008 revenue) and highly competitive vis-a-vis UPS and FedEx.

So what has happened to the volume of USPS deliveries? It’s been decimated:

[Chart: USPS delivery volume over time]
I have to give the USPS management credit–they saw pretty early on that a freight train was coming right at them. When I was managing eBay in the early 2000s, the USPS management team organized a session with select Valley executives on what they could do to mitigate potential disruptions to their business due to digitization (I was likely invited because the eBay community in aggregate was one of their largest customers).

Any business that encounters disruption on this scale needs to respond decisively to remain viable. And the Postal Service is essentially a business: They are “an independent establishment of the executive branch that does not receive tax dollars for its operations.” But unlike privately owned businesses, “…the Postal Service is nevertheless restricted by laws that limit its ability to control costs and grow revenue in the way a business would.” (2011 Annual Report)

What are some of these restrictions? The USPS…

• Has a “universal service” obligation that mandates delivery six days per week to a national footprint of 151 million homes and businesses

• Has labor agreements that specify cost-of-living wage increases and contractual benefit plans for employees and retirees

• Faces limits in raising prices

• Is “restricted by law from taking certain steps, such as entering new lines of business that might generate additional revenue…”

USPS management has attempted to navigate these restrictions to mitigate the financial impacts of digitization. They have been trying to control the costs they can, rapidly consolidating mail-processing facilities and adjusting their employee counts down in line with falling mail volumes. And they have raised prices. Unfortunately, the biggest price increases have been in packages and shipping, their most competitive market:

[Chart: USPS price increases by product]
So what’s the net outcome of these actions? Massive and growing losses. Revenue is eroding rapidly as the price increases have only partially compensated for plummeting volume. And they’ve made very modest progress on lowering expenses:

[Chart: USPS revenue and expenses over time]
And unfortunately, their situation is even worse than this. The USPS historically has not accrued for the cost of retiree health benefits; they’ve booked them on an as-spent basis (governmental entities are allowed to do this, unlike the private sector). And the costs they were spending paled compared to the costs they should have been accruing. Congress in 2006 mandated that USPS catch up on these obligations over a 10-year period, a process called “RHB Pre-Funding” (RHB stands for Retiree Health Benefits). They were able to make these Pre-Funding payments between 2007 and 2010, but their economic meltdown has caused them to default on their recent obligations (Josh Barro of Bloomberg provides an excellent explanation of this issue here). The $11 billion in defaults, combined with their operating loss of $5.5 billion, is resulting in the disastrous $16.5 billion 2012 deficit.

USPS management has presented a plan for how they potentially can navigate their way out of this mess (USPS “Plan to Profitability”). It involves significant savings from changes in benefit plans, decreased service levels (e.g., moving from six to five-day delivery), some post office closures, and continued headcount reductions. It also involves continued price increases and identifies some revenue-boosting initiatives. It’s an ambitious plan, but it has a few big issues:

• About half of the financial improvements require “significant legislative change,” and Washington hasn’t been very good lately on that front.

• It ignores the issue of the under-funded retiree health benefit costs.

• Most importantly, it assumes that their revenue erosion moderates significantly going forward. This just won’t happen; that digital “genie” is out of the bottle.

On the current course, we’re destined to see ever-larger losses as revenue continues to fall and expenses are only tweaked. The Postal Service is effectively becoming a taxpayer-supported entity, with ever-growing losses subsidized to maintain the “common good” of physical mail delivery.

But there are alternatives to mitigate the pain of this digital transformation:

Stem the Bleeding:

There are some very highly leveraged ways that you could improve USPS economics. But all require political will, so they’re not likely to happen any time soon:

Re-invent the post office. Operating and staffing 36,000 physical post offices is hugely expensive. And these post offices are being hollowed out, as volume going through the average post office is down 19% in the past four years alone. USPS needs to steal pages from the UPS and FedEx playbooks. Most physical post offices should be closed and replaced with self-service kiosks, supported by proven technology tools. These kiosks could be located in retailers, who would gladly trade a little space in exchange for foot traffic and possibly a revenue share. Closing post offices would save a fortune in operating and staffing costs, and the proceeds from selling the real estate could fund the benefits shortfall.

Deliver mail less often. Your local mail person is delivering 23% less mail to an average location today than four years ago. It doesn’t make sense to keep delivering progressively less mail with the same frequency. Cutting delivery down to three days a week, say Mondays, Wednesdays and Fridays, would halve delivery trips, but the average time to deliver something would increase by only a half day.
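
That half-day figure checks out. If an item becomes ready for delivery at a uniformly random moment, its expected wait until the next delivery day over a weekly cycle with gaps g_i between delivery days is sum(g_i^2) / (2 * sum(g_i)). Here is a back-of-the-envelope sketch of that arithmetic (mine, not the author’s):

```python
# Average wait until the next delivery day, assuming items become
# ready at a uniformly random moment in the week. For a cycle of
# gaps g_i between delivery days, the mean residual wait is
# sum(g_i^2) / (2 * sum(g_i)).

def mean_wait(gaps):
    return sum(g * g for g in gaps) / (2.0 * sum(gaps))

six_day = [1, 1, 1, 1, 1, 2]  # Mon-Sat delivery; Sat -> Mon is a 2-day gap
mwf = [2, 2, 3]               # Mon/Wed/Fri delivery; Fri -> Mon is 3 days

print(f"6-day:  {mean_wait(six_day):.2f} days")                     # ~0.64
print(f"M/W/F:  {mean_wait(mwf):.2f} days")                         # ~1.21
print(f"change: {mean_wait(mwf) - mean_wait(six_day):+.2f} days")   # ~+0.57
```

Half the trips, roughly half a day of added average delay, just as the author claims.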

Restructure comp and benefits. Postal Service employees and retirees are expensive. I calculate current USPS manpower costs at $84k per employee (excluding those RHB pre-funding costs), compared to $67k and $70k respectively for UPS and FedEx workers (who have a higher proportion of skilled jobs like airline pilots and mechanics). It’s not intuitive to me why USPS labor deserves a 21-25% premium over its private sector counterparts. An enormous 80% of USPS costs are labor-related. If they paid the same labor rates as UPS (and that of course is a BIG “if”), their current $5.5 billion operating loss would swing to a multi-billion dollar profit.
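
The swing-to-profit claim is easy to sanity-check. Every figure in the sketch below comes from the paragraph above except total operating expenses, which I assume at roughly $70 billion a year, the approximate scale of USPS spending at the time:

```python
# Rough check of the labor-cost argument. All inputs are from the
# article except total_expenses_bn, which is an assumed ballpark.
usps_per_employee = 84_000   # USPS manpower cost per employee
ups_per_employee = 67_000    # UPS equivalent
labor_share = 0.80           # share of USPS costs that are labor-related
total_expenses_bn = 70.0     # assumed annual operating expenses, $B
operating_loss_bn = 5.5      # 2012 operating loss, $B

savings_bn = total_expenses_bn * labor_share * (1 - ups_per_employee / usps_per_employee)
print(f"labor savings at UPS rates: ~${savings_bn:.1f}B")                       # ~$11.3B
print(f"resulting operating result: ${savings_bn - operating_loss_bn:+.1f}B")   # ~+$5.8B
```

Under those assumptions the loss does swing to a multi-billion dollar profit, which is the author’s point about how labor-heavy the cost base is.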

Selectively raise prices. The cost of mailing a first-class letter in the U.S. was $0.44 in 2011; comparable figures in other countries include Great Britain at $0.74, Germany at $0.77, Japan at $1.06 and Norway at $1.63. Giving the USPS more flexibility to raise selective prices could mitigate near-term financial pain.

Compete:

Package delivery is the only one of USPS’s market segments that is growing. But the packages business is very competitive, and USPS is getting its clock cleaned by UPS and FedEx. In my experience, many businesses start out using USPS given its ubiquity and low rates, but switch to UPS and FedEx as they grow due to their volume discounts and superior service. USPS needs to focus on packages, improving their service and having the latitude to offer volume discounts. They actually should have a cost advantage competing for an incremental package since they already make a trip to each destination each day.

Innovate:

A few private sector companies have in recent years explored trying to digitize your mailbox, providing consumers with a website that serves as a secure virtual mailbox to which mailers could send digital versions of the mail. Physical mail is very expensive for businesses–in addition to billions in postage costs, there are even larger costs in paper, printing and processing. As a business, I’d be willing to pay a fee to deliver electronic correspondence at scale if it saved me big bucks in physical production and delivery. And as a consumer, I would dramatically prefer to check a website for my mail than make the daily trek to the mailbox.

The challenge in building this service is the classic chicken and egg problem–it’s only interesting to consumers if many businesses use it, and it’s only interesting to businesses if many consumers use it. USPS may be uniquely positioned to solve this chicken and egg problem with their scale, universal reach, and relationships with both consumers and businesses. They could charge much lower “postage” rates for this service, but at higher margins as they replace the costs of physical distribution. And they could provide related services like integrated bill payment, archiving and the like. In the process, they also save about a zillion trees!

Privatize:

UPS and FedEx currently deliver packages to every home and business in the U.S., just like USPS. They don’t currently go to every one of these every day, but they are in every neighborhood every day. And both of these businesses are operated significantly more efficiently than the Postal Service and turn a profit. It feels like there would be massive efficiencies from combining USPS operations into one of these companies. Alternatively, there’s a war starting to brew around delivery to your home from players including Amazon and Wal-Mart, and both are world-class at logistics. Amazon alone spent a stunning $4 billion in 2011 on outbound shipping costs; they now spend more on shipping than marketing! Might they be interested in owning delivery to the home?

Software is eating the world, leading to the rapid destruction of many legacy analog business models. It’s a foregone conclusion that the post office will go the way of record and book stores, as bits don’t require physical buildings to be delivered. The Postal Service as we know it is well along the path of being obsolete. Acting on this reality sooner rather than later will save taxpayers very many billions of dollars.

http://www.theatlantic.com/business/archive/2012/10/avoiding-financial-armageddon-at-the-post-office/264221/

 

 

Arizona Proposal Could Dramatically Change Primaries

Governing.com

Posted By Ryan Holeywell | October 29, 2012

 

Arizona could drastically change the way its voters choose elected officials by throwing away the traditional primary system in a move that threatens the power and influence of the state’s political parties.

On Election Day, Arizonans will decide the fate of Proposition 121, which would create a “top-two” primary system.

Instead of candidates being chosen through traditional party primaries, all the candidates in a race – regardless of party affiliation, or lack thereof – would appear on a single ballot. The top two vote-getters would advance to a general election. Similar systems have been adopted in recent years in California and Washington.

Supporters say the reform will allow elected officials to take positions that are in the best interest of their districts, without fear of consequences from the political parties that put them in office. They say it will also have a moderating impact on politicians by making the party bases less influential and instead giving power to the broader electorate. As one author put it succinctly, top-two primaries have the potential to reduce the “jackass quotient” of an elected body.

In Arizona, the push for a top-two primary is being led by Paul Johnson, a former Phoenix mayor and unsuccessful gubernatorial candidate. He says independents are the fastest growing cadre of voters in the state, yet they still have a diminished role in the political process since so many legislative districts have been gerrymandered so that they’re “safe” for one party or the other in the general election. Essentially, the real contest is in the primary, where candidates are accountable to their party base and not their constituents at-large.

“At the end of the day, elected officials are generally a reflection of the people who vote for them,” Johnson says. “If you create an incentive system for them to be re-elected that says ‘you’ll never have to talk to Democrats or Republicans or Independents,’ don’t be surprised when they’re more extreme.”

That creates a system where politicians advance policies that please the party faithful but don’t reflect the wishes of the electorate as a whole. A perfect example of that, Johnson says, is the state’s controversial immigration law, which he considers an attempt to pander to a conservative base.

After the law passed, Russell Pearce, the Republican state senator who was its sponsor, became the first legislator in state history to be removed from office when he lost a recall election. A Republican seen as more moderate took his job.

Johnson says that just as the existing system promotes extremism, reform would have a moderating impact on politicians. He and other supporters of top-two primaries argue that with candidates accountable to the electorate at large – not just their party base – they’ll have to take more reasonable, measured positions.

“Through no organized effort, somewhere near 40 percent of the public has left the two parties and registered as ‘other,'” Johnson says. “That’s a sign of a mass movement. They’re not happy. They don’t believe the existing two-party system is working.”

But critics of the plan — including the governor — have adamantly defended the status quo. Gov. Jan Brewer has called Johnson’s proposal “an attack on our election process itself.”

Critics say the move would reduce voter choice, since in some cases, a general election could actually have two candidates from the same party.

They also warn that the candidates in the general election might not represent the views of a district. For example, a heavily conservative area could have many Republican candidates and just two Democrats in a primary. If, say, five Republicans each took 12 percent of the vote while the two Democrats took 20 percent apiece, both Democrats would advance to the general election even though Republicans in the aggregate had 60 percent of the votes.

The change would affect all federal, state, county and local elections – except for the president and vice president – starting in 2014. Politicians running in the open primaries could declare themselves as members of any political party – or no political party – in a move that also threatens traditional power bases.

Critics of the plan say that aspect is especially troubling, since it would become difficult to know who’s truly a Republican or Democrat. They also suggest that the reforms attempt to solve a problem that doesn’t exist, since independents can vote in party primaries.

Tim Sifert, a spokesman for the state Republican party, says that if the system is implemented, it’s unclear how the parties would respond. Sifert says Republicans could create a closed caucus to select the candidate it will back, but that would be less transparent than the primaries, and it’s a situation the party wants to avoid.

 

Get ready for a seismic shift in federal IT

NextGov

By Joseph Marks

November 1, 2012

 

Four years can change a lot in federal technology. As the Bush administration prepared to hand power to the Obama team in late 2008, no mission-critical government systems were housed in computer clouds, there were fewer than two dozen government-built mobile applications and the nation had never had a chief information officer.

Now new systems are moving to or being built in the cloud every month, there are more than 100 government apps and as for federal chief information officers—there have been two.

Four years ago, the idea that federal information technology contractors should provide three-, six- and 12-month deliverables rather than a fully completed system years down the road—called agile development—was gaining traction among some technology leaders, but was not a governmentwide mandate. And digital transparency was mostly about tools that sifted through email archives to respond to Freedom of Information Act requests, not the 400,000 public data sets agencies have now proactively posted online.

Changes of similar magnitude are sure to reshape federal IT during the next four to eight years, experts say, spurred on by shrinking agency budgets, the developing promise of cloud storage and advances in commercial technology that are influencing what both federal employees and citizens expect of government.

“It really doesn’t matter whether Romney wins or Obama wins, the budget pressures are going to be severe and agencies are going to have to figure out ways to get far more efficient than they are now,” Darrell West, director of governance studies at the Brookings Institution, said in mid-October. “The private sector has figured out ways to get more efficient by automating routine aspects of their organization and they’ve flattened organizations by cutting out a lot of mid-level managers. That involves using technology but it’s also about making organizational changes. You haven’t seen that as much from government yet.”

There is no crystal ball into the future of federal technology, of course, but the short list of emerging trends includes a seismic shift in the role of chief information officers, more collaborative technologies and a workplace that moves beyond ‘bring your own device’ to ‘bring your own everything.’

 

Trend 1: Empowering the CIO

First off, the next presidential term or two likely will shift the balance of power, both within the federal technology shop and between technologists and other workers. This could involve concentrating authority—including budget decisions—in the hands of agency CIOs. It is certain to involve broader responsibility as consumer technology becomes ubiquitous in government, as more employees work remotely on cloud-based platforms and as IT becomes another participant in nearly every government interaction—from videoconferences to shared documents to social media conversations.

More technology spending authority would fall to agency CIOs if Congress passes the Federal Information Technology Acquisition Reform Act, sponsored by Rep. Darrell Issa, R-Calif., chairman of the House Oversight and Government Reform Committee.

Issa’s proposed legislation, among the most significant federal IT rewrites since the 1996 Clinger-Cohen Act, also would shift the responsibility for certain specialized technology purchases to agency-based centers of excellence that have proven track records with those types of contracts.

If the reforms become law, they would lead to more careful monitoring of IT spending and a higher profile for CIOs among agency leadership, says Steve Charles, co-founder and executive vice president of ImmixGroup, a federal technology contractor.

 

“I think over the next 10 years we’re going to see a movement toward the CIO working more closely with the [chief financial officer] and the secretary of the department, all getting their arms around the IT spend,” Charles says. “More centralization means that there would be more adherence to the IT architecture. You’d have fewer pockets of people buying trinkets and toys and doing their own thing.”

In the current structure, only Veterans Affairs Department CIO Roger Baker has control over his agency’s IT budget. Most technology watchers agree that authority has been vital to his success in turning one of government’s most inefficient and wasteful IT shops into one of its more functional.

Any bill introduced into a Congress as beset by partisan divisions as the one beginning in 2013 is sure to face long odds. For Issa’s plan the “elephant in the room,” Charles says, is granting power over agency IT budgets to CIOs and wresting some of that power from House and Senate appropriations committees.

The bill does have a few points in its favor, however. First, there’s Rep. Gerry Connolly, D-Va., ranking member on the Oversight subcommittee that handles federal technology. Connolly has endorsed many of the bill’s underlying ideas and has said he’d like to work with Issa on the final product. Second, there’s a consensus across much of government, and in Congress too, that more efficient technology is about the only way agencies will be able to cut costs—a primary goal of both parties—without significantly reducing services. Smarter technology alone won’t be enough to deal with the steep budget cuts President Obama and congressional leaders agreed to in 2011, but it could prove the least painful part of any budget compromise.

Even if the IT Reform Act doesn’t become law, technology shops are likely to change in other ways, experts say, as a generation of professionals who grew up with Internet-based and cloud-based tools enters government service.

There are also new career paths in IT program management and records management mandated by the Office of Management and Budget. The industry group TechAmerica has advocated for a similar career path in data analysis.

Hiring will become easier as these IT posts become formalized, says Julie Anderson, a former deputy assistant secretary at VA and now chief operating officer at Civitas Group, a consultant for federal contractors. More important, she says, they will help new employees see a clear career path in government, which in turn will make it easier to retain talented staff.

“Particularly when young people come into government, they want that level of professional development and support,” Anderson says. “They may or may not be working for the world’s best manager in government, so the more information they have access to and the more process they understand, the higher the probability that they’ll be able to navigate the bureaucracy and turn their career path into a reality.”

 

Trend 2: Collaborative Tools

The face of federal technology likely will be changed by collaborative systems developed in the consumer sphere that are just now making their way into government.

These include social networks aimed at replacing interagency emails and phone calls with group chats, badges and polls. The General Services Administration already is using Chatter, a workplace social network from Salesforce.com, for some of its internal communications. The National Nuclear Security Administration plans to roll out a workplace social network next spring that ultimately could be adopted by the entire Energy Department.

One of the chief draws of a workplace social network is that problems thrown out to the crowd could pull in innovative answers from unexpected sources, Charles says. That could help break down agency silos and allow the most committed and talented workers to rise more easily through the bureaucracy.

In other cases, workplace collaboration may simply mean employees taking advantage of technologies such as Gchat, Google’s instant messaging program, and Google Drive, the collaborative document writing program, both of which come with the Google Apps for Government email suite that GSA and other agencies recently adopted.

Collaborative document writing, videoconferencing and other adaptive technologies can facilitate cooperation between agencies or with industry and nongovernment groups, says Dan Chenok, vice president for technology strategy at the IBM Center for the Business of Government and former branch chief for IT policy at the Office of Management and Budget.

“At OMB, you deal with negotiated rule-makings where people come together and talk through all the issues and agree on a consensus for what the language ought to look like,” he says. “This is all done in large groups where everyone flies in. That may still need to occur when you’re dealing with some big, complex issues; you’re never going to lose the value of interpersonal relationships. But I think for a lot of issues you’ll begin to see more collaborative development as a way to save money and increase input and buy-in by a larger community.”

The goal for technology managers should be finding ways in which collaborative technologies can enhance communications without becoming a burden or an ill-fitting mandate, says Darren Ash, president of the American Council for Technology, which joins with the Industry Advisory Council in ACT-IAC, an industry-government partnership.

“People have different ways of working within the federal workforce,” says Ash, who is also CIO at the Nuclear Regulatory Commission. “So you have to design an ecosystem that accommodates all the different ways people have of learning and all the different ways people have of collaborating. Agencies need to think not necessarily about the particular collaboration tools but about the culture of their organizations. Will we really be able to foster and nurture collaboration with these tools? Will the culture adapt to them? Will the employees adapt to them and will they use them?”

 

Trend 3: Bring Your Own

The next four to eight years in federal IT will be defined by what workers increasingly lack, experts say, including agency-issued phones, tablets and even laptops for telework. “Bring your own device” policies already have been rolled out at some agencies, mostly on a voluntary basis. As the years go by and agency-issued BlackBerrys and other product lines wear out, those policies will become increasingly common and, in some cases, mandatory, experts say.

The most basic reason is efficiency.

“Apple just released the iPhone 5,” Ash says. “In the Android market it seems like a new model comes along every month. We can’t keep up.”

But there’s a deeper reason too. Just as the government doesn’t provide most employees with cars or Metro passes, some argue there’s no fundamental reason it should spend energy and resources providing them with cellphones when they’re perfectly willing to get one for themselves. In fact, many would rather choose the product they want. It’s a trope of technology conferences that many federal employees carry a government-issued BlackBerry in one pocket and a personal iPhone or Android in the other. Agencies have said they might offset this new employee burden with a stipend.

But resistance to a BYOD policy has been twofold. First, officials don’t trust iPhones and Androids to keep government information secure from hackers. Second, agencies aren’t sure they trust employees to not leave devices full of sensitive government documents sitting unsecured in the back seats of taxis or let children play Angry Birds on them.

 

Employees, for their part, don’t much like the idea of giving security officials power over their devices, which can be the equivalent of a personal diary, a calendar and a family photo album wrapped into one.

Officials are likely to crack these problems over the next administration or two, experts say. At the center will be a robust identity management system—one that knows who you are and where you are and that can keep work and personal information separate, says Karen Evans, e-government director during the George W. Bush administration. On the work side of the device, security officials would be able to control settings and remotely wipe away any data if the device was compromised, says Evans, who is now national director for the U.S. Cyber Challenge, an organization devoted to bringing more technology students into the public and private cybersecurity workforce.

An effective identity management system also would require a mobile-based biometric identifier, such as a fingerprint or iris scan, to reach classified or sensitive information, IBM’s Chenok says. That technology isn’t commercially available and there are no government standards for it, but it should be on its way within the decade, he says.

With BYOD, as with collaborative technologies and other advances, four or eight years might be long enough for the beginning of a culture change, but government moves like a battleship, not a speedboat, warns Charles at ImmixGroup. It will take a lot of pilot projects and a lot of time before the ship of state changes course entirely.

“Look at how long it took for management to start to take seriously the idea that you can legitimately telework,” he says. “That took five or six years at least. The same thing will happen with BYOD or BYO anything. The activities from one shop to the next are so disparate. The idea the whole government is going to wholesale change how everyone operates is ridiculous. The short answer is it’s going to happen piecemeal.”


http://www.nextgov.com/cio-briefing/2012/11/get-ready-seismic-shift-federal-it/59185/

 

Is the United States Militarizing Cyberspace?

Forbes

Sean Lawson, Contributor

11/02/2012 @ 6:00AM

At its most basic level, militarizing cyberspace would involve “giv[ing] a military character to” it, “equip[ping][it] with military forces and defenses” and/or “adapt[ing][it] for military use.” Determining the degree of militarization would also hinge upon determining the degree to which the military as an institution is the primary actor in responding to cybersecurity challenges. During the last four years, we have seen all of these occur alongside a growing role for the military in responding to the nation’s cybersecurity challenges. This is despite assurances that the United States was not militarizing cyberspace.

 

On a number of occasions, civilian and military leaders have assured us that the United States was not militarizing cyberspace and that the military was not taking the lead in responding to cybersecurity challenges. This has at times caused controversy. In the spring of 2010, Senator John McCain (R-AZ) called for a greater role for the Department of Defense in national cybersecurity, a call that the head of USCYBERCOM, Gen. Keith Alexander, seemed to resist.

Several months later, in summer 2010, concerns about possible militarization were raised again when the Wall Street Journal reported on a National Security Agency (NSA) program called “Perfect Citizen.” The program would allegedly involve NSA surveillance meant to detect cyber attacks against private companies and civilian critical infrastructures. Reports that a contractor involved with the program had called it “Big Brother” only added to concerns. In response, the NSA assured us that “Perfect Citizen” was only a research and development project and that the military was not taking over civilian cyber security.

These assurances did not allay concerns about militarization. A year later, in his July 2011 speech introducing the first Department of Defense Strategy for Operating in Cyberspace, Deputy Secretary of Defense William Lynn III felt the need to address “concerns that cyberspace is at risk of being militarized.” He told his audience, “We have designed our DoD Cyber Strategy to address this concern.” This involved an “emphasis on cyber defenses”–as opposed to offense or retaliation–that was meant to illustrate DoD’s commitment “to protecting the peaceful use of cyberspace. […] Indeed, establishing robust cyber defenses no more militarizes cyberspace than having a navy militarizes the ocean.”

The first problem with Lynn’s assurances, however, was that this very same strategy and his speech to introduce it provided one more piece of the militarization puzzle. In his speech, he noted that the new DoD strategy would treat “cyberspace as an operational domain, like land, air, sea, and space. Treating cyberspace as a domain means that the military needs to operate and defend its networks, and to organize, train, and equip our forces to perform cyber missions”–i.e. “giv[ing] a military character to” it, “equip[ping][it] with military forces and defenses” and “adapt[ing][it] for military use.” While assuring listeners that DoD did not intend to militarize cyberspace, he all but used the dictionary definition of “militarize” in describing the intent of the new strategy.

The second problem was that the speech and strategy did not address the other aspects of cyberspace militarization that we knew had already taken place by July 2011. The creation of a dedicated military command certainly constitutes “equip[ping][it] with military forces and defenses.” We also already knew that these military cyber forces would have an offensive as well as defensive mission–i.e. that they were “adapt[ing][cyberspace] for military use.” But as I noted at the time, neither the DoD strategy nor Lynn’s speech addressed the issue of how our military cyber forces would be used offensively, making the strategy both disingenuous as a PR document and incomplete as a strategy document.

Almost one year later still, in June 2012, we learned that even as Lynn and others assured us that the DoD strategy was primarily defensive, the United States had been conducting covert, offensive cyber attacks against Iran. Cyberspace, which our leaders had characterized as a military domain, had been successfully adapted for military use by our dedicated cyber forces. Again, this is the definition of militarization.

But what of the question regarding the military’s role in national cybersecurity? Senator McCain has not been alone in calling for a greater military role. For example, a former Chairman of the Joint Chiefs of Staff has said that the military should be in charge of cybersecurity and should have more of an offensive orientation.

And there are indicators that the military’s role is, in fact, increasing. In August 2012, the Washington Post reported:

 

“The Pentagon has proposed that military cyber-specialists be given permission to take action outside its computer networks to defend critical U.S. computer systems — a move that officials say would set a significant precedent. […]

The proposed rules would open the door for U.S. defense officials to act outside the confines of military-related computer networks to try to combat cyberattacks on private computers, including those in foreign countries.”

What’s more, the Washington Post report indicates that this proposed expansion in the role of military cyber forces in the domestic space “is significantly narrower than what the military originally sought.”

 

Finally, James Lewis, a leading cybersecurity expert at the Center for Strategic and International Studies, has recently said that an October 2012 speech on cybersecurity by Secretary of Defense Leon Panetta indicates that the Department of Defense is effectively taking over U.S. cyber security.

If the United States has not yet fully militarized cyberspace, it has taken significant steps in that direction. Official policy has characterized it as an operational military domain, has equipped it with military forces, and has adapted it for military use. We hear powerful voices calling for an expanded military role in civilian cybersecurity and see strong indicators that, at minimum, the military will indeed play an increasingly important role, perhaps even the central role, in our nation’s cybersecurity.

 

What’s the price of a new Windows 8 zero-day vulnerability?

French security company Vupen is selling a vulnerability in Microsoft’s latest operating system and browser

Computerworld

By Jeremy Kirk

November 2, 2012 12:31 AM ET

 

IDG News Service – It’s not exactly the type of advertisement most people would understand.

For sale: “Our first 0day for Win8+IE10 with HiASLR/AntiROP/DEP & Prot Mode sandbox bypass (Flash not needed).” It’s part of a recent message on Twitter from Vupen, a French company that specializes in finding vulnerabilities in widely used software from companies such as Microsoft, Adobe, Apple and Oracle.


Vupen occupies a grayish area of computer security research, selling vulnerabilities to vetted parties in governments and companies but not sharing the details with affected software vendors. The company maintains that its information helps organizations defend themselves against hackers and, in some cases, play offense as well.

Vupen has found a problem somewhere in Microsoft’s new Windows 8 operating system and its Internet Explorer 10 browser. The flaw has not been publicly disclosed, and Microsoft has not yet fixed it.

Vupen’s finding is one of the first issues for Windows 8, released last week, and Internet Explorer 10, although vulnerabilities have since been found in third-party software that runs on Windows 8.

Dave Forstrom, Microsoft’s Trustworthy Computing director, said the company encourages researchers to participate in its Coordinated Vulnerability Disclosure program, which asks that people give it time to fix the software problem before publicly disclosing it.

“We saw the tweet, but further details have not been shared with us,” Forstrom said in a statement.

Vupen’s Twitter message, written on Wednesday, implies the vulnerability would allow a hacker to bypass security technologies contained within Windows 8, including high-entropy address space layout randomization (ASLR), anti-return-oriented-programming measures and DEP (data execution prevention). The company also indicates the exploit is not dependent on a problem with Adobe Systems’ Flash multimedia program.
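
To see why bypassing these mitigations matters: ASLR loads code and data at unpredictable addresses on each run, so an exploit cannot hard-code where to jump, while DEP blocks executing injected data, and return-oriented programming (ROP) is the standard workaround that reuses snippets of existing code. A tiny, generic illustration of what ASLR does (nothing to do with the Vupen bug itself) is to print a native allocation’s address across runs; with ASLR enabled it changes from run to run:

```python
# Generic ASLR illustration: run this script several times. With ASLR
# enabled, the printed addresses differ between runs, which is exactly
# the unpredictability an exploit must defeat before ROP or shellcode
# can target a known location.
import ctypes

buf = ctypes.create_string_buffer(64)   # one native allocation
print("buffer at: ", hex(ctypes.addressof(buf)))

num = ctypes.c_int(0)                   # another native allocation
print("integer at:", hex(ctypes.addressof(num)))
```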

 

“Certainly, if the bug is confirmed, then this could be a black eye for Microsoft having their brand new and touted most secure platform already found flawed just after its public release,” said Andrew Storms, director of security operations for nCircle.

The market opportunity for a successful exploit may be limited due to the recent release of Windows 8, but “on the other hand, nobody has confirmed this bug isn’t also functional on older versions of Windows or IE,” Storms said.

Jody Melbourne, a penetration tester and senior consultant with the Sydney-based Australian security company HackLabs, said the vulnerability could be useful to third-party Microsoft developers interested in stealing code-signing certificates or source code.

So what’s the vulnerability worth? It’s hard to say. Vupen doesn’t publish a public price list. But Melbourne said “the value of the bug will only increase with time, of course, the longer Vupen sits on it and if no one else stumbles upon it.”

 

Cyber rules of engagement still unfinished

FCW.com

By Amber Corrin

Nov 01, 2012

 

A successful cyber attack could damage the nation’s power grid and other critical infrastructure, but the rules of engagement needed to shape a military response are incomplete.

For months, Defense Department officials have been at work establishing the rules of engagement that will govern military action in cyberspace. Despite recent assurances of their imminence, those rules have yet to be finalized, and some say their absence could affect national security.

Defense Secretary Leon Panetta said in October that Pentagon leaders are close to finishing the rules of engagement, an official doctrine that will likely echo in structure the guidelines that regulate armed conflict. But will that be enough to address, on a national level, the constantly evolving landscape in cyberspace?

“The DOD rules of engagement are important, but there’s an issue of whether or not that’s suitable for the era we’re in,” said Tim Sample, vice president and sector manager for special programs at Battelle. Modeling the rules after the traditional rules and assumptions of conventional combat would rest on a “faulty assumption,” he said. “It can be argued whether that’s practical [to have] a traditional response to attacks in an era or environment that is by definition very untraditional.”

Sample co-edited a book from Battelle on national cyber policy that is due out soon, called “#CyberDoc: No Borders, No Boundaries.”

According to Panetta, the cyber rules of engagement will make the U.S. military more agile and quicker to respond to cyber threats.

“For the past year, [DOD] has been working very closely with other agencies to understand where are the lines of responsibility when it comes to cyber defense. Where do we draw those lines? And how do those responsibilities get executed?” he said at a New York event. “As part of that effort, the department is now finalizing the most comprehensive change to our rules of engagement in cyberspace in seven years. The new rules will make clear that the department has a responsibility, not only to defend DOD’s networks, but also to be prepared to defend the nation and our national interests against an attack in or through cyberspace.”

 

One challenge for establishing rules of engagement is that much of the critical infrastructure under threat — the nation’s power grid, financial networks and other essential systems — are privately owned. Panetta has said repeatedly that collaboration with the private sector is crucial to create a policy that will protect the infrastructure.

At the Association of the United States Army conference in Washington last month, Lt. Gen. Don Campbell, commanding general of the Army’s III Corps and Fort Hood, Texas, echoed the need for military rules for cyber operations.

“How far can we go to target this network or that network or capability or system? We’re going to have to decide as a service or military,” Campbell said.

According to Sample, the U.S. needs a broader strategy for national security in cyberspace that encompasses more than just the military. Today’s national security policies are rooted in the Cold War, a bygone era whose assumptions have been overtaken by the vulnerabilities that come with reliance on technology.

“The national-level doctrine that we had – defined as a priority in direction that guides government and industry and even individual citizens – really was the doctrine of containment after World War II. After the Cold War, containment fell by the wayside and never really was replaced,” Sample said. “Now, the stakes are higher. We are much more vulnerable because of our reliance on the Internet and technology. We’ve created a myriad of positions, policies and legislation to evaluate vulnerabilities as they emerge. The problem is that it’s basically unconnected; it’s lacking in the overall focus that we’d have in a national doctrine.”

While DOD’s rules of engagement will direct military action in cyberspace, a wider national-level doctrine should lay the groundwork for a more comprehensive range of issues related to cybersecurity, from federal guidelines to everyday activities, Sample said.

For example, in the Cold War era there was an understanding of responsibility at the individual citizen level – take, for example, the drills in which schoolchildren would take shelter under their desks in preparation for a nuclear bomb. While that activity itself is an obsolete idea now, it represents the reach a national doctrine should have, Sample noted.

“The difference [from the military ROE] is that the doctrine, in essence, is an overarching structure or framework that everything else should be fitting into. It’s not just DOD, Homeland Security or the intelligence community; it really is laying out a path and direction for where we want America to move in this cyber era,” he said. “We need to start out by laying down overriding principles…not just in cyber but where the country goes and what our long-term objectives are. We have to have a doctrine that is adaptable and agile. Having an overriding principle and doctrine will help guide us in a way more effective than what we’re doing today.”

 

Lenovo hopes to reach US customers faster with ‘Made in USA’ computers

By Agam Shah, IDG News Service

Oct 11, 2012 6:50 AM

 

Lenovo hopes that computers made in its first U.S. manufacturing facility will attract more buyers, while also making the delivery of ThinkPad laptops and tablets faster to U.S. customers.

The company, which is based in China, earlier this month announced it would open a factory to make computers in Whitsett, North Carolina—its first such facility in the U.S. Lenovo said the factory would create about 115 manufacturing jobs, and a spokesman later added that the company may expand the facility in the future, which could create more jobs.

 

Manufacturing in the U.S. will help Lenovo get its products to customers more quickly, said Peter Hortensius, senior vice president of the product group at Lenovo, in an interview at a company event in New York this week.

The company will manufacture ThinkPad laptops and tablets there starting early next year, and with the new factory, Lenovo hopes computers could reach customers within a week or, in some cases, overnight. But initial supplies of products like the ThinkPad Tablet 2, which will become available in October, will not be made in the U.S. factory.

Many Lenovo computer shipments originate from China and are supposed to reach customers in 10 days, but in some cases take weeks. The company also has factories in Japan, Brazil, Germany and Mexico.

The “Made in USA” tag on computers manufactured in North Carolina will resonate with some buyers, Hortensius said. Lenovo’s main U.S. operations are in that state, and the company also has a distribution center there.

“We decided the time was right in the U.S. because what we gain is speed. We’re really close to the customer. We can provide services and customizations that frankly if I’m doing it an ocean away it’s hard to do that,” Hortensius said.

But the quicker delivery of laptops or services may depend on what the customers are willing to pay. Components are made and shipped from other countries, which could increase the cost, Hortensius said.

“The reality is if you take this [laptop] apart, you’ll see parts from everywhere. Designed in different places, built in different places. It’s a very international, very Asia-based supply chain, no doubt about that,” Hortensius said.

However, the company has taken that into account. “You have to make the thing make sense economically on its own. We think we have a way to do that, and as we ramp that factory, we expect to be able to show that,” Hortensius said.

Lenovo is the world’s second-largest PC maker, behind Hewlett-Packard, with a strong market presence in Asia-Pacific and Europe. The company has been growing in the U.S. through distribution channel expansion in recent years.

The company is also trying to adapt to regional trends when building computers, Hortensius said. For example, in Germany and China there is a big demand for discrete graphics cards, which is not the case in the U.S. Lenovo has also launched smartphones in countries including Russia and China, but has no plans to launch handsets in the U.S.

Lenovo earlier this week launched a range of hybrid devices that function as both a PC and a tablet. The devices are based on the Windows 8 and Windows RT operating systems, which will ship on computers later this month.

Lenovo is also building up its cloud service to supplement the new devices. The company talked about its cloud service in January this year and in mid-September acquired Stoneware to boost its cloud computing offerings.

“We’re planning to roll out some tests of that towards the end of the year,” Hortensius said. “I use it in the office today.”

The products offered by Stoneware are similar to the cloud capabilities being developed by Lenovo, Hortensius said. Lenovo’s cloud service will have two elements: one tied to calendar, contacts and syncing, and the second tied to accessing applications in the cloud from any device via a browser interface.

The Stoneware program is similar to Lenovo’s products under development, but smart enough to know where the content is, and to take advantage of local device resources such as a graphics processor.

“We’re very excited,” Hortensius said.

 

PEO stresses importance of relationships

Posted 11/1/2012 Updated 11/2/2012

by Patty Welsh

66th Air Base Group Public Affairs

 

11/1/2012 – HANSCOM AIR FORCE BASE, Mass. — The program executive officer for C3I&N provided an update on his organization and emphasized working together during a presentation to an industry group at the Minuteman Commons here Oct. 25, 2012.

Speaking to approximately 100 attendees, Maj. Gen. Craig Olson told a personal story about working with personnel in Iraq and how the differing parties came to rely on one another.

“Trust-based relationships are the foundation of the work we do together,” he said.

Olson emphasized how this is equally important between government and industry partners, saying that he has been blessed with many such relationships throughout his career.

“You spend a couple of months working together, building a relationship, talking about your family, hobbies, and once you get that trust, the business aspect takes off and everyone benefits.”

By using this approach, with the varying parties working toward a common goal, personnel can be upfront and transparent in confronting issues.

“There should be nothing that we can’t work through together — no technical issue, no relational issue, no program or schedule issue,” the general said.

Another topic he spoke about was how well the PEOs within the Air Force Life Cycle Management Center are working together.

“Cross-PEO integration is now more important than ever before,” he said, stressing how the mission areas are all connected. “There can be no walls between the PEOs.”

For the 10 PEOs under AFLCMC, he said he sees them coming together more than during his previous 20 years in acquisition.

After presenting a chart depicting LCMC’s mission and objectives, Olson spoke about how he will be working to refine his own organizational focus.

“We’re a relatively new organization,” he said. “We’ll be looking at what our objectives and metrics should be, while making sure they’re in alignment with AFLCMC and AFMC [Air Force Materiel Command].”

The general talked about how the divisions are currently organized, adding that they cross the differing domains of ground, air and space. He said that he will be working with “the great leaders in the organization” to ensure that what is put in place best supports accomplishing the mission.

Olson added that he is dealing with a new set of customers, as he did not “grow up” in the command and control, space, communications and IT arenas, but he is learning more all the time.

 

He also stressed resource constraints, adding that because of them, programs will be limited to the required amount of capability at the time it is needed.

“This is the way our customers are looking at planning and programming, and we need to do the same in the PEO.”

In order to meet the goals of being efficient and effective, Olson said there needs to be a commoditized infrastructure.

“We can’t have ‘each’s’ — unique services for each mission and asset,” he said. “We need a simplified set of services on a commoditized infrastructure which benefit multiple missions and assets.”

Olson is convinced that industry partners can assist the C3I&N organization in establishing and successfully carrying out its mission, as industry, especially the IT sector, has gone through similar struggles.

However, he knows none of it can happen without partnerships.

“As in previous positions, my focus will continue to be on building good, strong, trust-based relationships,” he said.

 

 

‘Airman’ magazine enters digital era

 

Posted 11/1/2012 Updated 11/1/2012

11/1/2012 – FORT GEORGE G. MEADE, Md. (AFNS) — A year after publishing the final print edition of Airman magazine, the Air Force is unveiling a new digital version of the service’s official magazine specifically geared for tablet devices.

“Built specifically for Apple and Android tablet users, this new format gives readers all the in-depth feature stories and spectacular photos they’ve come to expect from Airman magazine and adds layers of multimedia content and interactivity only available through new publishing techniques,” said Preston Keres, Airman editor-in-chief. “We are very excited to release this first new issue, and begin a new era for telling the Airman’s story.”

In addition to stunning photography, compelling copy, great video, interactive content and design, Keres said the staff is proud that they are able to bring back the monthly publishing schedule.

For the past year, the Airman editorial staff and design team have posted original content to the Airman website at http://airman.dodlive.mil, which will remain current after the launch of the new tablet magazine. The website will continue as the home of Airman and will feature extra and exclusive staff-produced content, as well as strong feature stories, images and multimedia from the field.

The website, however, was never meant to replace the printed magazine.

Budget efficiencies in 2011 led to the elimination of the printing budgets for all flagship service magazines. At that time, the Airman editorial staff and design team were reduced to only one full-time employee each, during a BRAC move that consolidated most flagship service products at the Defense Media Activity here.


One veteran designer, Luke Borland, made the move from San Antonio to Fort Meade and became the chief of the Airman design team. His enthusiasm for the new technologies just coming onto the scene was critical to developing a plan and building a team that could exploit new tools and processes to bring Airman readers the same kinds of stories in new and exciting ways.

“Loss of the printed magazine was a little sad, but not unexpected, and in some ways a pleasant change,” he said. “While some people may miss having the paper in their hands, these new tablet platforms still allow you to hold something, have the same feel, but get so much more out of the product.”

In rebuilding, Borland and his team spent a great deal of time getting a broader understanding of digital design tools to get the most from new multimedia content, learning about publishing to the different digital magazine stores and finding that there are still some limitations.

“There are things we want to do that just are not possible yet with the current publishing tools. But what we have been able to do so far is amazing,” said Borland. “We want to push the technology to its maximum potential and give our readers that immersive experience.”

One thing Borland said he wants Airman readers to know, whether they started reading 50 years ago or yesterday, is that in this new era, the magazine will remain the same trusted source it has always been. “We’re still telling the stories of everyday Airmen and Air Force heroes. It’s a deeper form of the magazine, it’s the added layers to the presentation that just make it a richer experience for the reader, but it holds true to the integrity of Airman.”

While Borland and the design team worked on the technical side to get ready for digital publishing, the creative staff was also working hard to capture stories in new ways and develop ideas that would lend themselves to the new format. To head up the editorial staff, Keres, a former multimedia instructor at the Defense Information School and former staff photographer at the Washington Post, was selected specifically for his vision and ability to inspire great storytelling. He was named editor-in-chief in April.

“From the beginning, Airman has always focused on people, cool aircraft and unique missions of the Air Force. Our staff strives to find better ways to tell those important stories,” he said. “With the new technologies available today, we can tell those stories in a much better way. This new interactive digital format makes storytelling fun. We can push the boundaries of what people have previously thought of as a magazine. Reading Airman is now a personal experience and the reader feels engaged with the subject.”

One of the most important technological advances Keres brought to the Airman team was non-narrative video storytelling using digital single lens reflex, or DSLR cameras. Until recently, most digital cameras used by professional photojournalists shot individual still frames. Now some of them give the reporters and photographers the capability to capture high definition video in the field. In this smaller package, and with the same lenses used to shoot breath-taking still frames, Airman photojournalists have quickly learned new ways to draw readers into the story.

“The great thing about the new platform is the crispness, vividness and life of the imagery,” said six-time Military Photographer of the Year and Airman senior photojournalist Master Sgt. Jeremy Lock. “Anyone who is a fan of photography, and especially Air Force photography, will notice huge differences in the way these tablet devices are able to showcase visual content.”

The new magazine was previewed over the last couple of weeks, with many readers already downloading it from their app stores. Early adopters and those who follow Air Force and Airman social media outlets were given download links. The staff also sought the opinions of some Air Force and media industry leaders before today’s publication date.

 

“Airman magazine now as an app — what a fantastic way to read these stories and showcase the amazing photography by Air Force photojournalists,” said Linda D. Epstein, McClatchy Tribune Photo Service senior photo editor.

“I love how the magazine comes to me on the iPad — I can read it whenever I want to. I used to glance at the site in my browser but with the new platform, it has drawn me in to read it from cover to cover,” added Epstein.

In the November 2012 issue, the cover story is “Donning History,” which features two Air Force veterans and their expansive collections of vintage Air Force uniforms. Accompanying the story are a multimedia presentation and photo gallery that show Airmen wearing many of these uniforms from our Air Force past. There are other major features, including “Medics in Mayhem,” which covers military medical personnel training in a civilian trauma unit; “Outside the Wire,” about TACP Airmen who live somewhere in between the Air Force and the Army; and “Refueling the Field,” about a KC-135 unit that acts as a flying gas station, giving the fleet additional range.

To complement this month’s features, the staff has updated “Around the Air Force,” a standard favorite that shows the great work Airmen are doing around the world, and added new departments like “Through Airmen’s Eyes,” where Airmen tell their own stories, and “Airman Visuals,” which showcases the Air Force through imagery.

“For me personally, it was an honor to be able to take Airman from the respected printed magazine it was to bring it into the new world of digital publishing and maintain the quality look and feel,” Borland said, “to give the readers new ways to interact, make the content leap off the page and continue to tell the Air Force story in a much better way than we were able to do in the past.”

Airman tells the story of the United States Air Force through images and written stories about the people, missions and equipment that define the nation’s youngest service. The magazine was first published in August 1957, and continued as a printed magazine in various sizes, formats and printing schedules until the final edition, the September-October issue of 2011.

Airman is free and available now for download in the Apple iTunes newsstand at http://bit.ly/TK559c and as a standalone app in the Google Play store at http://bit.ly/WE3FTX .

 

For information about the tablet edition of Airman, visit the website at http://airman.dodlive.mil/ .

 

 

Microsoft starts $40 Windows 8 upgrade sales

 

Some customers run into problems trying to buy, install and activate the new OS

Computerworld

By Gregg Keizer

October 26, 2012 04:17 PM ET

Computerworld – Microsoft today opened its virtual store and began selling upgrades to Windows 8 Pro for $39.99, making good on a promise made last summer.

The upgrade, which must be downloaded and installed via a utility called “Windows 8 Upgrade Assistant,” can be applied to Windows XP-, Vista-, and Windows 7-powered systems.

Microsoft announced the upgrade in July, a bit more than a month after it issued the last of three public previews, but before it wrapped up its work on the new OS.

The upgrade does the most thorough job when migrating a machine from Windows 7, the only starting point that gets a “full” upgrade, the label for an upgrade that brings along everything, including data files, user accounts, Windows settings and installed applications.

Windows XP-to-Windows 8 upgrades are the least comprehensive, allowing customers to move only personal files; Vista transfers both Windows settings and personal files to Windows 8.
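
The reported upgrade paths reduce to a simple lookup from source OS to the set of items that carry over. The following Python snippet is a minimal illustrative sketch of that matrix; the names and lists below paraphrase the reporting above and are not a Microsoft API or an official compatibility list.

```python
# Hypothetical summary of the Windows 8 Pro upgrade paths described
# above. This paraphrases reported behavior; it is not a Microsoft API.
MIGRATES = {
    "Windows 7": ["personal files", "user accounts", "Windows settings",
                  "installed applications"],  # the only "full" upgrade
    "Windows Vista": ["personal files", "Windows settings"],
    "Windows XP": ["personal files"],
}

def surviving_items(source_os):
    """Return the items reported to carry over to Windows 8 Pro."""
    return MIGRATES.get(source_os, [])

for os_name in ("Windows 7", "Windows Vista", "Windows XP"):
    print(f"{os_name} -> Windows 8 Pro: keeps "
          f"{', '.join(surviving_items(os_name))}")
```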

Although the $39.99 buys only the upgrade’s 2GB digital download, users can optionally request a DVD for an additional $14.99. The combination is about $15 cheaper than the $69.99 DVD-in-a-box that Microsoft and retailers, including Best Buy, Wal-Mart and Amazon, are now selling.
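
The “about $15” difference checks out against the figures cited; here is a quick back-of-the-envelope verification in Python, using the prices as reported rather than live retail data.

```python
# Prices as cited in the article, October 2012.
download_upgrade = 39.99  # Windows 8 Pro download
optional_dvd = 14.99      # DVD media, requested separately
boxed_retail = 69.99      # DVD-in-a-box at retailers

savings = boxed_retail - (download_upgrade + optional_dvd)
print(f"Download + DVD saves about ${savings:.2f}")  # -> about $15.01
```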

Microsoft has said that customers who downloaded and installed the free “Windows 8 Release Preview” will be able to upgrade to the paid version of the OS, but that those running the free trial of Windows 8 Enterprise RTM, or “release to manufacturing,” which debuted in August, cannot.

That was just one of the problems reported by customers.

In messages on Microsoft’s support forum, several said they had been blocked from upgrading by a confusing error message. “Windows 8 isn’t available for download,” the message stated. “Sorry, Windows 8 isn’t available for online purchase in the country/region you’re in.”

Computerworld ran into the same message when it tried to purchase the Windows 8 Pro upgrade from a virtual machine (VM) running an evaluation copy of Windows 8 Enterprise. The VM was located in the U.S., where the upgrade is currently available, and the attempt was made after Microsoft turned on the upgrade spigot.

Others, however, said that they had been thwarted from upgrading a Windows 7 PC with a legitimate license. “Quite frustrating seeing as many others are downloading it without a problem,” noted someone identified as “_PhAzE_” on a support thread.

Additional complaints were lodged about the upgrade, ranging from machines that refused to download the file to payment problems and invalid activation keys provided by Microsoft.

Customers can start the upgrade purchasing and installation process by heading to Microsoft’s Web site and clicking the button “Download Pro for just $39.99.”

Microsoft will sell the Windows 8 Pro upgrade at the discounted price until Jan. 31, 2013. It’s not entirely clear what the price will be after the offer expires, but several online retailers selling the $69.99 boxed copy show that price as marked down from a regular price of $199.99.

Customers can upgrade to Windows 8 from XP, Microsoft says, but only some personal files will survive the migration. Users can also select “Nothing” to do a “clean install” that scrubs the hard drive before installing the new OS.

 

Wireless technology vulnerable in emergencies

Washington Post

By Cecilia Kang, Published: November 2

After Hurricane Sandy put entire cities and towns in the dark, people across the East Coast immediately discovered the downside of the emerging technology age: It has become harder to communicate in emergencies.

Over the past decade, consumers have rushed to replace their landlines with cellphones, digital phones and other sleek gadgets. The problem, some analysts say, is that these devices require electricity. When the power goes, so goes the ability to communicate with friends, family and emergency responders.

From the perspective of these analysts, the digital portrait of America is not encouraging.

About 36 percent of American homes are wireless. Pay phones dwindled to 550,000 in 2009, down from 2.1 million a decade earlier, according to the Federal Communications Commission.

These traditional methods of communication are among the most reliable in an emergency, experts say. Landlines can work even when the electricity is out.

“Copper-wire phone networks were built with stability and dependability in mind, because they were public utilities and were supposed to withstand hurricanes and floods,” said Harold Feld, a senior vice president at public interest group Public Knowledge.

Critics of the telecommunications industry say it is not doing enough to ensure that Americans can communicate when they need to the most.

AT&T, Verizon and other companies have been making a push to invest less in wireline networks, as their popularity has dwindled. They and providers of Internet service pressed the FCC to repurpose an $8 billion fund, originally intended to bring traditional phone lines to rural areas, to finance broadband Internet instead.

Public interest groups have argued that wireless providers should provide backup power for each cell tower to ensure reliable service. But the industry has said that would be too expensive. Instead, carriers have said they would respond to emergencies by bringing in generators and extra cell towers only when needed.

But Sandy exposed a problem in this plan. The generators need gasoline — a scarce resource in stricken areas of New York and New Jersey because the gas stations didn’t have power to pump fuel.

Overall, in storm-affected regions, 25 percent of cell towers went down Tuesday as the storm hit. About 15 percent remained out Friday morning. Downed trees and flooding kept crews from repairing many towers, companies said.

Wireless carriers said they had made progress. Verizon Wireless, whose downtown Manhattan offices were flooded, said nearly 97 percent of its East Coast customers were able to resume at least some service. T-Mobile and AT&T this week agreed to allow free roaming on each other’s networks in parts of the Northeast to handle the surge of calls.

“While wireless service has yet to return to normal levels, coverage is good,” Verizon said in a statement Friday afternoon. “However, telecom and residual flooding issues continue to be a factor.”

Verizon Communications said its business customers wouldn’t get their FiOS Internet service fully restored for about a week.

Fixed-wire phones, though, are not invulnerable. In Lower Manhattan, flooding cut off traditional phone lines earlier in the week. Pay phones suddenly came back into vogue in parts of the city.

“There will always be disasters that are damaging to any form of communications,” said Brian Fontes, chief executive of the National Emergency Number Association, a trade group that advocates for emergency-responder services. “It’s best not to rely on any one service.”
