More than a year after the administration released its digital strategy to speed adoption of secure mobile devices, agencies are still grappling with standards for vetting the security of internal and commercial mobile apps.
Today, there isn’t a federal standard for securing mobile apps, but government officials are hopeful a process will be created similar to what’s in place for vetting cloud products and services used in the government.
“In order for an app that’s developed by DHS to be put in a DoD app store there’s going to have to be some level of assurance,” said Robert Palmer, director of information assurance at DHS.
The National Security Agency, DARPA, General Services Administration and the National Institute of Standards and Technology are among the agencies playing a key role in federal mobile security.
“We’re heading toward the direction of standards,” said Palmer, who spoke on a panel Tuesday at the Federal Mobile Computing Summit. He said NIST is set to release draft guidelines for testing and vetting mobile apps.
Verifying the identity of mobile users as they access data from their smartphones and tablets is another challenge.
At the Defense Department, “we still believe that the PIV, our identity management cards, are…the network hygiene of mobility,” said DOD’s Mark Norton, who also spoke on the panel. The problem is that most of the 3 million cards in use at DoD are not used to log onto mobile devices. Norton said DoD is considering technologies such as near field communication and micro SD cards to help manage user identity.
He said the department currently has 50 mobile pilots underway to test different use cases for the devices.
An undercover investigation by the General Services Administration’s watchdog office has traced second-hand computer equipment originally costing the U.S. government about $25 million to more than a dozen sham educational organizations and, ultimately, back to one man: Steven Alexander Bolden.
Federal prosecutors in Tacoma, Wash., earlier this month filed fraud charges against Bolden, saying he tricked the government into believing he represented schools and thus was eligible for access to GSA’s Computers for Learning program.
Under the program, agencies, as permitted by law, can transfer surplus computers and technology equipment to schools and nonprofit educational groups.
The investigation, which was reported on last week by the Seattle Post-Intelligencer, began last summer after the IG’s office found 13 nonprofit organizations that received computers through the GSA program. While the groups appeared unaffiliated, they all had ties to Bolden, according to court papers.
“There is probable cause to believe that Bolden engaged in a scheme spanning several years in which he impersonated educational nonprofit organizations into giving him government computers and computer equipment,” prosecutors wrote in an affidavit outlining the probe, which was filed in U.S. District Court in Tacoma, Wash., on May 31.
Charging documents said Bolden received thousands of pieces of computer equipment over the years, keeping it for himself or selling computers through online sales sites such as Craigslist, which was subpoenaed as part of the investigation, records show.
An attorney for Bolden listed on the case’s docket did not immediately respond to a phone message Monday.
Last year, following the disclosure that 123,000 Thrift Savings Plan accounts had been hacked, the Federal Retirement Thrift Investment Board launched a wide-ranging assessment of its computer system security.
That “Tiger Team” task force review is now complete, but the board isn’t making the findings public.
Instead, the agency is withholding the entire report on the grounds that disclosure “could reasonably be expected to risk circumvention of the law,” Amanda Haas, a Freedom of Information Act officer with the board, said in a response today to Federal Times’ FOIA request. Haas did not immediately reply to a request for more information on why the board is claiming that particular exemption to the act’s general requirement that government records be public.
The board began the review after learning early last year that Social Security numbers, addresses and other personal data for the 123,000 account-holders had been stolen from a contractor’s network. The cyberattack actually occurred in 2011, but board officials didn’t learn about it until getting notification from the FBI. The bureau has not announced arrests or charges in the case.
The Tiger Team review was in part intended to identify any computer security gaps and come up with ways to fix them, Greg Long, the thrift board’s executive director, told a Senate subcommittee last July. Long made no mention of law enforcement issues, but acknowledged that, at the time of the attack, the board didn’t have a “breach notification plan” because it lacked the resources to develop one. (Long signed such a plan in June 2012.)
The TSP has some 4.6 million participants, including military personnel, civilian agency employees and U.S. Postal Service workers.
Scott Hodes, a lawyer who was once acting chief of the FBI’s FOIA litigation unit, was not familiar with the report, but said in an interview that the board has to establish a threshold to legally withhold information under the FOIA law enforcement exemption. Even then, parts of the report that don’t meet that threshold must be released, Hodes said.
“They can’t withhold everything.”
The Department of Homeland Security is keeping tight-lipped about the details surrounding the resignation of its former chief information officer, which it says was not prompted by disagreements over authority issues.
In April, Rep. Bennie Thompson, D-Miss., ranking member of the House Homeland Security Committee, sent a letter to DHS Secretary Janet Napolitano asking why department CIO Richard Spires was placed on voluntary or involuntary leave, who made the final decision regarding his leave, and for additional information about the current acting CIO.
In a May 13 response, the department’s assistant secretary for legislative affairs, Nelson Peacock, said personnel and privacy rules prohibit DHS from discussing why Spires took elective leave from the agency and later resigned May 17.
Peacock said Spires was not placed in an administrative leave status because of disagreements concerning his authority as CIO but provided no further details. Concerning acting CIO Margie Graves, Peacock said she is fully qualified to serve in her current role and confirmed that she was hired as a Transportation Security Administration employee in 2003 and was not converted from a consultant position.
In a follow-up letter to DHS this week, Thompson pressed for more details, following the department’s refusal to provide adequate responses. This time, Thompson asked for a copy of Spires’ resignation letter; an explanation of why he was placed on leave and who played a role in making that decision; an explanation of who is empowered to make information technology decisions at DHS; and Graves’ employment history prior to being named acting CIO.
Amazon Web Services is the latest vendor to pass the rigorous security review required of cloud products and services used across the federal government.
Until now, only CGI Federal and North Carolina-based Autonomic Resources had completed the Federal Risk and Authorization Management Program (FedRAMP) review. The governmentwide program was launched in June to standardize security reviews of commercial cloud products and is housed within the General Services Administration.
Under the FedRAMP program, Amazon was granted an Authority to Operate (ATO) by the Health and Human Services Department. This means HHS has certified that Amazon’s GovCloud and regional cloud service offerings meet federal security standards, and the company’s services are authorized for use at HHS. The purpose of FedRAMP is for other agencies to save time and money by using or building on the security review HHS has done.
More than 300 government agencies are currently using Amazon Web Services, Teresa Carlson, vice president of worldwide public sector, said in a statement.
By June 2014, all cloud services and products in use at federal agencies or in an active acquisition process must meet FedRAMP requirements.
Agencies are on the hook to publicly release more digital data in a way that protects citizens’ personal information and does not compromise government security.
One challenge, however, will be determining how that data could be combined with existing public data to identify an individual or pose other security risks to agencies, according to experts speaking at ACT-IAC’s annual Management of Change conference this week.
“The awareness is there, the concern is there, [but] the practice of it is relatively immature,” said Mike Howell, deputy program manager in the Office of the Program Manager of the Information Sharing Environment. “The policy framework around how you prevent inadvertent aggregation of personal identifiable information [and] sensitive information, it’s a known problem. It’s good that people are paying attention, but it becomes incumbent on whoever the aggregator is what they do with that information.”
Howell, whose office falls under the Office of the Director of National Intelligence, highlighted the administration’s recent Open Data policy that refers to this issue as the mosaic effect. The policy memo, released this month, directs agencies to:
Consider other publicly available data, in any medium and from any source, to determine whether some combination of existing data and the data intended to be publicly released could allow for the identification of an individual or pose another security concern.
The challenge for many agencies, however, is that they’re struggling to understand what data they have, let alone what data is already in the public domain.
According to the policy, “it is the responsibility of each agency to perform the necessary analysis and comply with all applicable laws, regulations, and policies. In some cases, this assessment may affect the amount, type, form, and detail of data released by agencies.”
There’s a natural tension between releasing open data and securing it, said Donna Roy, an executive director in the Department of Homeland Security’s Information Sharing Environment Office.
Agencies have been instructed to:
- Collect or create only that information necessary for the proper performance of agency functions and that has practical utility.
- Limit the collection or creation of information that identifies individuals to what is legally authorized and necessary for the proper performance of agency functions.
- Limit the sharing of information that identifies individuals or contains proprietary information to what is legally authorized.
The General Services Administration is moving forward with plans to stand up a cloud broker contract for acquiring and managing the performance of federal cloud services.
The Department of Homeland Security is one of two agencies that has committed to testing GSA’s cloud broker model in a pilot program expected to launch this fall, said GSA’s Mark Day. Speaking Monday at the annual Management of Change conference in Maryland, Day said GSA will award one contract to test the concept of a broker model and reevaluate the pilot by year’s end to determine how it could be expanded.
GSA has not yet defined all the services a cloud broker would provide, but the National Institute of Standards and Technology defines a cloud broker as “an entity that manages the use, performance and delivery of cloud services and negotiates relationships between cloud providers and cloud consumers.” Technology research firm Gartner defines cloud brokerage as a business model in which an entity adds value to one or more cloud services on behalf of one or more cloud users.
Some question whether the cloud broker model will add value or end up costing agencies more money. In a Feb. 14 letter to Rep. Doris Matsui, D-Calif., GSA’s Lisa Austin said the cloud broker model could be more effective in creating ongoing competition among cloud providers, rather than awarding single contracts for each cloud service.
“Part of the pilot is really understanding what’s the right role, [and] what’s the right process” for a cloud broker model, Day told Federal Times. “We think we have an idea, but now we’ve got to test it.”
Day made clear what cloud brokers would not do: inherently governmental functions, such as contracting. It isn’t clear to what extent brokers would negotiate services between agencies and cloud service providers, but the hope is that cloud brokers will increase vendor competition, reduce pricing, and reduce the complexities of acquiring cloud services and integrating them with existing services.
Roughly 15 agencies are part of the cloud broker discussion, Day said. He would not name the second agency that has committed to testing the broker model because the agency has not announced it publicly.
The challenge for GSA has been attracting business to some of its existing federal contracts, as agencies often launch their own contracts or use other agencies’ contracts instead. To garner greater use of its strategic sourcing contracts and future use of its cloud broker contract, GSA is meeting with agencies to determine their commitment to participate in market research and use the contracts, Day said. That way, GSA can better leverage the federal government’s buying power, and vendors have an idea of what’s possible in terms of business volume on a contract, he said.
On Nov. 27, 2012, at 3:38 p.m., an employee at Insight Systems Corp., which was bidding on a health services contract, submitted a revised quote to two employees inside the U.S. Agency for International Development.
The deadline for doing so was 5 p.m.
The message reached the first of three agency-controlled servers at 3:41 p.m., but then it got stuck. And it wasn’t until 5:18 p.m. that the email reached the first USAID employee, while the second employee didn’t receive the message until 5:57 p.m.
Around the same time, an employee at another company, CenterScope, which was submitting its own revised quote, sent a submission to the same USAID employees at 4:39 p.m., but that email did not reach the intended recipients until 5:15 p.m. and 6:08 p.m., respectively.
Too late, right?
Not according to U.S. Court of Federal Claims Judge Francis Allegra.
In a 22-page opinion released Monday, Allegra ruled in favor of both contractors in a recent complaint against USAID.
Aside from calling USAID’s decision to reject the quotes because they were late “arbitrary, capricious and contrary to law,” the ruling (in case you’re interested) provides a road map of a typical email message’s journey through a maze of internal servers.
In this case, the emails were received and accepted by the USAID’s internal server, but they got stuck there for a while and weren’t forwarded to the next server because of an internal error.
The delays lasted more than two hours in some cases, and none of the messages reached their final recipients by the 5 p.m. deadline.
Still, USAID sent both contractors letters days later saying their quotes wouldn’t be considered because, after all, late is late.
Allegra disagreed, sharply.
He went so far as to say USAID approached the question of the timeliness of electronic submission “with the zeal of a pedantic school master awaiting a term paper.”
He also wrote that he couldn’t see any reason why possession of the quotes couldn’t be effectuated through a government computer server just as well as through a clerk in a mail room.
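The question the court weighed comes down to a timestamp comparison: which moment of “receipt” counts against the deadline? A minimal sketch, using the Insight Systems timestamps reported above (the code and variable names are illustrative, not part of the court record):

```python
from datetime import datetime

# Timestamps from the article: Insight Systems' revised quote reached the
# first agency-controlled server at 3:41 p.m. but did not land in the first
# USAID employee's inbox until 5:18 p.m. The deadline was 5:00 p.m.
deadline = datetime(2012, 11, 27, 17, 0)          # 5:00 p.m. cutoff
server_receipt = datetime(2012, 11, 27, 15, 41)   # first government server
inbox_delivery = datetime(2012, 11, 27, 17, 18)   # first USAID employee

timely_per_court = server_receipt <= deadline     # court: server receipt counts
timely_per_usaid = inbox_delivery <= deadline     # USAID: inbox delivery counts
print(timely_per_court, timely_per_usaid)  # True False
```

Under USAID's reading the quote was late; under Allegra's reading, possession by the government's own server before 5 p.m. made it timely.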
In the end, Allegra’s ruling bars USAID from making an award unless it accepts quotes from both contractors.
Or, he ruled, USAID could start all over with a new procurement.
The Defense Information Systems Agency is one step closer to standing up cloud broker services for the Defense Department.
As DoD’s cloud broker, DISA will manage the use, performance and delivery of cloud services and negotiate contracts between cloud service providers and DoD consumers.
DISA announced Tuesday that it has developed a process for gathering and assessing DoD’s cloud computing requirements, evaluating vendors’ cloud offerings against contract requirements and has created a catalog for cloud services. In a June 2012 memo, DoD Chief Information Officer Teri Takai said all DoD components must acquire government or industry-provided cloud services using DISA, or obtain a waiver.
DISA will manage cloud services categorized as low or moderate in terms of potential impact on DoD operations in the event of a disaster or cyberattack. The agency will also ensure that cloud offerings comply with the department’s information assurance and cybersecurity policies.
DISA is using Federal Risk and Authorization Management Program (FedRAMP) standards to vet cloud providers. The security program provides baseline standards to approve cloud services and products for governmentwide use.
By June 2014, all cloud services and products in use at federal agencies or in an active acquisition process must meet FedRAMP requirements.
So far, CGI Federal and North Carolina-based Autonomic Resources are the only companies that have completed the FedRAMP security reviews. The companies will be the first FedRAMP-approved vendors to host DoD’s public data inside commercial data centers.
DoD approval of these companies to provide commercial cloud services is imminent, according to DISA. Both companies have already seen big business among civilian agencies and have spots on the General Services Administration’s cloud computing contract.
GSA is deciding whether to stand up similar cloud broker services for civilian agencies, which could entail private companies serving as brokers.
Agencies were directed last fall to cut a combined $7.7 billion from their information technology budgets in 2014 and propose ways to redirect those funds for priority projects.
Duplicative investments, failing projects, help desks and contracts for email, desktops and mobile devices are among the areas targeted for cuts, according to budget guidance released by the Office of Management and Budget in August.
Details of the proposed cuts were included in agencies’ budget submission documents and were incorporated into the president’s budget, which is due out Wednesday.
For each agency, cuts will amount to 10 percent of its average annual IT spending from 2010 to 2012. The combined cuts would reduce agencies’ IT budgets from $74.1 billion, the figure in the president’s 2013 budget plan, to $66.4 billion for 2014.
Hardest hit will be the Defense Department, which will see a $3.5 billion reduction; followed by the Health and Human Services Department, $662 million; and the Department of Homeland Security, $587 million.
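The formula described above is simple to apply; a minimal sketch, where the per-agency spending figures are hypothetical placeholders (chosen only to illustrate a cut of the size reported for DoD), while the governmentwide totals are those reported by OMB:

```python
# Sketch of OMB's cut formula as described in the guidance: each agency's
# 2014 reduction equals 10 percent of its average annual IT spend, FY2010-2012.

def directed_cut(annual_spend_billions):
    """Return 10% of the three-year average IT spend, in billions of dollars."""
    return 0.10 * (sum(annual_spend_billions) / len(annual_spend_billions))

# Hypothetical three-year spending profile (illustrative, not actual DoD data)
hypothetical_spend = [34.0, 35.5, 35.5]
print(round(directed_cut(hypothetical_spend), 1))  # 3.5

# Governmentwide arithmetic from the article: $74.1B baseline minus $7.7B cuts
print(round(74.1 - 7.7, 1))  # 66.4
```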
Agencies must propose to OMB how they would reinvest at least 5 percent of that money in priority areas that align with administration initiatives such as:
* Cloud First, which requires agencies to use cloud computing technologies when a reliable and cost-effective solution exists.
* Shared First, an effort to share common IT services within agencies and ultimately across agencies.
* The Digital Government Strategy, aimed at providing better online services to citizens and making government data available in standard, digital formats.
Agencies must propose reinvestment projects that will show a return on investment within 18 months, according to OMB’s guidance. OMB will then decide whether to approve those plans. Projects can include:
* Improved citizen services or administrative efficiencies.
* Shared services.
* IT consolidation, including data center consolidation.
* Improved IT security and information assets.
* Improved energy efficiency of IT facilities and equipment.
* Innovative investments such as cloud computing, modular development, improper-payment reduction and digital government.
* Data analytics or data management consistent with administration priorities.
Chief information officers are also contending with across-the-board cuts, which took effect last month and total $85 billion governmentwide.
“Cuts like this require hard choices,” said Roger Baker, former CIO at the Veterans Affairs Department. If a program is facing a 9 percent cut, agencies have to decide what they can and cannot get done.
Baker, who now serves as chief strategy officer for Virginia-based Agilex Technologies, suggested CIOs prioritize what they can get done with their remaining funding, rather than trying to fund everything with a reduced budget.
At VA, there is a prioritized unfunded list for key projects that are next in line for funding, Baker said. A departmentwide team agrees on projects and submits those recommendations to an IT leadership board. The project list is then approved by the deputy secretary.
The issue for most agencies is they can’t move funding across different projects, he said.
Whether OMB will allow agencies to reinvest some or all of their savings is unclear, but Baker said software license spending is one area ripe for savings.
Agencies are better prepared to negotiate pricing when they know what software licenses they are using and how many. Over the past five years, VA has saved about $200 million on software licenses by purchasing only what is needed.
“Typically, what happens is in the year you make the optimization you get to keep the dollars, but there is no guarantee where federal budget is concerned,” Baker said.