Federal Times Blogs
The Marine Corps is testing new capabilities it hopes will cut mobile computing costs in half.
The service is working with Verizon, Sprint and AT&T on a small beta program to test the feasibility of wireless carriers managing the security of mobile devices, based on Marine Corps policies and standards. The devices will be managed using a dual persona solution, which will allow the carriers to manage government data and applications but not personal use of the phone by military and civilian users.
“If the beta goes well and we prove the technical requirements that need to be employed, then we will move into the pilot,” said Rob Anderson, chief of the Command, Control, Communications and Computers Vision & Strategy Division at Marine Corps Headquarters.
The pilot will include about 500 users in the northern Virginia area, but the Marine Corps hasn’t determined if the pilot will use personal or government devices. If successful, the pilot will be expanded across the military service and serve as the foundation for a bring-your-own-device (BYOD) program.
Meanwhile, the Marine Corps is also testing how capabilities offered by the wireless carriers stack up against a mobile device management solution offered by the Defense Information Systems Agency. DISA is testing a mobile device management (MDM) solution provided by Good Technology.
“We want to compare both pilots,” said Anderson, who spoke at a mobile computing summit Tuesday. He said the Marine Corps will compare the cost of DISA managing mobile devices versus the wireless carriers and consider user feedback from both pilots.
“We are keeping all options open,” he said. “Whatever the most cost efficient is, [that's] the way we will go. Money is going to drive this train.”
The Labor Department wants to make it easier for consumers to track which businesses are treating their workers fairly.
Labor announced an app development contest Tuesday that it hopes will “help empower consumers to make informed choices about where to bring their business,” according to an agency news release.
The smartphone app will include Labor’s publicly available enforcement data, data from consumer ratings and geopositioning websites and other data available through state health boards.
“The app could also prove a useful tool for job seekers and for companies that are deciding which firms they may want to do business with,” Laura Fortman, principal deputy administrator of the Wage and Hour Division, said in a statement. “It could also help individuals get in touch with the Labor Department if they have any questions.”
Contest details will be posted on Challenge.gov, and developers have until Oct. 11 to submit their app to the department.
More than a year after the administration released its digital strategy to speed adoption of secure mobile devices, agencies are still grappling with standards for vetting the security of internal and commercial mobile apps.
Today, there isn’t a federal standard for securing mobile apps, but government officials are hopeful a process will be created similar to what’s in place for vetting cloud products and services used in the government.
“In order for an app that’s developed by DHS to be put in a DoD app store there’s going to have to be some level of assurance,” said Robert Palmer, director of information assurance at DHS.
The National Security Agency, DARPA, General Services Administration and the National Institute of Standards and Technology are among the agencies playing a key role in federal mobile security.
“We’re heading toward the direction of standards,” said Palmer, who spoke on a panel Tuesday at the Federal Mobile Computing Summit. He said NIST is set to release draft guidelines for testing and vetting mobile apps.
Verifying the identity of mobile users as they access data from their smartphones and tablets is another challenge.
At the Defense Department, “we still believe that the PIV, our identity management cards, are…the network hygiene of mobility,” said DoD’s Mark Norton, who also spoke on the panel. The problem is that most of the 3 million cards in use at DoD are not used to log onto mobile devices. Norton said DoD is considering technologies, such as near field communication and microSD cards, to help manage user identity.
He said the department currently has 50 mobile pilots underway to test different use cases for the devices.
An undercover investigation by the General Services Administration’s watchdog office has traced second-hand computer equipment originally costing the U.S. government about $25 million to more than a dozen sham educational organizations and, ultimately, back to one man: Steven Alexander Bolden.
Federal prosecutors in Tacoma, Wash., earlier this month filed fraud charges against Bolden, saying he tricked the government into believing he represented schools and thus was eligible for access to GSA’s Computers for Learning program.
Under the program, agencies, as permitted by law, can transfer surplus computers and technology equipment to schools and nonprofit educational groups.
The investigation, which was reported on last week by the Seattle Post-Intelligencer, began last summer after the IG’s office found 13 nonprofit organizations that received computers through the GSA program. While the groups appeared unaffiliated, they all had ties to Bolden, according to court papers.
Prosecutors wrote in an affidavit outlining the probe, filed in U.S. District Court in Tacoma, Wash., on May 31, that there is “probable cause to believe that Bolden engaged in a scheme spanning several years in which he impersonated educational nonprofit organizations” to obtain government computers and computer equipment.
Charging documents said Bolden received thousands of pieces of computer equipment over the years, keeping it for himself or selling computers through online sales sites such as Craigslist, which was subpoenaed as part of the investigation, records show.
An attorney for Bolden listed on the case’s docket did not immediately respond to a phone message Monday.
Last year, following the disclosure that 123,000 Thrift Savings Plan accounts had been hacked, the Federal Retirement Thrift Investment Board launched a wide-ranging assessment of its computer system security.
That “Tiger Team” task force review is now complete, but the board isn’t making the findings public.
Instead, the agency is withholding the entire report on the grounds that disclosure “could reasonably be expected to risk circumvention of the law,” Amanda Haas, a Freedom of Information Act officer with the board, said in a response today to Federal Times’ FOIA request. Haas did not immediately reply to a request for more information on why the board is claiming that particular exemption to the act’s requirement that government records are generally public.
The board began the review after learning early last year that Social Security numbers, addresses and other personal data for the 123,000 account-holders had been stolen from a contractor’s network. The cyberattack actually occurred in 2011, but board officials didn’t learn about it until getting notification from the FBI. The bureau has not announced arrests or charges in the case.
The Tiger Team review was in part intended to identify any computer security gaps and come up with ways to fix them, Greg Long, the thrift board’s executive director, told a Senate subcommittee last July. Long made no mention of law enforcement issues, but acknowledged that, at the time of the attack, the board didn’t have a “breach notification plan” because it lacked the resources to develop one. (Long signed such a plan in June 2012.)
The TSP has some 4.6 million participants, including military personnel, civilian agency employees and U.S. Postal Service workers.
Scott Hodes, a lawyer who was once acting chief of the FBI’s FOIA litigation unit, was not familiar with the report, but said in an interview that the board has to establish a threshold to legally withhold information under the FOIA law enforcement exemption. Even then, parts of the report that don’t meet that threshold must be released, Hodes said.
“They can’t withhold everything.”
The Department of Homeland Security is keeping tight-lipped about the details surrounding the resignation of its former chief information officer, which it says was not prompted by disagreements over authority issues.
In April, Rep. Bennie Thompson, D-Miss., ranking member of the House Homeland Security Committee, sent a letter to DHS Secretary Janet Napolitano asking why department CIO Richard Spires was placed on voluntary or non-voluntary leave, who made the final decision regarding his leave, and for additional information about the current acting CIO.
In a May 13 response, the department’s assistant secretary for legislative affairs, Nelson Peacock, said personnel and privacy rules prohibit DHS from discussing why Spires took elective leave from the agency and later resigned May 17.
Peacock said Spires was not placed in an administrative leave status because of disagreements concerning his authority as CIO but provided no further details. Concerning acting CIO Margie Graves, Peacock said she is fully qualified to serve in her current role and confirmed that she was hired as a Transportation Security Administration employee in 2003 and was not converted from a consultant position.
In a follow-up letter to DHS this week, Thompson pressed for more details, following the department’s refusal to provide adequate responses. This time, Thompson asked for a copy of Spires’ resignation letter; an explanation of why he was placed on leave and who played a role in making that decision; an explanation of who is empowered to make information technology decisions at DHS; and Graves’ employment history prior to being named acting CIO.
Amazon Web Services is the latest vendor to pass a rigorous security review for all federal cloud products and services.
Until now, only CGI Federal and North Carolina-based Autonomic Resources had completed the Federal Risk and Authorization Management Program (FedRAMP) review. The governmentwide program was launched in June to standardize security reviews of commercial cloud products and is housed within the General Services Administration.
Under the FedRAMP program, Amazon was granted an Authority to Operate (ATO) by the Health and Human Services Department. This means HHS has certified that Amazon’s GovCloud and regional cloud service offerings meet federal security standards, and the company’s services are authorized for use at HHS. The purpose of FedRAMP is for other agencies to save time and money by using or building on the security review HHS has done.
More than 300 government agencies are currently using Amazon Web Services, Teresa Carlson, vice president of worldwide public sector, said in a statement.
By June 2014, all cloud services and products in use at federal agencies or in an active acquisition process must meet FedRAMP requirements.
Agencies are on the hook to publicly release more digital data in a way that protects citizens’ personal information and does not compromise government security.
One challenge, however, will be determining how that data could be combined with existing public data to identify an individual or pose other security risks to agencies, according to experts speaking at ACT-IAC’s annual Management of Change conference this week.
“The awareness is there, the concern is there, [but] the practice of it is relatively immature,” said Mike Howell, deputy program manager in the Office of the Program Manager of the Information Sharing Environment. “The policy framework around how you prevent inadvertent aggregation of personal identifiable information [and] sensitive information, it’s a known problem. It’s good that people are paying attention, but it becomes incumbent on whoever the aggregator is what they do with that information.”
Howell, whose office falls under the Office of the Director of National Intelligence, highlighted the administration’s recent Open Data policy that refers to this issue as the mosaic effect. The policy memo, released this month, directs agencies to:
Consider other publicly available data, in any medium and from any source, to determine whether some combination of existing data and the data intended to be publicly released could allow for the identification of an individual or pose another security concern.
The challenge for many agencies, however, is that they’re struggling to understand what data they have, let alone what data is already in the public domain.
According to the policy, “it is the responsibility of each agency to perform the necessary analysis and comply with all applicable laws, regulations, and policies. In some cases, this assessment may affect the amount, type, form, and detail of data released by agencies.”
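The mosaic effect the policy describes is easiest to see with a toy example. The sketch below uses entirely hypothetical data: two datasets that are each harmless on their own, joined on shared quasi-identifiers (zip code, birth year, gender) to link an “anonymous” record back to a named person.

```python
# Hypothetical illustration of the "mosaic effect": combining two public
# datasets that are individually anonymous can re-identify individuals.

# Dataset 1: an agency release with direct identifiers (names) removed.
agency_release = [
    {"zip": "20001", "birth_year": 1975, "gender": "F", "case_outcome": "denied"},
    {"zip": "22314", "birth_year": 1982, "gender": "M", "case_outcome": "granted"},
]

# Dataset 2: an unrelated public source (say, a voter roll) that has names.
public_roll = [
    {"name": "Jane Doe", "zip": "20001", "birth_year": 1975, "gender": "F"},
    {"name": "John Roe", "zip": "22314", "birth_year": 1982, "gender": "M"},
]

def reidentify(release, roll, keys=("zip", "birth_year", "gender")):
    """Join the two datasets on the quasi-identifiers both contain."""
    matches = []
    for record in release:
        candidates = [p for p in roll
                      if all(p[k] == record[k] for k in keys)]
        if len(candidates) == 1:  # a unique match pins the record to a person
            matches.append((candidates[0]["name"], record["case_outcome"]))
    return matches

print(reidentify(agency_release, public_roll))
# Each "anonymous" outcome is now attached to a named individual.
```

This is exactly the analysis the policy asks agencies to perform before release: would any combination of the new data with what is already public make such a join possible?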
There’s a natural tension between releasing open data and securing it, said Donna Roy, an executive director in the Department of Homeland Security’s Information Sharing Environment Office.
Agencies have been instructed to:
- Collect or create only the information necessary for the proper performance of agency functions and that has practical utility.
- Limit the collection or creation of information that identifies individuals to what is legally authorized and necessary for the proper performance of agency functions.
- Limit the sharing of information that identifies individuals or contains proprietary information to what is legally authorized.
The General Services Administration is moving forward with plans to stand up a cloud broker contract for acquiring and managing the performance of federal cloud services.
The Department of Homeland Security is one of two agencies that have committed to testing GSA’s cloud broker model in a pilot program expected to launch this fall, said GSA’s Mark Day. Speaking Monday at the annual Management of Change conference in Maryland, Day said GSA will award one contract to test the concept of a broker model and reevaluate the pilot by year’s end to determine how it could be expanded.
GSA has not yet defined all the services a cloud broker would provide, but the National Institute of Standards and Technology defines a cloud broker as “an entity that manages the use, performance and delivery of cloud services and negotiates relationships between cloud providers and cloud consumers.” Technology research firm Gartner defines cloud brokerage as a business model in which an entity adds value to one or more cloud services on behalf of one or more cloud users.
Some question whether the cloud broker model will add value or end up costing agencies more money. In a Feb. 14 letter to Rep. Doris Matsui, D-Calif., GSA’s Lisa Austin said the cloud broker model could be more effective in creating ongoing competition among cloud providers, rather than awarding single contracts for each cloud service.
“Part of the pilot is really understanding what’s the right role, [and] what’s the right process” for a cloud broker model, Day told Federal Times. “We think we have an idea, but now we’ve got to test it.”
Day made clear that cloud brokers would not perform inherently governmental functions, such as contracting. It isn’t clear to what extent brokers would negotiate services between agencies and cloud service providers, but the hope is that cloud brokers will increase vendor competition, reduce pricing, and reduce the complexities of acquiring cloud services and integrating them with existing services.
Roughly 15 agencies are part of the cloud broker discussion, Day said. He would not name the second agency that has committed to testing the broker model because the agency has not announced it publicly.
The challenge for GSA has been attracting business to some of its existing federal contracts, rather than agencies launching their own contracts or using other agencies’ contracts. To garner greater use of its strategic sourcing contracts and future use of its cloud broker contract, GSA is meeting with agencies to determine their commitment to participate in market research and use the contracts, Day said. GSA can better leverage the federal government’s buying power, and vendors have an idea of what’s possible, in terms of business volume on a contract, he said.
On Nov. 27, 2012, at 3:38 p.m., an employee at Insight Systems Corp., which was bidding on a health services contract, submitted a revised quote to two employees inside the U.S. Agency for International Development.
The deadline for doing so was 5 p.m.
The message reached the first of three agency-controlled servers at 3:41 p.m., but then it got stuck. And it wasn’t until 5:18 p.m. that the email reached the first USAID employee, while the second employee didn’t receive the message until 5:57 p.m.
Around the same time, an employee at another company, CenterScope, which was submitting its own revised quote, sent a submission to the same USAID employees at 4:39 p.m., but that email did not reach the intended recipients until 5:15 p.m. and 6:08 p.m., respectively.
Too late, right?
Not according to U.S. Court of Federal Claims Judge Francis Allegra.
In a 22-page opinion released Monday, Allegra ruled in favor of both contractors in a recent complaint against USAID.
Aside from calling USAID’s decision to reject the quotes because they were late “arbitrary, capricious and contrary to law,” the ruling — in case you’re interested — provides a road map of a typical email message through a maze of internal servers.
In this case, the emails were received and accepted by USAID’s internal server, but they got stuck there for a while and weren’t forwarded to the next server because of an internal error.
The delays lasted more than two hours in some cases, and none of the messages reached their final recipients by the 5 p.m. deadline.
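The dispute boils down to which timestamp counts as “receipt.” A minimal sketch, using the times reported above for the Insight Systems email (the two labels reflect the competing interpretations, not anything in the opinion’s text):

```python
# Sketch of the timeline at issue: the quote reached the government's own
# server before the 5 p.m. deadline, but reached human inboxes after it.
from datetime import datetime

DEADLINE = datetime(2012, 11, 27, 17, 0)  # 5 p.m. submission deadline

sent          = datetime(2012, 11, 27, 15, 38)  # contractor hit "send"
at_gov_server = datetime(2012, 11, 27, 15, 41)  # first USAID-controlled server
first_inbox   = datetime(2012, 11, 27, 17, 18)  # first employee's inbox
second_inbox  = datetime(2012, 11, 27, 17, 57)  # second employee's inbox

# USAID's position: inbox delivery is what matters, so the quote is late.
print("late by inbox delivery:", first_inbox > DEADLINE)       # True

# The court's position: the quote was in the government's possession once
# its own server accepted it, so the quote is timely.
print("timely by server receipt:", at_gov_server <= DEADLINE)  # True
```

Both statements are true of the same email, which is why the case turned on a legal question rather than a factual one.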
Still, USAID sent both contractors letters days later saying their quotes wouldn’t be considered because, after all, late is late.
Allegra disagreed, sharply.
He went so far as to say USAID approached the question of the timeliness of electronic submission “with the zeal of a pedantic school master awaiting a term paper.”
He also ruled that he could see no reason why possession of the quotes could be effectuated any less through a government computer server than through a clerk in a mail room.
In the end, Allegra’s ruling bars USAID from making an award unless it accepts quotes from both contractors.
Or, he ruled, USAID could start all over with a new procurement.