The funding issues the FBI faces manifest in two primary areas: procurement, and data architecture and IT infrastructure. Problems with data architecture and IT infrastructure can be chalked up to limited resources and a lack of strategic planning, FBI personnel told the watchdog.
The congressional watchdog noted in its release that the VA has implemented six of its 29 open priority recommendations, including the deployment of an automated data tool used to improve acquisition workforce records and taking steps to modernize the agency’s performance management system across the Veterans Health Administration.
This blog post outlines how credit unions can use AWS to evaluate their compliance with Federal Financial Institutions Examination Council (FFIEC) and National Credit Union Administration (NCUA) requirements. Figure 1 features a high-level architecture for evaluating workloads for FFIEC and NCUA compliance.
These processes rely heavily on domain-specific data ingestion and processing for batch and continuous data sources. PNNL developed Aether as a reusable framework for sharing data and analytics with sponsors and stakeholders. Web applications in Aether are served up through API Gateway using Amazon S3 static hosting.
Five years into its existence, the federal organization charged with helping agencies establish best practices for the use, protection and dissemination of data is a year away from sunsetting and still waiting on the release of White House guidance critical to its advisory mission. Nick Hart, the founder of Data Foundation, a Washington, D.C.-based
“To safeguard the integrity of development and evaluation of proposals in the merit review process, this memo establishes guidelines for its use by reviewers and proposers.” “NSF is exploring options for safely implementing GAI technologies within NSF’s data ecosystem,” an agency spokesperson told FedScoop.
The new government website evaluation tool, which is called “gov metadata,” was created by Luke Fretwell and his son, Elias, as part of the Civic Hacking Agency, a project focused on technology for the public good. The GSA explains in its GitHub documentation that it focuses on collecting data that is helpful to specific stakeholders.
Margaret Boatner, deputy assistant secretary of the Army for strategy and acquisition reform: “We are targeting a couple of really key processes like our test and evaluation processes, and importantly, our cybersecurity processes. We’re going to sustain them for a long period of time.”
In the coming years, adversaries will use this evolutionary shift in compute capability to crack the cryptography that today is the bedrock of data security. Adversaries are harvesting encrypted data with an eye toward breaking the encryption as soon as the quantum capability is available. In some sense, the threat is already here.
The aim of the post is to help public sector organizations create customer experience solutions on the Amazon Web Services (AWS) Cloud using AWS artificial intelligence (AI) services and AWS purpose-built data analytics services. Data ingestion – once the data is prepared, you can ingest it into the analytics pipeline.
Efficiency concepts like lean initiatives and just-in-time delivery for managing inventory have been commonplace for years and have made figures like Deming household names in many business schools. Managing Risk Across the Supply Chain Remains Challenging.
With a prebuilt or custom framework, AWS Audit Manager maps your compliance requirements to AWS usage data. Once you create an assessment, Audit Manager collects evidence from multiple data sources as an ongoing process, capturing AWS Config evaluations as evidence for audits.
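A minimal sketch of what scoping such an assessment looks like. The parameter names follow the boto3 Audit Manager `create_assessment` call as documented; the framework ID, account number, services, and bucket name below are hypothetical placeholders, not values from the article.

```python
# Sketch: the shape of an AWS Audit Manager assessment request.
# Parameter names follow boto3's auditmanager.create_assessment API;
# the framework ID, account, services, and bucket are hypothetical.

def build_assessment_request(name, framework_id, account_ids, services, bucket):
    """Assemble the create_assessment payload that scopes evidence collection."""
    return {
        "name": name,
        "frameworkId": framework_id,  # prebuilt (e.g., FFIEC-style) or custom framework
        "assessmentReportsDestination": {
            "destinationType": "S3",
            "destination": f"s3://{bucket}",
        },
        "scope": {
            "awsAccounts": [{"id": acct} for acct in account_ids],
            "awsServices": [{"serviceName": s} for s in services],
        },
    }

request = build_assessment_request(
    "annual-compliance-audit",
    "hypothetical-framework-id",
    ["111111111111"],
    ["s3", "iam", "config"],
    "audit-evidence-bucket",
)
# Once created, Audit Manager collects evidence (for example, AWS Config
# evaluations) for every account and service in this scope on an ongoing basis.
```

In practice this dict would be passed to an `auditmanager` client; building it separately makes the evidence-collection scope easy to review before the assessment exists.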
Securing OT Environments Starts With Data
Advanced technologies can be used to secure OT and make it easier for IT teams to protect systems and networks. Artificial intelligence can help engineers identify network anomalies and analyze data for greater visibility. “And that is all about the underlying data,” Edwards said.
A new Connecticut law, in particular, zeros in on agency AI dependence. Beginning Dec. 31, 2023, the state’s Department of Administrative Services will conduct an annual inventory of all AI-based systems used by state entities, and each year the department must confirm that none of those systems leads to unfair discrimination or disparate impacts.
Agencies face a range of challenges. They may struggle to keep up with changing compliance frameworks, as well as strict data privacy and sovereignty rules. For those in the cloud, or on the way there, large data volumes can be problematic, as can the demand for specific tools, controls and policies.
Forecasting satellite usage has traditionally been challenging due to issues like demand spikes, the impact of weather on link conditions, and a lack of data science expertise to build machine learning (ML) models that predict bandwidth needs. A Jupyter notebook automates the data import, predictor, and forecast generation steps.
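To make the forecasting step concrete, here is a deliberately simple sketch that predicts the next period's bandwidth demand from a trailing window of past usage. The hourly usage numbers are invented for illustration; the workflow in the post uses trained ML predictors rather than this toy moving average.

```python
# Toy sketch of the forecasting step: predict next-hour satellite bandwidth
# demand from a trailing window of past observations. Real pipelines use
# trained ML predictors; this moving average only illustrates the idea.

def moving_average_forecast(history, window=3):
    """Forecast the next value as the mean of the last `window` observations."""
    if len(history) < window:
        raise ValueError("not enough history to forecast")
    recent = history[-window:]
    return sum(recent) / window

usage_mbps = [120, 135, 150, 180, 210, 90]  # hypothetical hourly link usage
forecast = moving_average_forecast(usage_mbps)  # (180 + 210 + 90) / 3 = 160.0
```

A notebook automating the real version of this would loop the same three stages the article names: import the usage data, fit the predictor, then generate forecasts.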
During the past three years, Kinnard has developed an AI use case inventory, highlighting 18 ways AI is used in the department. Her team also manages AI platforms, monitoring who uses them and what data flows through them. Models must be trained to make decisions and to understand what data means in context. Is My AI a Good Person?
The problem: here’s the catch. Quantum computers could also break many of the encryption algorithms we currently rely on to protect sensitive data. Importantly, OMB said agencies should prepare to protect their data from quantum computers trying to break their encryption. It’s mind-boggling!
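The "harvest now, decrypt later" pattern behind OMB's warning can be illustrated with a toy: ciphertext recorded today can simply be stored until the key becomes breakable. In this sketch the "breakthrough" is a brute force over a deliberately tiny one-byte XOR key; for real cryptography, quantum attacks such as Shor's algorithm against RSA/ECC play that role. The plaintext and crib are invented examples.

```python
# Toy "harvest now, decrypt later" illustration. An adversary records
# ciphertext today, then breaks it once the key space becomes searchable.
# A one-byte XOR key stands in for cryptography weakened by a future attack.

def xor_cipher(data: bytes, key: int) -> bytes:
    """XOR every byte with a single-byte key (encrypts and decrypts)."""
    return bytes(b ^ key for b in data)

harvested = xor_cipher(b"launch codes", 0x5A)  # adversary stores this today

def break_later(ciphertext: bytes, crib: bytes) -> bytes:
    """Years later: try every key until a known plaintext fragment appears."""
    for key in range(256):
        plaintext = xor_cipher(ciphertext, key)
        if crib in plaintext:
            return plaintext
    raise RuntimeError("no key recovered")

recovered = break_later(harvested, b"codes")  # the original plaintext
```

The point is that the security clock starts when the data is captured, not when the attack becomes feasible, which is why long-lived sensitive data needs post-quantum protection now.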
So for contractors, I think it becomes a challenge of making sure they understand the inventory of the customers that they’re working with and how the economic and funding environment is affecting those particular agencies so they can operate smarter, be more strategic about where they’re competing. They want both.
Historically, procurement for retail has been largely centralized: stores request and receive inventory from corporate as needed. Do retailers invest in automation and technology to enhance data-sharing between channels for a seamless omnichannel journey? Decentralize Procurement. Attack GNFR Spend.
Common Supply Chain strategies, such as just-in-time inventory, exacerbate the impact of shocks. Stockpiling inventory has been the most common response. Supplier risk management too often focuses on evaluating the risk level of each supplier and selecting lower risk ones when possible. Mitigating and Measuring Risk Impact.
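A small simulation makes the just-in-time point concrete: with no buffer, any missed delivery becomes a stockout, while a safety stock absorbs the same shock. The demand level, delivery schedule, and buffer size below are illustrative numbers, not figures from the article.

```python
# Minimal sketch of why just-in-time inventory amplifies supply shocks while
# stockpiling absorbs them. Demand is constant; the supplier misses two
# deliveries mid-simulation. All numbers are illustrative.

def stockouts(safety_stock, deliveries, demand=10):
    """Count periods where on-hand inventory cannot cover demand."""
    on_hand = safety_stock
    missed = 0
    for delivered in deliveries:
        on_hand += delivered
        if on_hand >= demand:
            on_hand -= demand
        else:
            missed += 1
            on_hand = 0
    return missed

supply = [10, 10, 0, 0, 10, 10]          # two-period supplier disruption
jit_missed = stockouts(0, supply)        # just-in-time: no buffer
buffered_missed = stockouts(20, supply)  # stockpile covers the gap
```

Running this, the just-in-time configuration misses demand in both disrupted periods, while the buffered one misses none, which is the trade the article describes: stockpiling spends carrying cost to buy resilience.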
Here it is essential to deploy technology that can digitize the complete procurement / AP process , to ensure quality data and access across all processes, and seamless process flows. This is a key reason vendors have rushed to offer full suites and plug any gaps they had. The Supplier Perspective.
This is often attributed to configuration management, total asset inventory, compliance with agency third-party security tools, and agency authorization documentation. The use of Auto Scaling in AWS enables customers to maximize their return on investment (ROI) after migrating to AWS.
According to Sajid Kunnummal, Vice President and CPO at Navistar, supply chain data visibility is the number one driving force behind creating resiliency. “Because we were proactive, we were able to build inventories ahead of time so that we had enough material in the pipeline before the crisis really hit peak.”
The FAR defines cost analysis as “the review and evaluation of any of the separate cost elements and profit or fee in an offeror’s or contractor’s proposal as needed to determine a fair and reasonable price or to determine cost realism….” The Government must not apply hindsight in evaluating cost reasonableness.
The agency is also in the process of evaluating two other sets of algorithms for general encryption and digital signatures “that could one day serve as backup standards,” NIST said. For secure government work and industry areas where security is key, “that data has long-term value,” Crowder said.
In fact, OMB is expected to release an updated governmentwide inventory of AI uses cases on or around Dec. The 2023 inventory showed there were more than 700 AI use cases, but the expectation is that number could easily double or triple given the excitement around generative AI and expanded use of predictive AI.
Currently, these use cases are not listed on the VA’s public inventory, which the department is required to update annually. The VA’s current AI guidance states that no “web-based, publicly available generative AI service has been approved for use with VA sensitive data.”
Agencies must also submit a compliance plan and an AI use case inventory. They will also be required to release all data used to develop and test their AI products.
Advancing Responsible AI Innovation
The Memo encourages responsible advancement of AI innovation within federal agencies.
The FY 2024 NDAA also includes the Federal Data Center Enhancement Act, the American Security Drone Act, and the Intelligence Authorization Act for FY 2024. Section 1521 allows the Chief Digital and Artificial Intelligence Officer to access and control any data collected, acquired, accessed or used by any component of the DoD.
Relatedly, the Department is authorized to evaluate risks associated with the likelihood that an IaaS product or provider may be used for malicious cyber-enabled activity, and recommend remediation measures to address such risks. There are more than 100 use cases listed on the VA’s 2023 AI inventory , with more than 40 in an operation phase.
We can see agencies using AI to speed up workflows, improve how the public interacts with federal information, reveal new insights in our data, and improve how we design and deliver programs. AI tools will be subject to rigorous assessment, testing, and evaluation before they may be used. But wait, there’s more!
First, some data points to animate the discussion. Interestingly, based on the data inputs for Deltek’s study, average profits were noticeably different for small, medium, and large contractors: 15%, 20%, and 18%, respectively. Further, various Agency FAR Supplements put forth a structured approach for evaluating profit.
The two discuss issues like data-driven decision-making, the speed of innovation, IT modernization, and increased use of contracting flexibilities. The episode can be found here. It is worth listening to and should prompt action.
President Issues Executive Order to Protect Americans’ Sensitive Data
On February 28, President Biden issued an Executive Order (EO) on Protecting Americans’ Sensitive Data from Foreign Adversaries. SAM currently requires contractors with over $7.5
There is a commonality of risks (e.g., discrimination, data governance risks) that would be better managed if there was a joined-up approach. Section 10.1(b)
When federal officials gathered at the Chief Data Officers Council symposium Thursday, it didn’t draw protesters outside the building with signs calling for more data and evidence in government. Unlike other challenges, such as conservation issues, data and statistics don’t have large bases that can be activated to speak up.
In a nutshell, management is going to inspect, gather information, communicate with stakeholders, perform analyses, evaluate results and procedures, update documents, file reports, and ensure that efforts are documented in case of audit by a Contracting Officer or the U.S. More on that below. So, what does EEO housekeeping look like?
The hallmark of that effort to date had been a proposed rule that would, if finalized, require thousands of federal contractors to inventory, publicly disclose, and, in some cases, seek reductions in GHG emissions (see our prior discussion here ). Will a contractor’s SC-GHG be an evaluation factor in a contractor’s procurement proposal?
The memo strongly emphasizes AI innovation, instructing agencies to build IT infrastructure to support AI, collect data to train AI and evaluate potential applications of generative AI. Notably, challenges with AI inventories were the subject of a major Stanford report published in 2022.
After working with Amazon Web Services (AWS) Partner SoftwareONE to migrate its crisis hotline to Amazon Connect in the AWS Cloud, United Way now boasts enhanced call routing, data collection, and supervisory tools. That also means the service is an intermediary to many other organizations, all of which have their own data and processes.
Among other things, the memo mandates that agencies establish guardrails for AI uses that could impact Americans’ rights or safety, expands what agencies share in their AI use case inventories, and establishes a requirement for agencies to designate chief AI officers to oversee their use of the technology.
The Department of State recently removed several items from its public artificial intelligence use case inventory, including a behavioral analytics system and tools to collect and analyze media clips. OMB has previously stated that agencies “are responsible for maintaining the accuracy of their inventories.”