LCG’s Gen AI Capabilities

About LCG

For 30 years, LCG, Inc. (LCG) has steadfastly provided IT consulting, modernization, and support services to the federal government. Our diverse team of technologists implements innovations—providing strategic vision, transparent leadership, and experienced service delivery. Our collaboration extends to more than 40 federal agencies, including 21 of the 27 Institutes and Centers (ICs) within the National Institutes of Health (NIH).

Call to Action

Across the globe, governments are at the forefront of shaping the future of GenAI—acknowledging its vast implications and its novel challenges. This pivotal moment in technological history calls for deliberate and strategic action. Policymakers are designing robust governance frameworks, aiming to harness AI’s potential responsibly and equitably. These frameworks are not merely regulatory measures but foundational cornerstones for the safe and beneficial integration of AI into our societies.

In this context, federal Offices of the Chief Information Officer (O-CIOs) are presented with an unparalleled opportunity. By embracing guided experimentation with GenAI, O-CIOs can lead by example—preparing their teams for the impending technological evolution and contributing significantly to the development of comprehensive AI governance mechanisms. This proactive approach does more than ready the federal IT landscape for future challenges: it places O-CIOs at the heart of shaping the ethical, responsible, and inclusive deployment of AI technologies.

Transforming Government Operations

Empowering the Federal Government with Generative AI

In the realm of federal information technology, the advent of Generative Artificial Intelligence (GenAI) is not just another wave but a seismic shift—heralding an era of unprecedented innovation. Despite AI being a cornerstone of technological research and development for decades, GenAI emerges as a beacon of transformation. It represents not merely an evolution but a revolution, redefining the paradigms of business operations and value creation.

This transformative technology, bolstered by significant investment and development, is paving the way for its integration into the daily mechanisms of various sectors. Its ability to enhance decision-making processes, streamline operations, and foster innovation positions GenAI as a critical tool in the arsenal of federal IT—ready to redefine the contours of government efficiency and service delivery. When blended with governance and aligned with regulatory expectations, GenAI will deliver efficiencies while ensuring that these advancements are both effective and compliant.

Acknowledging the Implications

GenAI distinguishes itself from previous technological milestones: despite its widespread applicability, it demands in-depth examination. The adoption of Large Language Models (LLMs) into everyday tools marks a significant transformation in how we work, streamlining and enhancing operations in ways previously unimagined. This evolution emphasizes the need for a deep understanding of how GenAI can reshape current business models and unlock new avenues for operational efficiency.

This narrative encourages O-CIOs to adopt a forward-looking stance on GenAI and experiment with its capabilities even before fully fledged regulatory frameworks are in place. The swift pace at which this technology is developing, outstripping former technological benchmarks, underscores the importance of early engagement. Our guide aims to highlight the criticality of embracing ethical GenAI adoption and offers a practical roadmap for federal entities poised to pioneer in this dynamic field.

Vision to Reality: LCG’s Journey with Generative AI

As a Microsoft Partner, LCG’s dedication goes beyond mere words—it is a reflection of our commitment to continual innovation and our prominent position as a leading integrator of Microsoft Solutions. Embarking early on the GenAI wave, our venture started with integrating OpenAI models via Azure OpenAI Services. Our position as an early adopter and thought leader in the GenAI landscape has elevated our status as a trusted advisor to federal Chief Information Officers (CIOs). LCG is instrumental in simplifying GenAI for federal CIOs while emphasizing strategic foresight and readiness for the future. Our initiatives equip agency leaders for GenAI adoption, opening an era of experimental exploration into its capabilities for transformative decision-making and laying a foundation for future operational enhancements.

Our AI Engineering Team stands at the helm, integrating visionary strategy with industry-leading practices, robust security measures, and compliance with federal Executive Orders, Presidential Directives, and Mandates. Our mission is to empower our Government and Public Service partners to achieve their AI objectives by offering an extensive portfolio of services designed for maximal impact.

How We Can Help

GenAI Awareness Campaigns

LCG’s “GenAI Roadshow,” launched in Quarter 4 of 2023 for our federal customers, demystifies GenAI with an engaging series of discussions and demonstrations. We highlight the transition from “Narrow AI” to GenAI’s broader applications, including practical Azure OpenAI integrations in service desk operations and grants management. Our demonstrations and interactive sessions aim to spark curiosity and encourage exploration among attendees, showcasing real-world GenAI applications.

GenAI Strategic Planning

Leveraging over two decades of experience with federal CIOs, LCG guides O-CIOs through the evolving GenAI landscape from blueprinting to strategy development. Our approach prioritizes evaluation and experimentation, offering tailored guidance on strategic planning, acquisition support, and implementation—empowering CIO Organizations to navigate GenAI integration effectively.

GenAI Development

Our AI Engineering Team leads GenAI development initiatives, enhancing digital ecosystems through large language models like GPT via Azure OpenAI Services. From content creation to decision-making and workflow automation, our controlled pilot projects balance risk and innovation by incorporating AI-assisted tools like GitHub Copilot for efficient and modern software development.
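
To make this concrete, here is a minimal, illustrative sketch of the kind of building block such a pilot might start from: a small helper that calls an Azure OpenAI chat deployment to draft content. The endpoint, key, and deployment name gpt-35-turbo are assumptions for the example, not LCG project code.

```python
"""Illustrative content-drafting helper built on Azure OpenAI (a sketch, not production code)."""
import os

from openai import AzureOpenAI  # pip install openai>=1.0

# Assumed environment variables; substitute your own endpoint and key.
client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
)


def draft_summary(notes: str, deployment: str = "gpt-35-turbo") -> str:
    """Turn rough project notes into a short, factual status summary."""
    response = client.chat.completions.create(
        model=deployment,  # the Azure *deployment* name, not the base model name
        temperature=0.3,
        messages=[
            {"role": "system",
             "content": "You draft concise, factual status summaries for federal IT projects."},
            {"role": "user",
             "content": f"Summarize the following notes in five bullet points:\n{notes}"},
        ],
    )
    return response.choices[0].message.content


if __name__ == "__main__":
    print(draft_summary("Migrated 3 workloads to Azure; Copilot pilot started; UAT slips one week."))
```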

GenAI Integration Services

LCG specializes in integrating GenAI and LLMs into client workflows to refine business processes for efficiency and enhanced customer experiences. Our focus on precise adjustments and optimal resource use supports discovery projects in Grants Management and Customer Service, enabling our clients to fully leverage GenAI for transformative business improvements.

Azure OpenAI Onboarding

Our workshop provides an in-depth introduction to GenAI and Azure OpenAI for NIH use cases. Participants explore GenAI concepts, Azure OpenAI services, and practical applications through live demonstrations and hands-on exercises. This comprehensive session covers everything from setup to custom application development, guided by best practices and tailored adoption pathways for NIH integration.

Use Cases

Ticket Quality Analyzer: Elevating Service Desk Operations

In the realm of Service Desk operations, manual ticket quality assessments present significant challenges, leading to inefficiencies and potential service quality issues. LCG’s Ticket Quality Analyzer, powered by Azure OpenAI, automates the analysis of service desk ticket data, streamlining the process and ensuring adaptability to evolving service standards. This AI-powered solution proactively identifies potential issues, allowing for timely interventions and preventing service disruptions. By shifting the focus from labor-intensive manual tasks to strategic initiatives, the Ticket Quality Analyzer enhances overall service quality and operational efficiency, positioning the Service Desk for future scalability. 
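
The sketch below shows the general pattern such an analyzer could follow: send one ticket and a scoring rubric to an Azure OpenAI chat deployment and parse the structured reply. The rubric, deployment name, and environment variables are illustrative assumptions, not the product’s actual implementation.

```python
"""Sketch of an AI-assisted ticket quality check (illustrative only)."""
import json
import os

from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
)

# Hypothetical rubric; a real one would reflect the organization's service standards.
RUBRIC = (
    "Rate the service desk ticket from 1 to 5 on each criterion: clear problem statement, "
    "reproduction steps, impacted user or system, correct categorization, and resolution notes. "
    "Reply with JSON only, using keys 'scores' (object) and 'gaps' (list of strings)."
)


def analyze_ticket(ticket_text: str, deployment: str = "gpt-35-turbo") -> dict:
    response = client.chat.completions.create(
        model=deployment,
        temperature=0,
        messages=[
            {"role": "system", "content": RUBRIC},
            {"role": "user", "content": ticket_text},
        ],
    )
    # The prompt asks for JSON only; production code would validate and fall back gracefully.
    return json.loads(response.choices[0].message.content)
```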

Interactive Knowledge Base: Streamlining Scientific Research and Support

In scientific research and lab support, manually sifting through extensive documentation to extract relevant information is a significant challenge, leading to delays in decision-making and support tasks. LCG’s Interactive Knowledge Base, powered by Azure OpenAI and utilizing the Retrieval-Augmented Generation (RAG) technique, addresses this issue by providing a sophisticated chatbot interface for quick, precise answers. This innovative solution integrates with an agency-provided knowledge base, ensuring a secure FedRAMP High platform that adapts to evolving research needs. By enhancing accessibility to relevant information, this solution reduces time spent on information retrieval, boosts productivity, and fosters collaboration among researchers and support staff, promoting a culture of knowledge-sharing and innovation. 
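
At its core, the RAG pattern is simple: embed the question, retrieve the most similar passages from the knowledge base, and let the model answer only from that context. The sketch below illustrates the idea with a tiny in-memory document list and assumed Azure OpenAI deployments (text-embedding-ada-002 and gpt-35-turbo); the production solution retrieves from the agency-provided knowledge base instead.

```python
"""Minimal Retrieval-Augmented Generation (RAG) sketch over a toy in-memory knowledge base."""
import math
import os

from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
)

DOCS = [  # stand-in for an agency-provided knowledge base
    "Centrifuge SOP: balance tubes before each run and log maintenance weekly.",
    "Sample storage: keep reagent X at -80C and discard 30 days after thawing.",
    "Lab access: badge requests are processed by the facilities help desk.",
]


def embed(text: str) -> list:
    return client.embeddings.create(model="text-embedding-ada-002", input=[text]).data[0].embedding


def cosine(a, b) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))


def answer(question: str, k: int = 2) -> str:
    # Rank documents by similarity to the question, keep the top k as context.
    q_vec = embed(question)
    context = "\n".join(sorted(DOCS, key=lambda d: cosine(q_vec, embed(d)), reverse=True)[:k])
    response = client.chat.completions.create(
        model="gpt-35-turbo",  # assumed deployment name
        messages=[
            {"role": "system",
             "content": "Answer only from the provided context. If the answer is not there, say so."},
            {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"},
        ],
    )
    return response.choices[0].message.content


print(answer("How long can reagent X be used after thawing?"))
```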

Compliance Screening of Grants Applications: Enhancing Efficiency and Accuracy

The manual screening of grant applications is a vital but labor-intensive process, consuming significant time and resources. LCG’s Compliance Screening of Grants Applications, powered by Azure OpenAI, addresses this challenge by automating the initial screening step. This innovative solution significantly reduces the time required for compliance checks, mitigating human error and bias. By streamlining the process, it introduces scalability and adaptability to evolving compliance standards. The AI-driven approach enhances workflow efficiency, improves accuracy, and promotes a more equitable and effective grant screening process, aligning with LCG’s strategic focus on resource optimization and risk mitigation. 
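
A first-pass screen of this kind can be sketched as a checklist prompt whose output is always routed to a human reviewer. The criteria below are hypothetical placeholders, as are the deployment name and environment variables; they are not the actual screening rules used in the solution.

```python
"""Illustrative first-pass compliance screen for a grant application (human review still required)."""
import os

from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
)

# Hypothetical criteria; real criteria come from the funding opportunity announcement.
CRITERIA = [
    "The 12-page limit for the research strategy is respected.",
    "A data management and sharing plan is included.",
    "A budget justification is present for every line item.",
]


def screen(application_text: str, deployment: str = "gpt-35-turbo") -> str:
    checklist = "\n".join(f"- {c}" for c in CRITERIA)
    response = client.chat.completions.create(
        model=deployment,
        temperature=0,
        messages=[
            {"role": "system",
             "content": "You flag potential compliance issues for a human reviewer. "
                        "For each criterion answer PASS, FAIL, or UNCLEAR with a one-line reason."},
            {"role": "user", "content": f"Criteria:\n{checklist}\n\nApplication:\n{application_text}"},
        ],
    )
    return response.choices[0].message.content  # a reviewer, not the model, makes the final call
```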

Candidate Matching: Revolutionizing Recruitment Efficiency

The modern job market demands a seamless recruitment process, yet manual resume screening against job descriptions is time-consuming and error-prone, potentially overlooking highly qualified candidates. LCG’s Candidate Matching solution, powered by Azure OpenAI, automates the resume matching and scoring process, significantly enhancing recruitment efficiency. This AI-augmented prototype allows users to upload multiple resumes, matches them against job descriptions in a SharePoint repository, and provides detailed alignment scores. With configurable match thresholds, this solution offers flexibility in recruitment criteria. By automating these tasks, it enables recruitment teams to focus on strategic talent acquisition, ensuring fair and objective candidate evaluations, and ultimately fostering a more effective and agile recruitment process.
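
One common way to implement such matching is to embed the job description and each resume, score the pairs by cosine similarity, and keep candidates above a configurable threshold. The sketch below shows that pattern with a plain dictionary standing in for the SharePoint repository; the embedding deployment name and the 0.80 threshold are assumptions.

```python
"""Illustrative resume-to-job matching via embedding similarity and a configurable threshold."""
import math
import os

from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
)


def embed(text: str) -> list:
    return client.embeddings.create(model="text-embedding-ada-002", input=[text]).data[0].embedding


def cosine(a, b) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))


def match_candidates(job_description: str, resumes: dict, threshold: float = 0.80):
    """Return (candidate, score) pairs at or above the threshold, best match first."""
    job_vec = embed(job_description)
    scored = {name: cosine(job_vec, embed(text)) for name, text in resumes.items()}
    return sorted(((n, s) for n, s in scored.items() if s >= threshold),
                  key=lambda pair: pair[1], reverse=True)


# Example call; in the actual solution, resumes come from a SharePoint document library.
print(match_candidates("Cloud engineer with Azure and Python experience.",
                       {"Candidate A": "Azure DevOps engineer, Python automation, ARM templates.",
                        "Candidate B": "Graphic designer, Adobe Creative Suite, branding."}))
```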

Chanaka Perera is Chief Technology Officer at LCG.

Embracing AI/Cloud Collaboration Between Industry, Government Consultants, and the Federal Government

LCG Inc. was joined by cloud and AI experts from Microsoft for a Federal CIO Roundtable designed to share ideas and learn about advances in cloud and artificial intelligence (AI). The state-of-the-art virtual conference was held at Microsoft’s new Envision Theatre in Arlington, Virginia.

LCG’s CTO, Chanaka Perera, led the roundtable with experts from Microsoft’s cloud solution and Azure data federal practice in a robust discussion of the responsible application of generative AI (GenAI), a hot topic dominating the IT space. The group of 50+ attendees listened in and interacted with experts regarding security and long-term planning.

These issues and many others were top of mind for the CIOs in attendance. A major theme was GenAI’s great potential to increase productivity despite fears of AI replacing people’s jobs. Industry consensus is that human oversight and involvement will always be needed to minimize potential negatives such as “hallucinations” and to counteract misunderstandings about the human-machine interface.

Another evolving concern is the trade-off between taking the time to test and evaluate and the pressing need to lead and participate in worldwide advances in the AI space. The White House Executive Order on AI, issued October 30, 2023, addresses this same sense of urgency, balanced against the need to develop and use AI safely and responsibly.

Highlights from the roundtable include:

Fine-tune your skills at prompt engineering.

To use GenAI effectively, you first need to figure out what you’re looking to solve. For example, you can tell GenAI to:

  • analyze government grants for the most recent fiscal year, OR
  • create a table based on grant allocation over the most recent fiscal year and include recipient agencies, grant amounts, and project objectives.

You still need to check the results. Use GenAI to inform you, but you are the decision-maker.
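
To see the difference specificity makes, a small script like the one below (with an assumed Azure OpenAI deployment named gpt-35-turbo and endpoint and key in environment variables) can send both versions of the request and let you compare the outputs side by side.

```python
"""Comparing a vague prompt with a specific one (illustrative sketch)."""
import os

from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
)

VAGUE = "Analyze government grants for the most recent fiscal year."
SPECIFIC = ("Create a table of grant allocations for the most recent fiscal year with columns: "
            "recipient agency, grant amount, and project objectives.")

for prompt in (VAGUE, SPECIFIC):
    reply = client.chat.completions.create(
        model="gpt-35-turbo",  # assumed Azure deployment name
        messages=[{"role": "user", "content": prompt}],
    )
    print(prompt, "\n---\n", reply.choices[0].message.content, "\n")

# Either way, a person reviews the output: the model informs, the analyst decides.
```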

Welcome your new brainstorming partner.

GenAI is very useful for exploring possibilities. Ask the GenAI model what the best approach would be for solving a particular problem. GenAI is all about stats and probabilities and numbers – it’s not a sentient being. Using GenAI can enable us to consider other options like we do in brainstorming sessions. GenAI can help make this a very quick and powerful process.
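
Under the hood, "stats and probabilities" really is all there is: the model repeatedly samples the next token from a probability distribution. The toy snippet below, with made-up probabilities and no real model involved, shows why the same prompt can yield different, occasionally odd, suggestions.

```python
"""Toy illustration: text generation is repeated sampling from a probability distribution."""
import random

# Made-up next-token probabilities after a prompt ending in "The grant was ..."
next_token_probs = {"approved": 0.55, "denied": 0.25, "delayed": 0.15, "purple": 0.05}

tokens = list(next_token_probs)
weights = list(next_token_probs.values())

# Ten samples: likely tokens dominate, but improbable ones ("purple") still surface sometimes,
# which is one reason a human always reviews the output.
print([random.choices(tokens, weights=weights, k=1)[0] for _ in range(10)])
```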

Troubleshooting proof of concept.

GenAI can help troubleshoot potential solutions and quickly develop proofs of concept. LCG’s Perera agreed, “With Microsoft’s help, LCG is an early adopter of Azure OpenAI. We’re fortunate to be able to test out capabilities and the possibilities in this sandbox so we can best advise and provide counsel to our federal agency clients.”

Enterprise security is here.

With OpenAI’s public version of GenAI, for example ChatGPT, users are cautioned against sharing personal or confidential information.*

However, many attendees were surprised to learn that Azure OpenAI is Zero Trust secure and is now included within the US FedRAMP High authorization for Azure Commercial. Just sign up within your federal cloud space and let your technologists start experimenting with infrastructure in the sandbox.

Be aware of whether you are operating within protected spaces like Enterprise and Azure Commercial.
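
For teams experimenting inside such protected spaces, one practical pattern is keyless access: authenticate to Azure OpenAI with Microsoft Entra ID instead of storing an API key. The sketch below assumes the azure-identity package and an endpoint your signed-in identity is authorized to use; details vary by tenant.

```python
"""Keyless Azure OpenAI access via Microsoft Entra ID (no API key in code or config)."""
import os

from azure.identity import DefaultAzureCredential, get_bearer_token_provider  # pip install azure-identity
from openai import AzureOpenAI

# DefaultAzureCredential picks up the signed-in identity (Azure CLI, managed identity, etc.).
token_provider = get_bearer_token_provider(
    DefaultAzureCredential(), "https://cognitiveservices.azure.com/.default"
)

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],  # your resource endpoint (assumed env var)
    azure_ad_token_provider=token_provider,
    api_version="2024-02-01",
)
# From here, chat and embedding calls look exactly the same as with key-based access.
```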

Every day there are developments in the AI space.

For example, note that Bing Chat Enterprise is now part of Copilot. Below is the response to a question about whether using it is secure.

On November 15, 2023, Microsoft announced that Bing Chat and Bing Chat Enterprise would become known as Copilot, with commercial data protection enforced when any eligible user is signed in with Microsoft Entra ID.

CONCLUSION

It’s clear the speed of IT innovation is accelerating the development of new capabilities. As GenAI becomes a part of large suites of applications, it will be imperative that industry and the Federal Government collaborate closely to share knowledge, use cases, and best practices.

* Note that this caution is part of LCG’s corporate AI policy for employees to protect confidential and personal information.

NIH SharePoint Modernization

Congratulations to LCG’s team at NIH’s Office of Strategic Coordination (OSC), who diligently worked over three years to modernize and upgrade 35 SharePoint sites from the legacy on-premises SharePoint 2019 platform to SharePoint Online. Our team developed custom interactive and responsive components using SharePoint Framework (SPFx), React, and TypeScript, allowing seamless integration with SharePoint data and services for a personalized and modern user experience.

When modernizing legacy tech, translating old functionality to new frameworks remains essential to delivering on customer needs. The team worked to make the custom SharePoint Online site more functional while easing the end-user transition by retaining familiar SharePoint 2019 elements.

The team integrated features such as breadcrumb navigation, created reusable component libraries to modularize the site design and architecture, developed User Acceptance Testing (UAT) cases, and addressed challenges such as custom actions that could not be translated 1:1 from SharePoint 2019 to SharePoint Online.

The following team members worked directly on this migration project and received high satisfaction marks from NIH’s OSC:

  • Sreepallavi Thota
  • Raja Ganapathi
  • Rupinder Kaur
  • Sumitra Sampath
  • Fern Wildesen

Ensuring Success in NIDCD’s Return to Physical Workspace Effort

Many workplaces faced new challenges transitioning to virtual workspaces during the COVID-19 pandemic. COVID control methods have changed throughout the past three years, meaning that organizations may now find themselves evolving their workspaces to meet critical goals. LCG teammates played important roles on the Return to Work team at the National Institute on Deafness and Other Communication Disorders (NIDCD), receiving the coveted NIDCD Director’s Recognition award. 

The Return to Work program at the National Institutes of Health (NIH) supports the transition back to a physical workspace. Many departments needed to create flexible workspaces to welcome employees back to a hybrid work environment, while also being mindful of the needs of employees.

LCG’s Return to Work team at NIDCD coordinated and established hoteling spaces, refit client offices for in-person use, and acclimated users to the new hybrid workplace. The following LCG teammates on the Return to Work team received the NIDCD Director’s Recognition:

  • Dan Berger
  • Christopher Adams
  • Brandon Gomez
  • Hemalben Harkhani
  • Naveen Lanke
  • Dagmawi Jiru

Back It Up – The Importance of Redundancy

World Backup Day is March 31st, 2023. It’s a great day to remind ourselves of the importance of backing up personal and professional data. From family photos to important presentations, you’ve likely felt the pain of forgetting to do backups.

“Failure is Always an Option” – Adam Savage

Whether you are a MythBuster or a software developer, there is no shortage of issues that can result in data loss: software bugs, malware, environmental factors, accidental deletion. Without a robust backup solution, your data is constantly at risk. Several types of backup options can mitigate this risk. Let’s take a look at a few that can help you maintain access to your important files no matter what gets thrown your way.

3-2-1… Countdown for Successful Data Protection

Having one backup copy of your critical work data is sometimes not enough. Let’s assume you back up your files or your application data to an external drive, and keep this drive in your home office or at your work desk. If your laptop fails, you have a backup copy for immediate recovery. But if a water pipe bursts in your home, or there’s an issue at your desk onsite, your computer and the external drive backup will likely both be destroyed. This is a good reason to keep at least two backup copies of your critical data, with one copy residing in an offsite location. Examples of offsite locations include simply saving to the cloud or using a datacenter that specializes in housing backup data.
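
As a small, hypothetical illustration of the idea, a script like the one below refreshes two backup copies of a working folder: one on an external drive and one in a folder synced to the cloud. The paths are assumptions you would replace with your own.

```python
"""Minimal 3-2-1-style sketch: a working copy plus two backups, one of them offsite."""
import shutil
from pathlib import Path

source = Path.home() / "Documents" / "projects"        # the working copy
local_backup = Path("/Volumes/ExternalDrive/backups")  # second copy on different media (assumed mount point)
cloud_backup = Path.home() / "OneDrive" / "backups"    # third copy synced offsite (assumed sync folder)

for destination in (local_backup, cloud_backup):
    destination.mkdir(parents=True, exist_ok=True)
    target = destination / source.name
    # dirs_exist_ok lets repeated runs refresh an existing backup (Python 3.8+).
    shutil.copytree(source, target, dirs_exist_ok=True)
    print(f"Copied {source} -> {target}")
```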

OneDrive – One terrific offsite solution is Microsoft’s OneDrive. If you have a Microsoft 365 subscription, you may have access to 1 TB (terabyte) of OneDrive cloud storage. OneDrive can be used to back up specific data, but it can also be set up to back up all your local folders, such as Documents, Downloads, and Desktop, automatically. This “set it and forget it” configuration means your data is stored in the cloud in the event something happens to your PC. This approach has the added benefit of giving you access to your files anytime and anywhere. For more information, simply access Microsoft’s OneDrive backup instructions online.

Even with the redundancy offered by a large company such as Microsoft, it’s a good idea to back up your cloud data and email to a third-party provider for even more offsite availability. Whether you are a web developer at a federal agency or a systems administrator at a business like LCG, your data easily climbs into multiple terabytes in the cloud. Having a robust strategy is critical to personal and business success.

Remember – your backup strategy is only successful when implemented, so save your work today. 

Hold on let me go save this…

Daniel Trencher is the Systems Administrator in IT and Network Support at LCG.

LCG Passes ISO 9001:2015 Surveillance Audit, Recognizing the Quality of its Performance

LCG has successfully passed its ISO 9001:2015 Surveillance Audit of its Quality Management System (QMS), validating its commitment to standardizing and maintaining the highest level of quality service to customers and verifying its continuous improvement of processes.

In 2022, LCG formalized its continued commitment to quality by forming a Quality Control department, which also includes Client Relationship Management (CRM). Before the creation of the Quality Control department, LCG was a small company that attained and maintained quality certifications organically, by applying leadership’s vision and holding employees to high standards.

Following best practices, the Quality Control department works closely with the Service Delivery team and the Project Management Office (PMO) to ensure customer service quality remains high. At the same time, they work across the company to implement processes on an ongoing basis to document and improve service delivery and operations.

Achieving ISO 9001 certification is intensive, with annual audits to show that the company has practices in place that demonstrate its level of quality. The International Organization for Standardization (ISO) provides a library of standards, which the company then fits to its goals and vision.

The current certification was awarded in 2020 and is maintained every year through a “surveillance audit” during which specific areas of the company are selected for audit. The following year, a different area of the company is audited. A member of the executive leadership team is present throughout the audit.

LCG will undergo a full audit of its 2023 quality performance in 2024.

In the words of Carlene Carter, LCG’s Vice President of Quality & CRM, “We work from the inside out, maintaining a solid internal infrastructure, reducing waste, and increasing productivity. Our Service Delivery teams work on improving our customers’ experiences every day. We monitor our success through customer surveys and CPARs. We also socialize the importance of maintaining high standards, have a company-wide Process Improvement Committee, and even run mock audits with heads of departments.”

The company performed exceptionally well, with LCG passing the Surveillance Audit with zero non-conformities at any level.

A Guide to Generative AI and its Potential

Generative AI models aren’t just speculative science fiction anymore. It’s important to be cautiously optimistic in this pivotal moment, as many companies and individuals explore the role that AI models may play in society. As with any game-changing technology, it’s necessary to consider the historical view, as well as to understand our roles in building a brighter, better future. 

The brilliance of the technologies of today’s second machine age is on full display. Generative Artificial Intelligence (AI) is one of the most remarkable innovations in recent years. It has the potential to transform our lives in countless ways – from boosting our productivity to providing personalized healthcare – and even creating virtual assistants that can provide a natural language interface. 

ChatGPT, one of these Generative AI models, appears to be the first step across the transformative threshold for entire industries. Many are excited and many are fearful. It’s important to examine our initial reactions to create a better understanding of this technology and where it may lead. 

Looking back at the industrial revolution and the first machine age in the late 1700s, we find similarities to today’s response. Many critics of industrialization believed that the capitalist system would inevitably replace human labor with machines. Industrialization did in fact replace certain types of labor, but new opportunities for employment materialized; people adapted to the new conditions driven by industry. 

During the industrial revolution, remarkable physical engines augmented the capabilities of the human body. Today, the ongoing computing revolution of the second machine age is focused on augmenting human minds – and Generative AI is the perfect case study. Let’s evaluate it with an open mind.

The cautiously optimistic technologist ethicist in me calls for the following considerations.

First – We all must take the time to make responsible AI. At the very least, we should understand the ethical considerations involved in the development and deployment of AI technology so that we can ensure that they are safe, fair, transparent, and accountable.

Second – What is Generative AI, and what is the big deal? Generative AI is a field of machine learning (ML) technology that learns to predict the correct output from user input by training on massive datasets. It can output content of all types: textual narratives, music, and even visual art. However, a Generative AI model cannot perform the broad-reaching intellectual tasks that humans perform, as its focus is primarily on generating content within a specific medium. The limited scope of current Generative AI models is known as “Narrow AI,” aka “Weak AI,” compared to the more generalized knowledge pursued by the conceptual “Artificial General Intelligence (AGI),” or “Strong AI,” models first speculated about during the 1950s.

Third – What does all this mean for society? As with any new disruptive technology, a balanced mindset allows for the analysis of potential benefits and risks. We find ourselves in a time of great uncertainty and upheaval, with Generative AI now available to the general public for free or at low cost.

As we navigate the uncertainty about Generative AI’s role in society, I hope we can approach it with curiosity. Keeping an open mind and seeking ways to harness AI’s power for the greater good will allow us to make informed decisions about its uses and applicability based on its capabilities and constraints.

There are several ways to learn and evaluate Generative AI, starting with ChatGPT, a platform immediately available to probe for its abilities and limitations. In addition to ChatGPT, you can explore other Generative AI solutions through resources such as Base10, which provides a wealth of information and tools to help you learn more about the latest trends and developments in the field of Generative AI. 

Advanced users looking to explore AI development, the OpenAI APIs, and Azure OpenAI Services can find advanced features and tools to help create robust and scalable AI solutions. A recent development is the availability of OpenAI’s ChatGPT and Whisper models to developers through an API, at roughly ten times lower cost than the existing GPT-3.5 models. These services are designed to integrate with systems and workflows, so you can start building and deploying advanced AI solutions quickly and easily.
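
As one illustrative example of what those APIs look like in practice, the sketch below transcribes a hypothetical recording with Whisper and then summarizes the transcript with a GPT-3.5-class chat model; the file name and model identifiers are assumptions you would adapt to your own account.

```python
"""Illustrative OpenAI API use: transcribe audio with Whisper, then summarize with a chat model."""
import os

from openai import OpenAI  # pip install openai>=1.0

client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])

# Transcribe a hypothetical meeting recording with Whisper.
with open("meeting.mp3", "rb") as audio_file:
    transcript = client.audio.transcriptions.create(model="whisper-1", file=audio_file)

# Summarize the transcript with a GPT-3.5-class chat model.
summary = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": "Summarize the transcript in three bullet points."},
        {"role": "user", "content": transcript.text},
    ],
)
print(summary.choices[0].message.content)
```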

In conclusion, I would like to leave you with this thought. Modern technology will continue to evolve at a rapid pace, with increasingly advanced capabilities becoming accessible to the public. As early adopters, we should work to maximize its positive impact while minimizing any adverse consequences. By doing so, we can set an example for others to follow and help to realize the full promise of Generative AI.

Chanaka Perera is Chief Technology Officer at LCG.

Engineering at All Stages of the Health IT Lifecycle 

This Engineering Week, LCG recognizes the contributions engineering has made to Health IT and biomedical research support.

LCG helps accelerate digital government strategy and programs so that agencies and their critical systems can stay ahead of the curve in a rapidly shifting digital landscape. From initial project conception to finished product, the development process depends on the expertise that engineers bring. Innovation is crafted by decisions that shape the pathway from idea to reality. Through the work of LCG’s many engineers, we enable more efficient, productive, and speedy customer transformations.

For instance, in the design and development of Health IT solutions, engineers at LCG are there at every phase, from design and development to creating interfaces, developing algorithms, and designing databases. On top of that existing complexity, the engineering team then integrates systems to be sure that all components work together seamlessly.

With many Health IT products outputting large amounts of data, there is also a need for analysis and insight. Through the phases of development, engineers working alongside healthcare professionals can ensure that designs of databases, algorithms, and management of key records allow for robust data analysis. This enables healthcare professionals to make informed decisions about diagnosis, treatment, and care management.

As with any sensitive data, cybersecurity is of utmost importance. All products must be secured to protect patient data from theft in any form. Cybersecurity engineers are responsible for safeguarding vulnerable data and increasing the organization’s security profile.

Finally, to ensure regulatory compliance and confirm that the solution created in development is viable, engineers must perform rigorous testing, including designing and executing test plans, performing system verification and validation, and conducting risk assessments.

At every step, engineers work closely with healthcare professionals, following the Federal government’s standards. By working together to design, develop, and test, engineers are involved at each stage so that solutions are safe, effective, and able to fulfill the ever-changing needs of Health IT and biomedical research. With the advent of advanced machine learning and generative artificial intelligence models, LCG is testing and incorporating emerging technologies to modernize, integrate, and create whatever our clients need to embrace innovation’s future.

Srinivas Kothuri is Vice President, Innovation and Digital Engineering Services at LCG.

Modernizing Legacy Data: AudBase

Building customer-focused solutions isn’t just a buzzword at successful companies – it’s part of the corporate DNA. It starts at the top with strong leadership vision, and it multiplies across departments as employees break down silos to serve the company and customer missions.

A recent example comes from our client, the National Institute on Deafness and Other Communication Disorders (NIDCD) – one of 27 Institutes and Centers at NIH – where Ravi Kosuri of LCG recently received the Director’s Award for his work on NIDCD’s Audiology project.

NIDCD’s Audiology project was a small part of a very large project the LCG team was working on in 2022.

The larger project had an extensive project management plan with timelines, tasks, and deliverables identified upfront with the customer. An immediate need was identified to provide NIDCD Audiology staff with a way to read and process data from the AudBase system.

The team’s approach was quickly shifted to progressive elaboration by LCG’s project manager, Ashley Stanton. Progressive elaboration is an iterative Agile project management technique that allows for a project plan to evolve as information is gathered. This approach, as well as its short turnaround time, made the AudBase project very unusual from the start.

Even when the pressure was on to rapidly turn around the new project, time had to be spent in the early days determining the foundational nature of the challenge. Since the service delivery team was deeply embedded in the project, they worked with customer stakeholders to understand their needs. Multiple technical options and their impacts were researched, with cost and risk analyses, so the team could recommend a path forward. Ravi worked collaboratively with the developer, third-party vendors, and government employees to implement a cloud-based database solution that met NIDCD’s needs on time and under budget.

By conceptualizing the best solution and accomplishing the client’s goals and objectives on the tight timeline, Ravi’s team successfully delivered:

  • Web service interface (Azure-hosted API interface with AudBase)
  • Data transformation engine (JSON parser)
  • Cloud-based relational database
  • Accessible interface for third-party applications (such as RedCap, QlikSense, or Power BI)
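
To illustrate the data transformation step, the sketch below flattens a nested JSON export into rows in a relational table. The field names are hypothetical stand-ins for the actual AudBase export format, and SQLite stands in for the cloud-based database.

```python
"""Illustrative JSON-to-relational transformation for audiology-style records."""
import json
import sqlite3  # stands in here for the cloud-based relational database

# Hypothetical export shape; the real AudBase schema will differ.
raw = json.loads("""
[
  {"patient_id": "P001", "test_date": "2022-06-01",
   "thresholds": [{"ear": "left",  "freq_hz": 1000, "db_hl": 20},
                  {"ear": "right", "freq_hz": 1000, "db_hl": 25}]}
]
""")

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE audiogram (
    patient_id TEXT, test_date TEXT, ear TEXT, freq_hz INTEGER, db_hl INTEGER)""")

# Flatten nested threshold readings into one row per measurement.
for record in raw:
    for t in record["thresholds"]:
        conn.execute("INSERT INTO audiogram VALUES (?, ?, ?, ?, ?)",
                     (record["patient_id"], record["test_date"], t["ear"], t["freq_hz"], t["db_hl"]))

for row in conn.execute("SELECT * FROM audiogram"):
    print(row)  # ready for downstream tools such as RedCap, QlikSense, or Power BI
```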

Not only did Ravi receive the NIH Director’s Award, but the entire LCG team was recognized by the Center for Information Technology (CIT) at the National Institutes of Health (NIH) as one of the pioneers in using the NIH Azure Science and Technology Research Infrastructure for Discovery, Experimentation, and Sustainability (STRIDES) Initiative.

The end result is a new concept and technique for conducting and supporting biomedical and behavioral research and research training in the normal and disordered processes of hearing, balance, smell, taste, voice, speech, and language.

Ravi Kosuri is LCG’s principal architect supporting the life science advancements in biomedical research at NIDCD.

ADVISORY: New NIH Data Management and Sharing Policy Affects CIOs

A new NIH Data Management & Sharing Policy will go into effect January 25, 2023. While the focus is on the new responsibilities of researchers, the policy will have an impact on NIH Chief Information Officers (CIOs). Here is what CIOs need to know.

First, the policy applies to “all research, funded or conducted in whole or in part by NIH, that results in the generation of scientific data.”1 Under the policy, scientific data must be of sufficient quality to validate and replicate research findings, regardless of whether the data are used to support scholarly publications.

Investigators and institutions are required to:

  • Plan and budget for managing and sharing scientific data generated by NIH-funded research.
  • Prepare and submit a Data Management and Sharing (DMS) Plan during the funding application process.
  • Comply with the approved DMS Plan.

For CIOs, the policy triggers new procedures and oversight on an Institute or Center (IC) basis for both extramural and intramural research. CIOs should expect new queries from researchers about available repositories for data sharing. In addition, proactive communication about access and security considerations will prevent future challenges.

Data Management and Sharing Plans

Each NIH Intramural Research Program (IRP) determines the procedures for submitting and managing DMS Plans. CIOs should engage with the appropriate IRP offices to advise and assist.

CIOs supporting extramural research grant application and funding systems will need to add associated features and functionality to capture and store DMS Plans.

Supplemental policy guidance provides details about expected contents for DMS Plans and recommends a length of two pages or less.2 Exhibit 1 summarizes the DMS Plan content expectations.

Exhibit 1. Summary of Data Management and Sharing Plan Content Elements

  • Data Type: Summarize the data types and amount; may describe data modality, level of aggregation, and degree of data processing. Describe the scientific data planned for preservation and sharing, with the reasoning behind ethical, legal, and technical factor decisions. Briefly list the metadata, other relevant data, and associated documentation planned to facilitate interpretation of the scientific data.
  • Related Tools, Software, and/or Code: Identify any specialized tools needed to access or manipulate the shared scientific data for replication or reuse.
  • Standards: Describe the standards planned for application to the scientific data and metadata, for example, data formats, data dictionaries, data identifiers, definitions, unique identifiers, and other data documentation.
  • Data Preservation, Access, and Associated Timelines: Describe the plan and timeline, including the name of the repository identified for archiving the scientific data and metadata, how the scientific data will be findable and identifiable, and when, and for how long, the scientific data will be available.
  • Access, Distribution, or Reuse Considerations: Identify applicable considerations for subsequent access, distribution, or reuse related to informed consent, privacy and confidentiality protections, control of data derived from humans, regulatory or policy restrictions, and other potential limitations.
  • Oversight of Data Management and Sharing: Identify institutional oversight roles and responsibilities for monitoring and managing compliance with the documented plan.

References

1. Research Covered Under the Data Management & Sharing Policy | Data Sharing. Accessed October 18, 2022. https://sharing.nih.gov/data-management-and-sharing-policy/about-data-management-and-sharing-policy/research-covered-under-the-data-management-sharing-policy
