Duke University is working to integrate generative AI tools while safeguarding sensitive data. Over the past year, faculty, staff, and students have adopted generative AI, raising questions about how data is handled and who can see it. To enable secure experimentation, Duke’s Office of Information Technology collaborated with IT Security, legal, and procurement teams to build privacy protections into AI systems from the start, rather than leaving individuals to assess privacy risks on their own. Central to this effort is DukeGPT, an institutional AI platform that complies with Duke’s data stewardship policies, alongside a negotiated educational license for ChatGPT Edu that keeps user data secure. Internal access rules add further protection, making individual interactions viewable only under defined circumstances. Together, these measures support responsible AI exploration, with the underlying message that strong governance enables confident innovation. For details on the AI tools available at Duke, visit the Office of Information Technology website.