Do the ‘Cite’ Thing, Part II

In an earlier post, I surveyed differing approaches to sourcing, transparency, and ‘information integrity’ across generative AI platforms. The underlying questions there concerned how AI tools handle sourcing and citation within their own algorithmically generated output. A second, distinct but related challenge involves developing new norms of transparency and disclosure around human use of AI tools in human-authored work, which is to say our own work and the work of our students.

Do the ‘Cite’ Thing: Collisions Between Humans, AI Chatbots, and Citation

Several recent conversations with colleagues at NMC have me mulling over emergent challenges at the intersection of Generative AI and academic practices regarding sourcing and citation. Among these challenges, two seem most prominent. Both involve preserving the integrity of a chain of information—what came from where, or who contributed what—but in slightly different ways.