A talk by Adam Bradley to the CMD community.
Slow analytics – allowing space for human reflection, contemplation, exploration and discovery.
BIG DATA != DEEP MEANING
Data as an opportunity for interaction.
Factual vs. Rhetorical data/knowing
Metaphors and motivation – what it MEANS
Gap between objective and subjective – experiential knowledge
How do we avoid losing sight of the ART? Distant reading → too much data, not art.
How to retain the experience of “letting it wash over you”?
Instrumental (DATA) vs. Anthropological (INTERPRETATION/EXPERIENCE)
Rational agent vs. “people are messy” – unmindful application of algorithms strips the human from a thing, when they could instead be used to enhance it.
Tool development as a critical act – if the possibilities for what can be understood are shaped by the tools that are available, the development of the tool becomes much more important. Metaphors employed? What is measured? What is stored/processed/visualized?
Highly idiosyncratic interaction – personal ways of working that would be prevented by imposing systematic processes. BIG DATA KILLS HUMANITY
“Getting it wrong” is an important part of the sense-making process.
Data approach – Start by presenting all data from a set, without context. OVERLOADING. CHAOS.
Workflow approach – the researcher starts with their own interaction with the dataset, augmented by contextual relations from the rest of the data and from other sources (sketched below).
Slow analytics – slow, iterative, methodical.
Presenting ALL of the facts can HINDER SENSE-MAKING!
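A minimal sketch of the contrast, assuming a toy dataset; the names, the snippets of text, and the scoring rule below are hypothetical stand-ins, not the speaker's tool. The data approach dumps everything without context, while the workflow approach starts from a phrase the researcher has marked and surfaces only contextually related material.

```python
# A minimal sketch (not the speaker's tool); the dataset and helpers are hypothetical.

POEMS = {
    "p1": "the sea returns each evening to the shore",
    "p2": "a gull above the harbour, grey on grey",
    "p3": "the ledger of the harbour master, kept in ink",
}

def data_approach(dataset):
    """Present all of the data at once, with no context: overload."""
    return list(dataset.values())

def workflow_approach(annotation, dataset, top_n=2):
    """Start from the researcher's own annotation and surface only the
    contextually related material from the rest of the data."""
    focus = set(annotation.lower().split())
    scored = []
    for key, text in dataset.items():
        overlap = len(focus & set(text.lower().split()))
        if overlap:
            scored.append((overlap, key, text))
    scored.sort(reverse=True)  # most shared vocabulary first
    return [(key, text) for _, key, text in scored[:top_n]]

if __name__ == "__main__":
    print(len(data_approach(POEMS)), "items dumped without context")
    # The researcher marks a phrase that interests them; the tool follows their lead.
    for key, text in workflow_approach("the harbour at evening", POEMS):
        print(key, "->", text)
```

The point is only that the second function follows the reader's lead rather than the dataset's size: slow, iterative, one annotation at a time.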
This is a case study, but a case study of what? of the author? publisher? editor? reader? something else? or is it all of them?
Close reading of poetry, with scaffolding from software that automates visualization of relationships and fetches supplementary information as identified by idiosyncratic annotation.
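A minimal sketch of that kind of scaffolding, assuming a plain-text poem and a local glossary; the poem, the glossary, and the helper name are hypothetical, and a real tool would render the relationships visually and query outside sources rather than a dict.

```python
# A minimal sketch of annotation-driven scaffolding; the poem, glossary, and
# helper name are hypothetical, and a real tool would draw the relationships
# and query external sources instead of a local dict.

POEM = [
    "The sea returns each evening to the shore,",
    "and evening takes the colour of the sea.",
]

GLOSSARY = {"evening": "the close of day; here also a recurring refrain"}

def annotate(term, poem, glossary):
    """For a term the reader has marked, list the lines where it recurs
    (a crude stand-in for a relationship visualization) and fetch any
    supplementary note that is available."""
    hits = [(i + 1, line) for i, line in enumerate(poem)
            if term.lower() in line.lower()]
    note = glossary.get(term.lower(), "no supplementary note found")
    return hits, note

if __name__ == "__main__":
    hits, note = annotate("evening", POEM, GLOSSARY)
    for lineno, line in hits:
        print(f"line {lineno}: {line}")
    print("note:", note)
```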
This is a wonderful articulation of what I’ve been thinking about doing for performances!
How to support/enhance the deeply human and personal experience of being in a work and interpreting it? (rather than just crunching it algorithmically)