Federico and John pick two apps and discuss how and why they use them for their work. For the first installment of Pick 2, Federico covers Ulysses and how he’s used it over the past year and for the iOS 11 review he is currently writing; John explains how he uses FullContact to keep in touch with developers and sponsors of MacStories and AppStories.
Relay FM co-founder Stephen Hackett discusses his origin story, including the personal challenges he first encountered when he went independent.
Hicksdesign makes things you see on screens, such as icons, interfaces, and identities.
Piece by piece, the world is moving onto the web. "Things informationalize," as Stamen advisor Ben Cerveny puts it. How can we make sense of this new torrent of information emerging wide-eyed and blinking into the internet? Stamen’s Michal Migurski will show how information visualization is making it possible to comprehend a live, vast, and deep connected web of data, with a special focus on interactive and geographic work.
Stamen partner Michal Migurski leads the technical and research aspects of Stamen’s work, moving comfortably from active participation in Stamen’s design process, designing data, prototyping applications, to creating the dynamic projects that Stamen delivers to clients.
Ben Cerveny is a strategic and conceptual advisor to Stamen, helping to articulate an approach toward creative visualization and to evaluate and develop potential partners and engagements relative to that vision.
"SEVENEVES at The Interval reading and signing": A special daytime talk by celebrated speculative fiction authorâ¨ Neal Stephenson on the occassion of his just released novel "SEVENEVES". After a reading, Long Now co-founder Stewart Brand joins Neal to discuss the research and writing of the new book, plus a little bit about what is coming next.
Aleks Krotoski asks if we are haunted by our technology, or are we haunting it?
So much of our experience of technology can feel a bit like being haunted. It starts like any good ghost story with the just mildly unsettling; things aren’t where you left them or seem to have moved on their own within our devices. It’s a creepy feeling that leaves you unsure about what to believe. Our understanding of how much of technology works is so limited that when it starts to behave out of the ordinary we have no explanation. This is when we start to make very peculiar judgements; "why did you do that?" we plead, as if some hidden force were at work.
For some, these feelings of being haunted by our technology can develop into full-blown apparitions; keen gamers frequently experience Game Transfer Phenomena, where they literally see images of their gameplay in the real world, an involuntary augmented reality. While the hallucinations aren’t necessarily distressing in themselves, the experiences can leave individuals questioning their sanity.
The coming Internet of Things will bring problems of its own; smart locks that mysteriously open by themselves, for example, as if under the influence of some poltergeist. Aleks herself has had the experience of digital ‘gaslighting’ (a term drawn from an Ingrid Bergman movie about a woman being driven mad by her husband) when her partner logged on to their home automation system remotely and started to mess with the lights while Aleks was home alone. As one commentator puts it, in a reworking of the old Arthur C. Clarke quote, "any sufficiently advanced hacking is indistinguishable from haunting."
And as our devices and appliances increasingly start talking to each other, bypassing us altogether, who’s to say we, like Nicole Kidman’s character in The Others, haven’t become the ghost in the machine?
We know we can’t design meaning directly, but as IAs we can certainly facilitate it. This session delves into theory to uncover the nature of meaning so that we may recognize new structural properties of information that not just facilitate the emergence of meaning, but maintain its evolving thread in networks, cross-channel ecosystems, the Internet of Things, and other complex contexts.
Embodied cognition – the notion that meaning is not a clean, logical process inside the brain, but emerges as we act with information (physical and digital) in the environment – is offered by many IA thinkers to inform our work. But, we haven’t yet fully characterized the nature of meaning. We’ll do that in this session. We’ll visualize the way meaning emerges and evolves as an information-behavior coupling, and how this implies that meaning is not a static recognition, but a flow. Flows have properties like texture, viscosity, permeability. As IA practitioners, we may use structure to modify these properties. We’ll see that the phase-space of IA affects viscosity, facets of linguistic and perceptual information affect texture, and understanding factors affect permeability.
Recognizing these properties of meaning, and the IA structures that stand to modify them, we may make deliberate design choices in our projects.
Visual breakdown of the nature of meaning as a flow.
How to recognize new structural properties of IA that help us facilitate the flow of meaning: the phase-space of IA to modify viscosity, facets of linguistic and perceptual information to modify texture, and understanding factors to modify permeability.
How to use these new structural properties as design tools in our projects.
What We Mean by Meaning: New Structural Properties of IA [ 40:05 ]
Slides: What we mean by meaning: new structural properties of information architecture (IAS15), from Marsha Haverty
Brought to you by
The 2015 IA Summit podcasts were recorded and produced by the fantastic team at UIE. UIE is a research and training company that brings you the latest thinking from the top experts in the world of User Experience Design. UIE’s virtual seminars allow you to get your hands on that information, to absorb as much as you can, on your schedule. Of course, you can keep up with all the shenanigans by signing up for UIE’s free newsletter, UIEtips.
Marsha Haverty: I want to talk about how I came to Information Architecture. I started in grad school being exposed to some of these concepts.
Vannevar Bush and Doug Engelbart talking about augmenting intellect and mechanizing associativity, hypertext theory from Ted Nelson and others, information visualization, information-seeking behavior — this amalgam of concepts let me understand for the first time that you can think about information abstracted from subject matter. Information can have structure, it can have perceptual properties, and it can have behavior. With that understanding, I just needed to do something with information.
I got a job as an information architect. I went to the first Information Architecture Summit, and I still return to some of the things that Lou Rosenfeld talked about, because they’re still so prescient.
I went to the next Information Architecture Summit and contributed a paper to the IA Special Topic Issue from JASIST. I was really interested in information architecture as a new field without its own internal body of theory. What did that mean? What was that like?
Eleven years flash before my eyes. Then I finally came back to the summit in 2013 to find information architecture described by Jorge Arango as the structural integrity of meaning across contexts, and to encounter this thing called "embodied cognition," which I had never heard of, with Andrew Hinton [indecipherable 0:01:36] .
Last year, I brought a poster on some data visualization techniques, and here we are. Now I’m at Autodesk, and I help mechanical designers collaborate around 3D geometry for product design.
If we say that information architecture worries about the structural integrity of meaning across context, the spirit of this talk is to really take a deep dive and zoom in on the nature of information and the nature of meaning, but it matters how we look at it.
For this talk, we’re going to look through the lens of embodied cognition. Traditional cognition says that our senses detect stuff, that stuff gets sent to our brain, and our brains do all of the work of deriving meaning and understanding.
Embodied cognition says that it’s actually our bodies acting out in the world directly with information that participates in understanding and deriving meaning. It’s expanding the boundary of where meaning emerges from just the brain to the brain and the body acting out in the world.
The spirit of this talk is still to say, if we look through the lens of embodied cognition, we can actually see new structural properties of information architecture that we can’t see if we just look through traditional cognition.
Before we get to the new structural properties of information architecture, we’re going to zoom in, and try to visualize the nature of meaning, through the lens of embodied cognition. To do this, we’re going to build the scene. For the scene, we’re going to need the sun. We’ll stylize that, and move it up there. We’ll add to our scene a tree in nature and a built chair.
The sun radiates light down on all the things. All of the things absorb and reflect the light. We end up with what we know as the ambient light around us. James J. Gibson, the founder of ecological psychology in the 1960s, calls this the ambient energy array.
From this ambient energy array, he says, "We pick up surfaces, edges and textures. It’s not these things themselves, because the light shifts. We move around objects. Objects move around us." It’s the relationships among surfaces, edges and textures that we’re picking up from this ambient energy array. This comes in the form of invariant structure.
JJ Gibson says, "Invariant structure is information. That’s what information is." By way of an example, we all know that we don’t need to have seen a chair from every possible perspective to recognize it’s the same chair. We recognize invariances in the relationships among the edges, surfaces, and the textures, that make up the seat, legs and the back of the chair.
Back to our scene: let’s add an actor-observer. We’ve already seen that objects radiate information in the form of invariant structure, relationships among surfaces, edges, and textures. The actor-observer brings to this her goals, her actions. It’s this confluence of goals, actions, and information where meaning emerges.
Meaning emerges in this confluence. To roll that up into a definition: meaning emerges at the confluence of a goal-directed actor-observer engaging with information directly in the environment. Back to our scene.
Information about objects is one type of information. The way objects are arrayed in a layout gives us visual information about wayfinding and place.
We’re also instrumented for mechanical information, touch, cold, [indecipherable 0:05:48] pain, and chemical information, tastes and scent. All of these are perceptual information, and there are others, too. We’re also instrumented for linguistic information, so words on a surface. These days, they can be a physical or digital overlay, words through the air that can be spoken or projected.
Gestures, we can evoke concepts with our motions, and even our own introspection. All of these things contribute to the environment of information that we encounter.
We’ve talked about where meaning emerges. Let’s zoom in and talk about what this confluence is. What is this thing? For perceptual information, we can say that the invariant structure is actually affordances, these things that we are able to coordinate our behavior to, to engage with this information.
From embodied cognition, this is called a perception-action coupling. An example of a perception-action coupling that’s often cited in this field is the outfielder problem. How does a baseball outfielder know where to go to catch a ball? You could think that the outfielder looks at the initial trajectory of the hit, makes some calculation, knows where to go stand, goes there, and catches the ball.
That’s not what happens. What happens is the outfielder forms a perception-action coupling with an angle relationship to the ball. If the ball is coming straight at the outfielder, and then if the ball starts to veer left or veer right, and suddenly there’s a horizontal angle, the outfielder just needs to see that there’s a horizontal angle, and move to eliminate the angle.
It’s a very simple geometric relationship. This perception-action coupling is dialed in to eliminate horizontal angles. There’s a component for vertical, too, so I’m simplifying. But you get the point: in order to catch a ball, it’s just a series of course corrections to maintain this angle relationship. That’s why outfielders appear to drift, and they end up in the right spot to catch the ball. They’re just participating in this coupling and maintaining it over time.
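The angle-nulling strategy described above can be sketched as a tiny simulation. The talk only describes the strategy; the function names, step sizes, and 1-D lateral simplification here are illustrative assumptions, not an implementation from the talk.

```python
# Hypothetical sketch of the outfielder's angle-nulling strategy:
# rather than computing the ball's landing spot in advance, the fielder
# repeatedly moves sideways just enough to cancel the horizontal offset
# (a 1-D stand-in for the "horizontal angle" to the ball).

def step_toward_zero_angle(fielder_x, ball_x, speed=1.0):
    """One course correction: move toward the ball's lateral position."""
    offset = ball_x - fielder_x          # proxy for the horizontal angle
    if abs(offset) <= speed:             # close enough: angle eliminated
        return ball_x
    return fielder_x + speed * (1 if offset > 0 else -1)

def chase(ball_path, fielder_x=0.0, speed=1.5):
    """A series of course corrections, one per observed ball position."""
    for ball_x in ball_path:
        fielder_x = step_toward_zero_angle(fielder_x, ball_x, speed)
    return fielder_x

# The ball veers steadily to the right; the fielder drifts with it,
# never predicting, only correcting.
path = [0.0, 1.0, 2.5, 4.0, 5.0]
print(chase(path))  # 5.0 — the fielder ends on the ball's lateral line
```

Note that the fielder never solves for a landing point; maintaining the coupling step by step is what produces the characteristic "drift" into the right spot.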
Language. Light does not reflect off of concepts. What is that thing, that handle, that lets us interact with the meaning of concepts? We don’t have a name for that. I asked all the embodied cognitive psychologists. We don’t have a name for that, [laughs] unfortunately. What do we call this coupling? We don’t know yet.
That would be interesting, to see how we formulate that. We can participate in that conversation. Sabrina Golonka, who is an embodied cognitive psychologist, rolls up both the coupling we form with perceptual information and the one we form with language into an information-behavior coupling.
An information-behavior coupling is how and where meaning emerges in this view. But things aren’t staying the same. Our goals, our actions, and the information are changing over time. If we’re going to maintain this coupling, as our goals change, the information has to change. As the information changes, our goals need to change. It’s this co-evolution, if we’re going to maintain this meaning coupling.
Further, we could even say that human cognition is a state space of information-behavior couplings that form, break, and co-evolve with our goal-directed action and environment dynamics. We’re not just engaging with one coupling, and that’s our entire life. We’re forming and breaking them all the time. Some are little point engagements, and others need to last a lot longer.
What does all this flux and co-evolution say about the nature of meaning? The nature of meaning, then, is flow. Flows have properties. Flows can have a viscosity, or ease of flow. Flows can have texture, or facets to the components that make up the content of the flow. Flows are subject to permeability in what they might be passing through.
I’m going to talk about different ways we can use new information architecture structures, to dial in these properties, to suit the type of flow of meaning that we’re looking for. We’re going to start with viscosity, or the ease of flow of meaning. I want to introduce a new information architecture construct, to account for viscosity.
In order to do this, we’re going to ask the question, "Is information like water?" If we think of the two things that affect water, it’s pressure and temperature. If we run through all the permutations of different combinations of pressure and temperature, we end up with the familiar phase states of water: solid, liquid, and gas. We can phase shift. We can melt a solid to a liquid. We can freeze a liquid to a solid, and so forth. It’s drastically different to know water if you’re in the neighborhood where it’s a solid, versus a liquid or a gas.
What I wanted to do is ask the question: what if we do the same thing with different relative amounts of linguistic and perceptual information? What do we get? When we think about perceptual information, we don’t have to actively pay attention to interact with it, to glean the meaning from it. It’s a [indecipherable 0:11:32] , reflexive kind of engagement. In that sense, perception flows easily. It has very low viscosity.
Language, though, is laden with awareness and associativity, and it requires our attention to engage with it. We have to be aware when engaging with it. Language is highly viscous, and it takes more work to flow.
Right there, we have a little rough understanding of the areas where there’s more perceptual information. It’s more of a reflexive style of engaging meaning. The areas where it’s more language dominated, it’s a more attentive style of engaging with meaning.
We can refine these areas a little bit further, to say, "Let’s look at that region where there’s basically little or no language, and it’s all perceptual information." That is a very visceral way to engage with meaning in the world. Similarly, in the area where we have all language and very little perception, I kept a little channel along the axis.
If we have no perception, I don’t think we’re around. [laughs] We have a little buffer there. This is a very conceptual area. We’re wafting around in the world of ideas.
As we get more and more language, and we’ll look at this in more detail, it requires more intense concentration. Similarly, as we have more perceptual information coming at us, it requires more intense coordination.
If we have a lot of information, it triggers an emotional response. I’m not saying that this is the only place emotion occurs. We can have emotion associated with anything on the phase space. If you end up with a lot of information, you do end up with associated emotional response. If we get too much, we get overloaded, and we can’t function.
Let’s look at Twitter, before they introduced the Image Preview. Before Image Preview, I would say Twitter was pretty high up in the phase space over in the linguistic dominated area, even into some intense concentration. Breaking that down a little bit, let’s think about the goals. Anyone can have a variety of goals for going to Twitter. Maybe it’s intrigue.
You want to come across an article, idea, or picture that you would never see in any other way. Maybe it’s humor, maybe it’s to sample some discourse of news, or check on what your friends or your peers are talking about, any variety of reasons. The nature of the information on Twitter before the Image Preview was dominated by language. It was a stream of words.
We did have some perceptual information in the form of avatars. Those were in a neat, orderly column on the side. They were always in the same place. They were glanceable if we wanted context. Our primary mode of engaging with the meaning of Twitter was scanning. This is different than reading a book, where a book is linear, and you say, "And then this happens, and then this happens, and then this happens."
Twitter is semantic juxtaposition, with concept hopping. It requires a little bit more attention to focus and scan, jumping around those concepts. We got really good at that. We had strategies for that. We adjusted our behavior to that. That is how we knew the meaning of Twitter. It was a very highly viscous, concentration-intensive behavior.
When Twitter introduced inline images (I realized I forgot to put cats in there), we ended up with these perceptual swaths interrupting our concentration-laden scanning. Suddenly, we had to adjust. We either had to visually jump over and ignore the pictures, or mode switch between reading and conceptually hopping, and then glance and glean, and then go back.
We’re all perfectly good at dealing with text and pictures at the same time. We probably don’t even notice it anymore. It fundamentally changed. It phase shifted, what it’s like to engage with meaning on Twitter. That happened.
This is a visual archive of five years of an online magazine called "Infosthetics." Moritz Stefaner created this. What it is is all of the issues categorized by color. Starting at the top, you have the most recent issues. It goes down to the early issues at the bottom.
What this thing lets you do is select a category, say architecture. At a glance, you get a visual understanding of the distribution of the concept of architecture across five years of this publication. You can see there are some clusters down below, some little sparse areas, and then a few clusters near the top.
You can interact with it and get more about the articles themselves. Something like the Infosthetics visual archive would be more down in the perceptual-dominated region. This is a type of information-behavior coupling that uses conceptual anchors for a primarily reflexive gleaning of the distribution of categories across time.
In the movie "Her," we have a protagonist, Theodore, who forms an entire relationship based solely on words projected in his ear. He forms a complete relationship made of language. We would plot that way up in the corner, dominated by language with very little perception.
Shortly after that movie came out, Ben Shneiderman did a write-up. One of the things that he said is, "The future of computing will be more visual than verbal. Voice is important for human relationships, but can’t keep up with the human mind’s desire for information abundance and swift decisions." He wrote the "Information Visualization" book. That’s the way he thinks.
I wanted to inject that thinking when we consider our projects. Our design projects likely fall in this general area. I’m sure there are lots of exceptions. Primarily, we have a lot of language. We deal with language. We deal with words. We do have some perceptual information in the form of how we lay out our information on a page. We give perceptual cues to navigation and to functions. We worry about [indecipherable 0:18:01] structure for wayfinding.
Our designs end up getting phase shifted for us, because of the surrounding ecosystem. We have these pervasive digital overlays. We end up phase shifting, into a much more intense, sometimes emotion-laden type of engagement. Pervasive digital overlays in the environment phase shift our designs to more extreme modes. That’s happening. That will only get worse over time.
To mitigate this, some designs use perceptual cues to give information about system states. Some of the displays for sensors and the Internet of Things have their glowy colors and fuzzy sounds and all of that. Often, those perceptual cues are just notifications for state changes. Once we go to engage with it, we’re phase shifted back to language. We’re changing our viscosity. We’re changing our mode of interacting with that.
I want to describe an example of fitting the nature of the information-behavior coupling to the context. This designer prototyped a way of designing a car dashboard control-panel UI. Instead of a design where interface elements and labels are already laid out, it’s just a screen.
The driver puts some fingers on the screen, and the interface comes to that position. Depending on the number of fingers that touch the screen, you control something different. Two fingers control the radio, three fingers control the heater, and whatever. He’s got other ways to go beyond five.
All you need to do, then, is drag your fingers up or down to make the adjustment. You don’t have to be exactly up or exactly down. As long as you’re roughly up or down, it makes the adjustment for you.
This is a perception action coupling that stays perceptual the whole time. You can glance over, if you like, but it’s not really necessary to make this control. In this case, drivers are distracted. They’re trying to not hit a bicyclist. They’re trying to way find. They’re probably talking to a passenger. A perceptual information engagement suits the needs and the situation of a distracted driver.
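The interaction scheme described above can be sketched in a few lines. The two-finger/three-finger mapping and the "roughly up or down" forgiveness come from the talk; the control names, step size, and the specific tolerance check are illustrative assumptions.

```python
# Hypothetical sketch of the finger-count dashboard control scheme:
# the number of fingers selects a control, and a coarse vertical drag
# adjusts it. The gesture only needs to be *roughly* vertical, so the
# tolerance for a distracted driver's imprecise motion is wide.

CONTROLS = {2: "radio_volume", 3: "heater_temp"}  # assumed mapping per the talk

def adjust(finger_count, drag_dx, drag_dy, state):
    """Apply one coarse up/down adjustment to the selected control."""
    control = CONTROLS.get(finger_count)
    if control is None:
        return state                      # unrecognized finger count: no-op
    if abs(drag_dy) <= abs(drag_dx):      # not even roughly vertical: ignore
        return state
    direction = -1 if drag_dy > 0 else 1  # screen y grows downward
    new_state = dict(state)
    new_state[control] += direction
    return new_state

state = {"radio_volume": 5, "heater_temp": 20}
# Two fingers, a sloppy but mostly upward drag: radio volume goes up,
# no glance at the screen required.
state = adjust(2, drag_dx=10, drag_dy=-40, state=state)
print(state["radio_volume"])  # 6
```

The design choice worth noticing is that the check is directional, not positional: nothing depends on where the fingers land, only on the count and the rough drag direction.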
When we think about the phase space of information, we need to consider the entire phase space in our designs. We need to note two phase space locations, the design itself and the design in the greater information ecosystem. We need to decide when to use the higher viscosity of language. It may be fully appropriate.
We’re a language-dominated society. That’s never going away, obviously. We need to decide when to offload some meaning to perception.
The next property I want to talk about is texture facets. These are facets of perceptual and linguistic information that we can use to further tune what it’s like to engage with the nature of the information in our designs.
Let’s start with linguistic texture facets, and let’s look at some of the stuff we normally consider — controlled vocabulary, facet classification, taxonomy, ontology, content strategy. Let’s think about these in terms of phase space.
We could say controlled vocabulary would go in this highly language-dominated region of the phase space and require a bit of concentration to really dial in what we’re looking for for the terms. Taxonomy infuses a little bit of perceptual quality if we have some visual groupings and some hierarchical or even non-hierarchical structure.
Faceted classification is essentially a berry-picking journey, in the Marcia Bates sense, all around the linguistic-dominated area of the phase space by using the different facets to get around.
Content strategy, if we think about that in terms of adaptive content that changes its nature for viewport size or other contextual triggers, is really this harmonious balance and intermapping across this perceptual-language balance region of the phase space. There’s a lot of visual, perceptual…Even if we’re just moving the words around, there’s a lot of perceptual component to that.
Ontology is an information behavior coupling. Actually, it’s a series of information behavior couplings where the invariant structure, the information in this coupling, is the relationships among the conceptual entities in the ontology. It’s that relationship that forms the meaning.
Another aspect of linguistic information that we can dial is where our concepts fall on the spectrum from abstract to concrete. Concrete concepts have a physical referent. They are spatially constrained, and it’s easy to visualize contexts for them.
A spoon. You can see a spoon. You can easily think about a spoon in a drawer, in a bowl, stirring in a pot. A spoon is much lower in the linguistic area of the phase space, because it’s very easy to contextualize. It has a lower viscosity to engage with the meaning of spoon.
Whereas, something like calculus is much more abstract, and it takes more concentration. It has a higher viscosity to really engage with the meaning of calculus.
Vannevar Bush, in the 1940s, built a machine called the differential analyzer. This machine actually used physical relationships of gears, and levers, and mechanical movements to do calculus.
It’s said that those who used the analyzer acquired what Bush called a mechanical calculus, or an internalized knowledge of the machine like a combination of motor memory and mathematical skill learned directly from the machine. Bush described how one user did not understand calculus in any formal sense. He understood the fundamentals. He had it under his skin.
This machine phase shifted calculus from this abstract, highly viscous entity that we had to somehow engage with the meaning of down to this very visceral thing that we understood the principles about without even having to say words about it.
Other ways to phase shift some more abstract concepts are metaphor — there’s a lot written about that — and context priming. It’s said that concepts are abstract not in and of themselves, but they’re abstract, in part, because it’s hard for us to think of context in which to place them.
If we help prime people with context for abstract concepts, we actually phase shift them down to be a little more concrete for them.
Moving to perceptual texture facets of objects. There are all sorts of things we can do to dial in edges, surfaces, and textures. I’m not saying we need to be visual designers or information visualization specialists.
Recognizing that these are the things we are tuned to pay attention to and detect, we can ask the question: do we need to adjust surface, edge, and texture qualities to show our information objects are fixed versus movable, overlapped versus fused? Some of these basic things are how we know our world.
When we get into the concept of way-finding and place, we have a wealth of information about that. Past information architecture summits, all of the information architecture books. We’ve got a fantastic tome of knowledge around that stuff.
When we think about the other aspect of perceptual texture facets, from the ecological psychology point of view, there are events. Objects have locomotion. They move. They have physical transformations, and there’s occlusion. Things overlap each other, and they get hidden and then revealed again.
We can add to that just the concept of having an event for information architecture. David Kirsh talks about the benefits of external representations.
He says, "Once we create an object around something, we can now do stuff with it. Meaningful stuff. We can do rearrangement. We can make a whole bunch of the same one and try different things on each. We can explore alternatives with multiple instances."
Karl Fast has an entire framework about epistemic interactions. These are events we also do to representations. Things like chunking, and cloning, and collecting, composing, cutting, fragmenting, probing, rearranging, and repicturing. He has another offering of events that are meaningful events that we can do to information.
What this is saying is that we can use the materiality of diagrams (I’m rolling up a whole bunch of things in the word "diagrams": physical representations, models, all of that) to enact and maintain meaningful events.
In general about texture facets, we can use texture facets like design dials to tune what it’s like to engage with the information in our designs.
The last principle I want to talk about is permeability. What obstructs our ability to engage and maintain the flow of meaning? If we look at the information-behavior coupling, there may be some things that obstruct this, and there are different ways to go about looking at this.
We can say that things have high permeability, so there’s no obstruction and everything is fine. We’re able to engage with the invariant structure, and we have the meaning going, and everything is fine. All the way down to low or no permeability where it’s fully obstructed.
Embodied cognition theory also gives us some principles around this that we can port over to information architecture. A lot of these are very similar things that we already look for, we already recognize, we already test for. It’s just porting them to the language of embodied cognition.
Detecting structure. If I am an actor-observer and I want to engage with information in the environment, if I don’t recognize that structure as something I can engage with, we can’t form a coupling. That’s also a spectrum — "I can’t recognize it at all," all the way up to "I can sort of recognize it," and "It’s tacit, and I’m ready to go."
Coordinating behavior. Even if we recognize the structure that we want to interact with or engage with, if we aren’t able to coordinate our behavior to that, we can’t form the information behavior coupling, either. We get progressively better at that with time and practice.
For example, if we go back to the outfielder problem: the first time a person catches a fly ball, they would be really lucky if they actually caught it, especially if it was hit by somebody batting number four.
Structural persistence is another issue. Sometimes we engage with information in just a brief moment, and then we move on to something else. Other times, we need that information to be there to maintain the coupling. If the information is transient or disappears, then that’s another factor that breaks that meaning.
With the outfielder, the outfielder is forming this perception-action coupling with angle relationships to the ball. If the sun passes in front of his eyes and he’s suddenly blinded, that coupling is broken. Who knows where the ball’s going to end up.
I want to introduce another aspect of permeability that I'm calling tolerance: how precise must our behavior be to maintain the coupling? If we have a wide tolerance for our information behavior couplings, then our behavior can move around a bit, the information might shift a bit, and we can still maintain the meaning. We can still do what we need to do and get the work done.
If we have a narrow tolerance, then our behavior needs to be more precise to maintain that coupling. This gets back to the car dashboard interface I was talking about before. This is a distracted driver. They're not placing a finger precisely on the panel while they're talking to somebody else and trying not to hit the bicyclist.
That design has a wide tolerance. You can drag your fingers up and down without being precise and still maintain the flow of meaning of working with your car and changing its environment. It's the concept of thinking about the precision required to engage with our information.
In situations needing simultaneous engagements, tolerance for precision is a design value. To roll some of this up: if we look through the lens of embodied cognition at the nature of information and the nature of meaning, we recognize that our information architecture structures participate directly in the flow of meaning. We form that as an information behavior coupling. Further, because of the pervasiveness of information overlays everywhere, we really need to start considering the entire phase space in our designs, and phase-shift some of this viscous, language-laden behavior that we need to get done over to perception.
I'm not saying everything should be perceptual. It's just that these particular groups were so language-dominated. Maybe we just think a little bit beyond that.
I want to leave you with a couple of future scenarios. Let’s say, what if multi-modal connected environments let us select the mode of engagement per our context? What if we have dual encodings of information that let us decide when to go perceptual, and when to attend to language? What if phase space redundancy is part of our adaptive content strategy? What if rules for digital agent behavior were information behavior couplings with specific precision tolerances?
This is just to say that information behavior couplings are much more resilient, and simpler to both understand and design, than linear, brittle rules, now that we have whole collections of digital agents with emergent behavior. It's just to remember that, at core, we are tribal hunter-gatherer poets. [laughs] We act in the world, and we understand with things. Thank you.
Audience Member 1: Hi, Marsha.
Audience Member 1: Excellent talk. At one point you were talking about when to use perception, when to use language, attentive versus reflexive. Thinking about it: I'm now working in the Internet of Things. One of the things the Internet of Things is really good at is generating a shit ton of data. The way most companies manage that is by presenting it in more, what you might call, reflexive ways: lots of charts and graphs.
The challenge is that that doesn't get people to act. You want to have some type of feedback loop that encourages people. My company's take is trying to encourage people to engage in better, more helpful behaviors. We've actually used language as a means to capture insights from that data and suggest, "Hey, maybe you should go for another walk or skip the [indecipherable 0:35:07]."
That was a little more passive, in terms of understandability, but I'm wondering how you think about getting the people who are using these systems to act. It feels like the attentive space is maybe a better one if you want to encourage people to engage in new behaviors, and not simply to understand what's going on.
Marsha: Yeah, for sure. What you described is a round trip from language to perceptual and then back again. That's a perfect example of strategically using different areas of the phase space. My examples were too linear: "I start here, and I go there." It's really not that simple. Yeah, I agree with you.
In changing that behavior, once you glean what you need from the visual stuff, then you need to have a conversation about it. You need to phase-shift back up to the world of meanings, so you can put some understanding around it.
You're basically dual encoding it for yourself: you understand it in a visceral sense, and now you're encoding it back to language for yourself.
Audience Member 2: Thank you. That was extraordinary. Thanks, Marsha.
Audience Member 2: One of the things that struck me is the idea of viscosity, permeability, and change. Information is a river; it has water. Thinking of Heraclitus and his whole panta rhei: all things change, all things flow.
Can you talk a little bit about the nature of the personality of meaning, versus us as external people trying to discern and design for that meaning in that meaning space? A lot of what you're talking about is still focused on the person's own concept of meaning; now step back a little bit to us designing for meaning.
Marsha: The principles of contextual inquiry, and all of our methods to assess the nature of meaning: it's just thinking about meaning as not a point thing that we measure once. "Oh yeah, I've got the meaning." It's this continued interaction with the information. I don't have answers about whether that changes how we test stuff. I don't know.
It's just recognizing that nature, still doing the same things, and still keeping it personal. This is not to say that suddenly it's an abstract world where we can just apply general principles. We still need the contextual research and all that. I don't know if I'm answering anything. [laughs]
Audience Member 3: That was uber.
Marsha: Thank you.
Audience Member 3: Do you apply this in your work? Are you applying it in your work? That's the question I want to ask, just politely.
Marsha: Thank you. Yes, I have been thinking very much about that in my work. I'm in a new role, and one of the things we're working on is helping mechanical engineers collaborate around a very visual, visceral thing that they design. They're in this mode of making geometry, putting fillets around stuff, and punching holes. It's a very, very visceral thing that they're doing, and they have other inputs there.
Then they have these moments where they need to say, "I need to check my design with the person who owns another piece of it," or "Oh, it's time for a customer approval cycle." How do we help them, in that context, ease into a phase shift of, "Now I'm going to get into the world of language and spin up a collaboration with these other people"? Yeah, I am thinking about that.
Abby: This is not a question. This is just "Wow," seriously. In case you didn't catch it from the standing ovation and the additional applause, you just gave one of the most beautiful talks about information architecture that anyone has ever given.
Abby: Third applause deserved.
Marsha: Abby made me cry. [laughs] Thank you so much.
A conversation between Vinay Gupta and James Burke about blockchain, Ethereum, politics, and what is on Vinay's mind.
1:00 "The Republicans were right" 3:50 Homelessness 4:30 Ethereum launch : programming the blockchain 5:55 Smart contracts 8:40 What does Ethereum do, who will use it? 9:25 Ogre: A prediction market 11:14 Providence: supply chain tracking of goods 11:50 The Blockchain & transforming music : Imogen Heap 13:40 Which areas of Ethereum are seeing the most interest by outside parties? Bankers… 14:42 Blockchains are the 3rd wave of fundamental technology 15:54 Blockchain speed
"This afternoon, I interviewed Tom Armitage. He’s a software designer who recently came to our attention because of a talk he gave recently, called "If Gamers Ran the World." In it, he puts forth the idea that in another 10 years, leaders who are the same age as Barack Obama or British Conservative Party leader David Cameron are now, will be children of the 1970s, and as such, more than likely the first leaders who grew up with video games as a core part of their way of interact with the world around them. What would that mean for how they would behave as leaders? A shorter version of this interview airs on the Jan 7th and 10th episode of Spark" — http://www.cbc.ca/spark/blog/2009/01/full_interview_tom_armitage.html
Here’s a live interview Recode’s Kara Swisher conducted with Mike Judge, Thomas Middleditch, Kumail Nanjiani and more about season four of the HBO show.