Category: Design Thinking

some thoughts about “making things” & the internet Via Adam J. Kurtz’s Blog



http://www.adamjkurtz.com

Here’s the link to the original post: http://jkjkjkjkjkjkjkjkjkjk.com/post/74766180941/some-thoughts-about-making-things-the-internet?utm_source=swissmiss&utm_campaign=8cad475930-RSS_EMAIL_CAMPAIGN&utm_medium=email&utm_term=0_2660ad4d17-8cad475930-393310379

The Future Of Branding Is Creating Real Connections Between Consumers And Products. Via: fastcoexist

It’s not about selling, it’s about giving control to the people.

WRITTEN BY Rita J. King

The future of branding belongs to storytellers who understand the hero’s journey in the context of modern, mobile life. The hero’s journey is a storytelling structure pervasive across cultures. It starts with a call to adventure and requires that the hero connect with others, including a mentor. The hero will face extremely difficult challenges along the way. The hero ultimately wins and returns home, armed with new knowledge about herself, other people, and the world.

Does your brand inspire people to respond to the call for adventure, whether through providing information, tools or a catalyst? Does it help them overcome the obstacles they will face on the path, either by making sure they have nourishment, transportation, tools, information or access to other people? Does it provide guidance, support, or a framework in which the story of the process, with all its ups and downs, can be documented and shared in real time? If your brand doesn’t serve any of the segments in the hero’s journey, you’re right to be concerned about the future.

Not the future of branding. Image: Flickr user Sarah Gilbert

Cecelia Wogan-Silva, the director of creative agency development at Google, is tasked with growing brand advertising through Google’s platform. She accomplishes this, in large part, by inspiring thoughtful collaboration, sharing intriguing insights, and co-ideating with creative agencies at the beginning of the process rather than jumping in when distribution is the only part of the conversation left.

“We’d rather work on a cool idea together,” she said. “We try not to be product focused. Instead, we’re problem-focused. Working up a solution that’s only inclusive of what we do at Google is like dropping feta cheese off at the door of someone who doesn’t know they’re sitting on an entire Greek salad. We help them manifest the big idea that brings the salad together. We are in the business of sales but we don’t start with a pitch. We start with a conversation. We try to develop story engines. We ask: What story are you trying to tell? We want to launch a thousand ships together.”

The perceived need to master emerging technologies and engineer a viral video dominates much of the conversation in the world of branding. Clients want measurable proof of eyeballs on the screen, and creatives struggle with the expectation that they’ll be able to engineer a hit. But what is a hit? The trend toward the mean-spirited shock video filled with actors faking real-time reactions to disgusting pranks is the result of the mistaken belief that eyeballs equal success. This mentality is largely a relic of the measurement of success in television advertising, which isn’t surprising. The history of the advertising industry, Wogan-Silva said, is a string of attempts to reincarnate what came before in a new medium.

“The poster in the window got smaller as a print ad,” she said. “But it was just like the poster in the window. Then print ads got read on the radio. Then the concept transported itself to TV, in the beginning with still pictures added to what were essentially radio spots. In each instance, advertisers didn’t take full advantage of the new medium. Our habit is to stick to legacy. Radio was a new technology. So was TV. The exponential release of new technology doesn’t change the need for percolation in the creative process.”

“There’s this automatic inclination to believe that new technology is creativity’s silver bullet. But invention of technology is different from innovative use of existing things. Great TV wasn’t born from the new platform from the get-go. But the stories got better, and the use of bookends in commercial buying was a new variation that came from careful, deeper consideration of what could be done with this amazing medium.

“The same is true for using digital platforms. Brand marketers waiting for the latest product to be the first to use it might miss the chance to do something extraordinary with what we already have before us. Something extraordinary is usually something that touches consumers and tells a story; it’s not just technology alone that builds a brand.”

Wogan-Silva believes that the concept of being a slave to the latest technology fad or ad unit will become a thing of the past.

“Instead, there’s value exchange brought to you by a brand,” she said. “What does that look like? Uber.” Google is an investor in Uber, an “app that connects you with a driver at the touch of a button.” Transportation is a natural part of your life experience, Wogan-Silva said. Brands that are focused on getting us places and connecting us to others, essentially offering sustenance, transportation and intelligence, are the brands of the future. Uber is welcomed, rather than invasive. “My sense of what a brand can do for me doesn’t come in the form of what it promises, but what it delivers to me. Uber sits on my body, on my mobile phone. Location speaks the language of intimacy.”

Intimacy will come in many forms in the future. Not only will objects be connected to each other, but they will be connected to you. Businesses will know more about you, your habits, the bits of data that together compose the very shape and texture of your life. All of this will be connected through objects on us and even in us, as well as in the cloud, that nebulous concept that is becoming more tangible all the time.

Drew Ormrod is Ogilvy’s Worldwide Account Director for IBM Midmarket, which serves small and mid-sized companies. Science House, where I’m the EVP for Business Development, is collaborating with IBM on a project that Ormrod manages from the Ogilvy side. In recent years, he has seen the evolution of consumer values head toward a greater need for trust and transparency.

“Customers want to buy what they need and not a bit more,” Ormrod said. “Also, they want to understand what they’re buying. As consumers develop a taste for the new from freshly-hatched web companies without excess baggage, established brands are turning toward a new model for innovation, often called Labs. Smaller, more agile and often beyond the usual rules of a company, Labs are expected to drive innovation to market from within a traditional company to allow them to compete with new brands. The new consumer is better connected, forms opinions faster and has a better understanding of how systems work.” This new knowledge can come paired with distrust toward traditional brands in favor of those born on the web.

“It’s a matter of putting the customer in control,” Ormrod said. “The future is built on more intelligent connections. Mobile is going to play a huge role. It adds value by connecting our virtual experience to our real experience.”

What does that mean, exactly? It means that brands like Zappos and Seamless, Airbnb, Kickstarter, and others are enabling the digital, mobile realm to serve as a portal into increased real-life access to goods, services, and new experiences. It also means that data is enabling companies to tailor those experiences to customers in real time, right where they are in the physical world.

The brand isn’t the hero, it is an enabler of the journey the customer is on. That requires a lot of listening, in order to understand the challenges each customer faces, and customization, in order to meet those needs. Ultimately, it requires the delivery of simplicity in an increasingly complex world. When the hero does get home after battling the forces of nature and humanity, she might want Uber to get her there and Seamless to deliver tacos right away. Adventure is hard work.

[Image via Shutterstock]

Deductive Versus Inductive By Gigi DeVault

Know When to Use Top-Down and When to Use Bottom-Up Approaches

Market research is grounded in the branch of philosophy known as logic. Two logical reasoning approaches are basic to research design. These approaches are known as deduction and induction.

Deductive Research

Deductive reasoning is a top-down approach that works from the general to the specific. In empirical research, that means that a market researcher begins a study by considering theories that have been developed around a topic of interest. This approach lets a market researcher think about research that has already been conducted and develop an idea about extending or adding to that theoretical foundation. From the topical idea, the market researcher works to develop a hypothesis. This new hypothesis is then tested by the market researcher in the process of conducting a new study. Specific data collected and analyzed in the new study will form the basis of the test of the hypothesis. The specific data will either confirm the hypothesis, or it will not. [It is important to note that a hypothesis that is not confirmed has not been proven false.]

Deductive Research Steps

 

  • GENERAL – Literature Search & Theories
  • Topic of Interest
  • Theory-related Idea
  • Hypothesis
  • Data Collection
  • Data Analysis & Hypothesis Testing
  • Confirm the Hypothesis or Not
  • Disseminate Findings
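The last three steps (data analysis, hypothesis testing, confirm or not) can be sketched in code. The sketch below is an illustration added here, not part of the original article: it assumes a simple one-sample z-test with a known population standard deviation, and the spend figures and hypothesized mean are invented.

```python
import math

def one_sample_z_test(data, hypothesized_mean, population_sd):
    """Two-sided one-sample z-test of H0: population mean == hypothesized_mean."""
    n = len(data)
    sample_mean = sum(data) / n
    z = (sample_mean - hypothesized_mean) / (population_sd / math.sqrt(n))
    # Two-sided p-value from the standard normal CDF (computed via math.erf).
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p

# Hypothetical study: H0 says average customer spend is 50.0.
spend = [52.1, 49.8, 53.4, 51.0, 50.7, 54.2, 48.9, 52.8]
z, p = one_sample_z_test(spend, hypothesized_mean=50.0, population_sd=2.0)
print(p < 0.05)  # True
```

A p-value above the chosen threshold leaves the hypothesis unconfirmed, which, as the bracketed note above points out, is not the same as proving it false.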

 

Inductive Research

Inductive reasoning is a bottom-up approach that moves from the specific to the general. In this case, specific refers to an observation made by the market researcher that eventually leads to broad generalization and theory. [It might be important to note – for discussions with colleagues or in public – that the term is bottom-up and not bottoms-up. Bottoms-up is a sort of toast for drinking, something that may seem entirely appropriate once the research study is completed.]

An inductive approach begins with specific observations made by a market researcher who begins a study with an idea or a topic of interest, just as in a deductive approach. However, in an inductive approach, the researcher does not consider related theories until much further along in the research. From the observations or measures that the market researcher conducts – generally in the field and not in a laboratory setting – clusters of data or patterns begin to emerge. From these regularities or patterns, the market researcher generates themes that come from analysis of the data.

Inductive Research Steps

 

  • SPECIFIC – Observations & Measures
  • Topic of Interest
  • Data Collection
  • Data Clusters or Patterns
  • Data Analysis
  • Themes Emerge
  • Generalizations
  • Disseminate Findings

 

Quantitative Research and the Hypothesis

If the market researcher is conducting quantitative research, theories can be considered at this point. However, if the market researcher is conducting qualitative research, then formal hypothesis testing does not take place. Rather, the market researcher may form generalizations based on the strength of the data and themes that have emerged.

Data collection and data analysis in qualitative research are iterative. That is to say, data collection doesn’t happen all at once and then — as though the market researcher has thrown a switch — data analysis begins. Rather, some data is collected and considered by the researcher, and then some more data is collected and considered, and so on. At a certain point, when sufficient data clusters or patterns have emerged, the market researcher will decide that the data collection can slow, stop, or change direction.
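That collect-consider-collect loop can be sketched roughly as follows. Everything here is a hypothetical illustration added to the article: the `observe` callable, the coded themes, and the stopping rule (`enough_support`) are invented names, not a standard method.

```python
from collections import Counter

def iterative_collection(observe, enough_support=5, max_rounds=50):
    """Alternate small rounds of collection and analysis until a pattern emerges.

    `observe` is a callable returning one coded observation per call.
    """
    codes = Counter()
    for _ in range(max_rounds):
        codes[observe()] += 1                   # collect a little more data...
        theme, count = codes.most_common(1)[0]  # ...then analyze what's on hand
        if count >= enough_support:             # a pattern has emerged: stop here
            return theme, dict(codes)
    return None, dict(codes)                    # no clear pattern yet

# Hypothetical field notes, already coded as themes:
notes = iter(["price", "service", "price", "price", "quality", "price", "price"])
theme, tallies = iterative_collection(lambda: next(notes), enough_support=5)
print(theme)  # "price" emerges as the dominant theme
```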

Data collection and data analysis in quantitative research are distinct stages. To mingle data collection and data analysis in the manner of qualitative research would compromise the integrity of the data. Some scientists would say that a lack of boundaries between the data collection and data analysis processes causes the data to become contaminated and the research to lack rigor. Findings from such compromised research would not be viewed as robust.

Causal Inquiry, Exploratory Inquiry, and Everything In-Between

Bottom-up research methods feel more unstructured, but they are no less scientific than structured top-down research methods. Because each type of research approach has its own advantages and disadvantages, it is not uncommon for a study to employ mixed methods. A market researcher who uses mixed methods applies a deductive research approach to the components of the study that show strong theoretical ties. Alternatively, an inductive research approach is applied to the components of the study that seem to require a more exploratory inquiry.

It’s a misrepresentation to picture deductive and inductive approaches as two sides of the same coin. In practice, they are two ends of a continuum. Deductive research is associated with linearity and a search for causal relationships. Inductive research is associated with depth of inquiry and descriptions of phenomena. Mixed methods sit at about the mid-point of that continuum, with an emphasis on research breadth.

This article contains a much simplified explanation of deductive and inductive inquiry. There are many layers to market research; the content here just begins to scratch the surface. For instance, if we consider the philosophical grounding of deductive and inductive reasoning, we might refer to the approaches as positivistic and naturalistic.


Link to original article: http://marketresearch.about.com/od/market-research-quantitative/a/Market-Research-Deductive-Versus-Inductive.htm

 

Deductive Reasoning Versus Inductive Reasoning By Ashley Crossman

 
In science, there are two ways of arriving at a conclusion: deductive reasoning and inductive reasoning.

Deductive Reasoning

Deductive reasoning happens when a researcher works from the more general information to the more specific. Sometimes this is called the “top-down” approach because the researcher starts at the top with a very broad spectrum of information and works down to a specific conclusion. For instance, a researcher might begin with a theory about his or her topic of interest. From there, he or she would narrow that down into more specific hypotheses that can be tested. The hypotheses are then narrowed down even further when observations are collected to test them. This ultimately leads the researcher to be able to test the hypotheses with specific data, leading to a confirmation (or not) of the original theory and arriving at a conclusion.

An example of deductive reasoning can be seen in this set of statements: Every day, I leave for work in my car at eight o’clock. Every day, the drive to work takes 45 minutes, and I arrive at work on time. Therefore, if I leave for work at eight o’clock today, I will be on time.

The deductive statement above is a perfect logical statement, but it does rely on the initial premise being correct. Perhaps today there is construction on the way to work and you will end up being late. This is why no hypothesis can ever be completely proved: there is always the possibility that the initial premise is wrong.

Inductive Reasoning

Inductive reasoning works the opposite way, moving from specific observations to broader generalizations and theories. This is sometimes called a “bottom up” approach. The researcher begins with specific observations and measures, begins to then detect patterns and regularities, formulate some tentative hypotheses to explore, and finally ends up developing some general conclusions or theories.

 

An example of inductive reasoning can be seen in this set of statements: Today, I left for work at eight o’clock and I arrived on time. Therefore, every day that I leave the house at eight o’clock, I will arrive at work on time.

While inductive reasoning is commonly used in science, it is not always logically valid, because it is not always accurate to assume that a general principle is correct. In the example above, perhaps “today” is a weekend with less traffic, so if you left the house at eight o’clock on a Monday, it would take longer and you would be late for work. It is illogical to assume a general principle holds just because one specific data set seems to suggest it.

Actual Practice

By nature, inductive reasoning is more open-ended and exploratory, especially during the early stages. Deductive reasoning is more narrow and is generally used to test or confirm hypotheses. Most social research, however, involves both inductive and deductive reasoning throughout the research process. The scientific norm of logical reasoning provides a two-way bridge between theory and research. In practice, this typically involves alternating between deduction and induction.

 

A good example of this is the classic work of Emile Durkheim on suicide. When Durkheim pored over tables of official statistics on suicide rates in different areas, he noticed that Protestant countries consistently had higher suicide rates than Catholic ones. His initial observations led him to inductively create a theory of religion, social integration, anomie, and suicide. His theoretical interpretations in turn led him to deductively create more hypotheses and collect more observations.


Plenty to read

  1. A la deriva – Horacio Quiroga
  2. Aceite de perro – Ambrose Bierce
  3. Algunas peculiaridades de los ojos – Philip K. Dick
  4. Ante la ley – Franz Kafka
  5. Bartleby el escribiente – Herman Melville
  6. Bola de sebo – Guy de Maupassant
  7. Casa tomada – Julio Cortázar
  8. Cómo se salvó Wang Fo – Marguerite Yourcenar
  9. Continuidad de los parques – Julio Cortázar
  10. Corazones solitarios – Rubem Fonseca
  11. Dejar a Matilde – Alberto Moravia
  12. Diles que no me maten – Juan Rulfo
  13. El ahogado más hermoso del mundo – Gabriel García Márquez
  14. El Aleph – Jorge Luis Borges
  15. El almohadón de plumas – Horacio Quiroga
  16. El artista del trapecio – Franz Kafka
  17. El banquete – Julio Ramón Ribeyro
  18. El barril amontillado – Edgar Allan Poe
  19. El capote – Nikolai Gogol
  20. El color que cayó del espacio – H.P. Lovecraft
  21. El corazón delator – Edgar Allan Poe
  22. El cuentista – Saki
  23. El cumpleaños de la infanta – Oscar Wilde
  24. El destino de un hombre – Mijail Sholojov
  25. El día no restituido – Giovanni Papini
  26. El diamante tan grande como el Ritz – Francis Scott Fitzgerald
  27. El episodio de Kugelmass – Woody Allen
  28. El escarabajo de oro – Edgar Allan Poe
  29. El extraño caso de Benjamin Button – Francis Scott Fitzgerald
  30. El fantasma de Canterville – Oscar Wilde
  31. El gato negro – Edgar Allan Poe
  32. El gigante egoísta – Oscar Wilde
  33. El golpe de gracia – Ambrose Bierce
  34. El guardagujas – Juan José Arreola
  35. El horla – Guy de Maupassant
  36. El inmortal – Jorge Luis Borges
  37. El jorobadito – Roberto Arlt
  38. El nadador – John Cheever
  39. El perseguidor – Julio Cortázar
  40. El pirata de la costa – Francis Scott Fitzgerald
  41. El pozo y el péndulo – Edgar Allan Poe
  42. El príncipe feliz – Oscar Wilde
  43. El rastro de tu sangre en la nieve – Gabriel García Márquez
  44. El regalo de los reyes magos – O. Henry
  45. El ruido del trueno – Ray Bradbury
  46. El traje nuevo del emperador – Hans Christian Andersen
  47. En el bosque – Ryūnosuke Akutagawa
  48. En memoria de Paulina – Adolfo Bioy Casares
  49. Encender una hoguera – Jack London
  50. Enoch Soames – Max Beerbohm
  51. Esa mujer – Rodolfo Walsh
  52. Exilio – Edmond Hamilton
  53. Funes el memorioso – Jorge Luis Borges
  54. Harrison Bergeron – Kurt Vonnegut
  55. La caída de la casa de Usher – Edgar Allan Poe
  56. La capa – Dino Buzzati
  57. La casa inundada – Felisberto Hernández
  58. La colonia penitenciaria – Franz Kafka
  59. La condena – Franz Kafka
  60. La dama del perrito – Anton Chejov
  61. La gallina degollada – Horacio Quiroga
  62. La ley del talión – Yasutaka Tsutsui
  63. La llamada de Cthulhu – H.P. Lovecraft
  64. La lluvia de fuego – Leopoldo Lugones
  65. La lotería – Shirley Jackson
  66. La metamorfosis – Franz Kafka
  67. La noche boca arriba – Julio Cortázar
  68. La pata de mono – W.W. Jacobs
  69. La perla – Yukio Mishima
  70. La primera nevada – Julio Ramón Ribeyro
  71. La tempestad de nieve – Alexander Pushkin
  72. La tristeza – Anton Chejov
  73. La última pregunta – Isaac Asimov
  74. Las babas del diablo – Julio Cortázar
  75. Las nieves del Kilimanjaro – Ernest Hemingway
  76. Las ruinas circulares – Jorge Luis Borges
  77. Los asesinatos de la Rue Morgue – Edgar Allan Poe
  78. Los asesinos – Ernest Hemingway
  79. Los muertos – James Joyce
  80. Los nueve billones de nombres de Dios – Arthur C. Clarke
  81. Macario – Juan Rulfo
  82. Margarita o el poder de Farmacopea – Adolfo Bioy Casares
  83. Markheim – Robert Louis Stevenson
  84. Mecánica popular – Raymond Carver
  85. Misa de gallo – J.M. Machado de Assis
  86. Mr. Taylor – Augusto Monterroso
  87. No hay camino al paraíso – Charles Bukowski
  88. No oyes ladrar los perros – Juan Rulfo
  89. Parábola del trueque – Juan José Arreola
  90. Paseo nocturno – Rubem Fonseca
  91. Regreso a Babilonia – Francis Scott Fitzgerald
  92. Solo vine a hablar por teléfono – Gabriel García Márquez
  93. Sobre encontrarse a la chica 100% perfecta una bella mañana de abril – Haruki Murakami
  94. Tlön, Uqbar, Orbis Tertius – Jorge Luis Borges
  95. Tobermory – Saki
  96. Un día perfecto para el pez plátano – J.D. Salinger
  97. Un marido sin vocación – Enrique Jardiel Poncela
  98. Una rosa para Emilia – William Faulkner
  99. Vecinos – Raymond Carver
  100. Vendrán lluvias suaves – Ray Bradbury
Also:

Commentary on the article “Starbucks Reinvents The Coffee Cup”

Our experience of brands should reach us through every sense. Starbucks already knew this, and just when many might have wondered what more it could possibly improve in its stores, it touched one of the most obvious and, at the same time, most important points: the cup you drink from.

Starbucks Reinvents The Coffee Cup. fastcodesign.com

The coffee brand abandons its ubiquitous paper cup for a new highly styled version aimed at tea drinkers.

Starbucks may not serve the best coffee, but add up all the touchpoints, from the look of the storefront, the signage, the décor, the furniture, the music, the counter, the arrangement of the products, the reading material, the service, the receipt, and so on, and the sum of it all does make for the best coffee.

Can you say the same of other products? The truth is that very few qualify.

Can you imagine what the experience of Bachoco or Pilgrim’s chicken ought to be like? Bachoco’s billboards are famous, but they are no longer enough. To say nothing of gelatin, baking powder, or brooms. Each and every product category or family is full of opportunities. The difference lies in how they are seen by owners, directors, and marketers, by the stores, and by the people who live with them.

Of course, I can imagine many will tell me it can’t be taken that far. Really? I see multiple possibilities and options. Having worked with companies both large and small, I am sure we can go much further and do much better, with details as simple as… the to-go cup for a Starbucks drink.

There is much talk of innovating; I invite you to innovate. To live with me in this world everyone talks about, full of ideas and new solutions. That world is here, and I am here with you.

Have a good weekend.

Here is the link to the original article.

http://www.fastcodesign.com/3020733/starbucks-reinvents-the-coffee-cup?partner=newsletter

Federico

Federico Hernández Ruiz
Founding consultant at asimetagraf

In strategic design for brands and products, 
we design experiences and drive reasons to act.
+52 (442) 161-2784 Ext. 101,

The Art of Noise | Noisy Dirty Shortcuts by Paul Valerio at method

The Art of Noise

Noisy Dirty Shortcuts

October 2012

One of the endearing characteristics of poker is how it unapologetically embraces the power and value of lying. Unlike in every other casino game, understanding the laws of probability is only part of the game. To be a really good poker player, you have to be both a convincing bluffer and an astute enough observer of your opponents to know when they are bluffing.

Of course all poker players know this, so they do their best to avoid showing their “tells”: those unconscious behavioral tics that might communicate to the rest of the players how good or bad their cards really are. They might drum their fingers, wrinkle their brow, play with their chips, change their breathing; the list goes on. The specific cards they’re holding are the key data their opponents want to access, but there’s real value in hiding the exact nature of that data for as long as possible.

If all you do in a poker game is stare at the back of your opponent’s hand and add up the cards visible on the table and in your own hand, you’re never going to notice that your opponent puts his hand in front of his mouth when he’s holding good cards. Once you figure that out, you’re going to look for that first, and then use the other obvious data as support.

Fortunately, other complex systems of data have their own unintended, but revealing, “tells.” Just like in poker, the key is being observant and clever enough to know what to look for, and where to look for it. As you may have guessed by now, these tells usually depend upon having heavy doses of noise built into them. At first glance, that noise might seem completely irrelevant to the topic of interest, but in fact it is tied so tightly to an underlying truth that it can provide a valuable shortcut to the nature of the systems we’re trying to understand. That’s why I’m so interested in them. I’ve collected a few of these here, but would love to know about any others that you use yourself, or have heard of.

1. The Economist Magazine’s Big Mac Index

In a globally integrated economy, the true relative value of currencies is a crucial piece of data, and an incredibly complex one. Those who trade currencies looking for arbitrage opportunities are always looking for even the most fleeting edge in understanding this complexity in real time, and investors looking for longer-term trends can crunch all sorts of government and international trade data to figure it out. Or, you can just track the price of a McDonald’s Big Mac around the world. That’s what The Economist magazine did back in 1986, and they’ve kept it up ever since. Once the chuckling subsided, many financial analysts recognized the wisdom and power behind the metric, and it has since earned genuine respect for its accuracy.

The insight behind it is in recognizing how unique a product a Big Mac is, and how its price is tied to so many important and complex metrics. Unlike most other globally available consumer products, it must be made locally. With the exception of beef-averse India, it is made with the same recipe and ingredients, according to a strict protocol, using local labor, on-site in the restaurant. That one sentence necessarily combines variables such as agricultural production and commodity prices, the efficiency of transportation and supply networks, wage rates, real estate values, energy costs, marketing and media costs, local and national taxation policies and rates, inflation and competitive pressures, and on and on. And, it’s a consumer product that can be purchased often, so it needs to adapt to changing conditions far more frequently than durable goods like cars or housing. All of these factors are then represented, completely unintentionally but out of necessity, in the price of that humble Big Mac.
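The arithmetic behind the index itself is simple: divide the local Big Mac price by the US price to get a purchasing-power-parity-implied exchange rate, then compare that to the market rate. A minimal sketch, with all figures made up for illustration rather than taken from any actual Big Mac Index data:

```python
def big_mac_index(local_price, us_price, actual_rate):
    """Return the PPP-implied exchange rate and % over/under-valuation.

    local_price: Big Mac price in local currency
    us_price:    Big Mac price in US dollars
    actual_rate: market exchange rate, local currency per USD
    """
    implied_rate = local_price / us_price               # rate implied by burger prices
    valuation = (implied_rate / actual_rate - 1) * 100  # % vs. the dollar
    return implied_rate, valuation

# Illustrative numbers only: a Big Mac at 21.0 yuan locally, 5.00 USD at home,
# with a market rate of 6.5 yuan per dollar.
implied, pct = big_mac_index(21.0, 5.00, 6.5)
print(implied)        # 4.2 yuan per dollar implied by burger prices
print(round(pct, 1))  # -35.4 -> the yuan looks roughly 35% undervalued
```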

The Chinese government might come up with all kinds of official rebuttals to international demands to more accurately value its currency, but the Big Mac Index is telling a different story. You just can’t deny the authority of what the world is willing to pay for two all beef patties special sauce lettuce cheese pickles onions on a sesame seed bun.

2. Corrugated Box Sales

Closer to home, there are all sorts of attempts made to figure out how the economy is going, or not going, in a given year. These are usually a combination of extrapolations of previous quarterly performance, and a tangle of expert expectations, predictions, and surveys of intangibles like consumer confidence. Of course, these work to some extent, but many are based on perceptions and public statements that are not always truthful, for the sake of either political spin, or to influence investor confidence. It all gets complicated and contradictory very quickly.

One simple metric that has withstood the test of time is the humble cardboard box order. Tracking sales of corrugated boxes turns out to be an elegant indicator of a wide range of economic factors, incorporating both the precision needed for running intricate supply chains, and the intangibles of sentiments like confidence, fear, and risk management.

Its accuracy is traced to both the ubiquity of corrugated boxes (80% of all non-durable goods are shipped in them) across manufacturers and retailers, and in their position as a cost center for these businesses. You cannot ship merchandise, and therefore book sales, without boxes, no matter what your product inventory is like. So, running out of boxes would be a horrible mistake to make. Yet, it costs money to buy and store excess boxes, so companies are reluctant to order more than they really need. So, orders for boxes neatly represent the market for, well, almost everything, as well as the intangibles of forward-looking expectations in exactly the time frame that producers and retailers are willing to commit money towards them.

I’ve been running market research projects for 27 years, but I can’t think of a survey question that would deliver that same combination of information so succinctly and truthfully.

3. Shoes, Not Eyes, Are The Windows To The Soul

Customer demographics are an important metric for any business to understand, but even when you can afford to generate or buy that kind of data, it often offers little true insight to what those actual people are really like. And many times, small and medium-size businesses really can’t afford to spend money on that data.

If your target audience is visible at a specific place, such as shopping malls, event venues, or other public spaces, you can learn a great deal about them by simply parking yourself in a central location and watching nothing but their shoes go by. Shoes are another one of those data-intensive “tells” that can speak volumes to an astute observer. Again, it’s because of all the noise-like data that is unintentionally revealed through a person’s choice of shoe.

Shoes are one accessory so common that their absence in many situations is itself a powerful indicator; if you’re not at a beach or pool, not having shoes on in public is either a willful statement of independence or an unfortunate indicator of genuine poverty. It’s one reason why having to take your shoes off for airport security feels like an intrusion beyond its inconvenience.

But for the most part, the attention people do or do not pay to their shoes is a reliable reflection of their attitudes toward fashion vs. comfort, their income level, their evaluation of the importance of the location or event they are attending, even their relationship status (for instance, tall women wearing spike heels are usually already in a relationship; they're not worried about being taller than the men they might run into). Like many of these kinds of metrics, shoes are best interpreted by those with a really deep understanding of the metric itself; the more you know about shoes, the more you can learn about people by looking only at their shoes.

In each of these examples, the common element is the counter-intuitive step of deliberately moving your focus away from what you really want to see. It's like those often maddening 3D posters, where the only way to see the hidden image is to un-focus your eyes while continuing to stare right at the image (like poor Mr. Pitt was trying to do in an episode of Seinfeld). The picture is in there, both hidden and defined by the noise that surrounds it.

If you do develop your own metrics like this, you need to consider the risk in sharing them. As with figuring out a "tell" in your poker opponent, revealing the metric could destroy its effectiveness. The operative skill in this approach is the ability to notice, and that ability depends on a willingness to pay attention to noise in a broader context, on cultivating the mindfulness required to recognize the patterns, and on having the freedom to fail for a long time before that expertise is developed.

So, consider what you are already an expert about, even if it might seem to have nothing to do with the topic you want to understand. How does that expertise allow you to recognize and interpret what someone else would completely miss? That’s what we’d love to know. Please post your own Noisy Dirty Shortcuts in the comment area below.

Original Link: http://method.com/ideas/10×10/the-art-of-noise

What Astronauts And Toddlers Can Teach You About Consumers. Via: FastCo.

By: PAUL VALERIO

 

WHEN COLLECTING DATA ON CONSUMER BEHAVIOR, TUNE INTO THE NOISE–THE PATTERNS THAT DRIVE HUMAN PERCEPTION, ARGUES METHOD’S PAUL VALERIO.

If you were forced to rely on only two target audiences to guide all your future design work, I'd strongly recommend astronauts and toddlers. Fortunately, the connection between them goes beyond the design of their underwear to the nature of perception and expertise, and to what we treat as valid data versus what we choose to ignore as “noise”: the extraneous details, the out-of-category input, the anecdotal tidbits. As it turns out, noise is much more valuable for useful design insights than you might think.

First, the astronauts. One little-known quirk of the Apollo moon landings was the difficulty the astronauts had judging distances on the Moon. The most dramatic example of this problem occurred in 1971 during Apollo 14, when Alan Shepard and Edgar Mitchell were tasked with examining the 1,000-foot-wide Cone Crater after landing their spacecraft less than a mile away. After a long, exhausting uphill walk in their awkward space suits, they just couldn't identify the rim of the crater. Finally, perplexed, frustrated, and with the oxygen levels in their suits running low, they were forced to turn back. Forty years later, high-resolution images from new lunar satellites showed they had indeed come close–the trail of their footprints, still perfectly preserved in the soil, stops less than 100 feet from the rim of the crater. A huge, 1,000-foot-wide crater, and they couldn't tell they were practically right on top of it. Why?

It should have been easy for them, right? These guys were trained as Navy test pilots; landing jets on aircraft carriers requires some expertise in distance judgment. They also had detailed plans and maps for their mission and had the support of an entire team of engineers on Earth. But their expertise was actually part of the core problem. The data their minds were trying to process was too good. All of the “noise” essential to creating the patterns their minds needed to process the data accurately was missing. And patterns are the key to human perception, especially for experts.

Consider everything that was missing up there. First, there’s no air on the Moon, so there’s no atmospheric haze, either. Eyes that grew up on Earth expect more distant objects to appear lighter in color and have softer edges than closer things. Yet everything on the Moon looks tack-sharp, regardless of distance. Second, the lack of trees, telephone poles, and other familiar objects left no reference points for comparison. Third, since the Moon is much smaller than the Earth, the horizon is closer, thus ruining another reliable benchmark. Finally, the odd combination of harsh, brilliant sunshine with a pitch-black sky created cognitive dissonance, causing the brain to doubt the validity of everything it saw.

Ironically, that kind of truthful, distortion-free data is usually what experience designers want to have as input for their decision-making, no matter what they’re trying to do. We tend to believe that complex systems are the tidy, linear sum of the individual variables that create them. But despite the pristine environment of the Moon, the Apollo astronauts were repeatedly baffled when it came to simple distance and size perceptions, even after each team came back from the Moon and told the next team to be aware of it.

Meanwhile, the toddlers I mentioned earlier provide a corresponding example of the power of patterns in perception. When my first child was about 4, we came across a wonderful series of picture books called Look-Alikes, created by the late Joan Steiner. Each book has a collection of staged photographs of miniature everyday scenes like railway stations, city skylines, and amusement parks created entirely from common, found objects (see some examples here). Without any special adornment, a drink thermos masquerades as a locomotive, scissors become a ferris wheel, and even a hand grenade makes for a very convincing pot-belly stove. The entire game is to un-see the familiarity of the scene, and identify all the common objects ludicrously pretending to be something other than what they are. There’s no trick photography involved, but you can look at each picture for hours and not “see” everything that’s right there in front of you. You know it’s a trick, but you keep falling for it over and over.

The really amazing part is that the toddler, a true novice with only a few years’ experience in seeing, completely understands the scenes she’s looking at, even though every individual piece of “data” she’s looking at is a deliberate lie. Yet the pattern of data that creates the scene is “perfect.” We already know what those scenes are supposed to look like before we even see the book’s version of them, so we unconsciously project that pattern onto what we’re looking at, even to the point of constantly rejecting the contrary data our eyes are showing us. There is in fact no amusement park in the photograph I called an amusement park. But I see it anyway.

In data-processing parlance, the signal-to-noise ratio of the moonscape was perfect (actually, infinitely high), and zero for Look-Alikes pages (the whole joke is that there really was no signal there in the first place). Yet a toddler can read the noisy scene perfectly, and the seasoned test pilots were baffled by the noiseless scene. How can this be?
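For anyone unfamiliar with the data-processing term, signal-to-noise ratio is simply the power of the meaningful signal divided by the power of everything else, often quoted in decibels. A minimal sketch (the power values below are arbitrary illustrations, not measurements from the article):

```python
import math

def snr_db(signal_power, noise_power):
    """Signal-to-noise ratio in decibels: 10 * log10(P_signal / P_noise)."""
    return 10 * math.log10(signal_power / noise_power)

# A clean but earthly scene: strong signal with a little background noise.
print(snr_db(1.0, 0.001))   # ~30 dB
# The moonscape: the noise all but vanishes and the ratio diverges upward.
print(snr_db(1.0, 1e-12))   # ~120 dB
```

The Look-Alikes case is the opposite limit: the signal power is zero, so the ratio is zero no matter how much noise surrounds it.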

The lesson is that patterns drive perception more so than the integrity of the data that create the patterns. We perceive our way through life; we don’t think our way through it. Thinking is what we do after we realize that our perception has failed us somehow. But because pattern recognition is so powerfully efficient, it’s our default state. The thinking part? Not so much.

This just might be why online grocery shopping has yet to really take off. The average large U.S. supermarket offers about 50,000 SKUs, yet a weekly grocery shopper can easily get a complete trip done in about 30 minutes. We certainly don't feel like we're making 50,000 yes/no decisions to make that trip, but in effect we actually do. Put that same huge selection online, and all of those decisions are indeed conscious. Even though grocery shopping is a repetitive, list-based task, the in-store noise of all those products that aren't on your list gives you essential cues for finding the ones that are, and reminds you of items that weren't on your list but that you still need. That's even before you get to the detail level, where all the other sensory cues tell you which bunch of bananas is just right for you. So despite all the extra effort and hassle involved in going to the store in person, it still works better because of, not in spite of, the patterns of extraneous noise you have to process to get the job done.

To account for the role of noise within the essential skill of pattern recognition, we need to remind ourselves how complex seemingly simple tasks really are. Visually reading a scene, whether it’s a moonscape, a children’s book illustration, a grocery store, or a redesigned website, is an inherently complex task. Whenever people are faced with complexity (i.e., all day, every day), they use pattern recognition to identify, decipher, and understand what’s going on instantly, instead of examining each component individually. The catch is that all of the valuable consumer thought processes we want to address–understanding, passion, persuasion, the decision to act–are complex.

However, the research we use to help us design for these situations usually tries to dismantle this complexity. It also assumes a user who is actually paying attention, undistracted, in a clean and quiet environment (such as a market research facility), and cares deeply about the topic. Then we “clean” the data we collect, in an attempt to remove the noise. And getting rid of noise destroys the patterns that enable people to navigate those complex functions. So we wind up relying on an approach that does a poor job of modeling the system we’re trying to influence.

The challenge is to overcome the seemingly paradoxical notion that paying attention to factors completely outside our topic of interest actually improves our understanding of that topic. Doing so requires acknowledging that our target audience may not care as much about something as we do, even if that topic represents our entire livelihood. It requires a broader definition of the boundaries of what that topic is, and including the often chaotic context that surrounds it in the real world. It also requires a more than casual comfort level with ambiguity: Truly understanding complex systems involves recognizing how unpredictable, and often counterintuitive, they really are.

This is why ethnographic research is so popular with all kinds of designers. The rich context ethnographies offer is full of useful noise: the improvising people do to actually use a product, the ancillary details that surround it, and the unexpected motivations a consumer might bring to its use. These are all easier to access via a qualitative, on-location approach than via a set of quantitative crosstabs or sitting behind a mirror watching a focus group. It's also a powerful human-to-human interface, in which the designer uses his or her innate pattern-recognition capability to analyze patterns in user behavior.

What often gets overlooked is the role noise can and should play in quantitative research. Most designers avoid quantitative research because of the clinically dry nature of the charts it produces, and the often false sense of authority that statistically projectable data can wield. However, only quantitative research can reveal the kind of perceptual patterns that are invisible to qualitative methods, and the results needn't be dry at all. The solution is to appropriately introduce the right kind of noise to quantitative research, to deliberately drop in the necessary telephone poles, trees, and haze that allow those higher-level perceptual patterns to be seen and interpreted.

How audio dithering works.

Fortunately, there’s already a model for this. When analog music is digitally recorded, some of the higher highs and lower lows are lost in the conversion. Through a process called dithering, audio engineers add randomized noise to the digital signal. Strangely enough, even though the added noise has nothing to do with the original music, adding it actually improves the perceived quality of the digital audio file. The noise fills in the gaps left by the analog-to-digital conversion, essentially tricking your ear into hearing a more natural sound. The dithered audio isn't actually more accurate; it just sounds better, which matters more than accuracy here. Returning to our opening examples, the moonscape was in dire need of dithering, while the Look-Alikes scenes were already heavily dithered. And the real world in general is heavily dithered.
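The core of the dithering idea can be sketched numerically. The toy Python example below is my own illustration, not audio-engineering code: the quantizer step and signal value are arbitrary, and real audio dithering typically uses triangular-PDF rather than uniform noise. Without dither, a detail smaller than the quantization step simply vanishes; with dither, each individual sample gets noisier, but the detail survives in the average.

```python
import random

def quantize(x, step):
    # Round a sample to the nearest level of a coarse "digital" grid.
    return step * round(x / step)

random.seed(0)
step = 0.25   # deliberately coarse quantizer step
x = 0.1       # a "detail" that sits between two quantization levels

# Without dither, the detail is lost: 0.1 always rounds down to 0.0.
plain = quantize(x, step)

# With dither, random noise is added *before* rounding. Averaging many
# dithered samples recovers the true value: the noise fills in the gaps
# left by quantization.
N = 20000
dithered_avg = sum(
    quantize(x + (random.random() - 0.5) * step, step) for _ in range(N)
) / N

print(plain)                   # 0.0 -- the sub-step detail vanished
print(round(dithered_avg, 2))  # ~0.1 -- the noise preserved it
```

The same logic underpins the article's argument: the added noise carries no information itself, yet it lets a finer-grained picture emerge than the clean data alone could.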

So, for quantitative research aimed at guiding the design process, the trick is to value meaning above accuracy. Meaning can be gleaned via the noise you can add to the quantitative research process by including metrics outside the direct realm of your topic area. It means considering what else is adjacent to that topic area, acknowledging the importance of respondent indifference as well as their preferences, and recognizing what kind of potentially irrational motivations are behind the respondents’ approach to the topic, or the research itself.

At Method, we’ve developed a technique for observing these perceptual patterns in quantitative data by using perceptions of brands far afield of the category we’re designing for. Essentially, it’s a dithering technique for brand perceptions. This technique often displays an uncanny knack for generating those hiding-in-plain-sight aha moments that drive really useful insights. There are doubtless many other approaches you can employ once you make the leap that acknowledges the usefulness of noise in your analysis.

But no matter what format of research you use in your design development process (including no formal research at all), there are some guidelines you can follow to allow the right amount of useful noise to seep into your field of view, so that your final product does not wind up being missed on the moonscape of the marketplace:

• A LITTLE HUMILITY WORKS WONDERS.

Recognizing that you’re not the center of your target audience’s universe allows you to understand how you fit in. Be sure to take honest stock of just where your target audience places your topic area on their list of priorities.

• STEP BACK FAR ENOUGH TO ALLOW PATTERNS TO EMERGE.

No matter what metrics you’re using, consider looking several levels above them–or next to them–to identify patterns that are impossible to see when you’re too close to the subject.

• GAUGE THE LEVEL OF EXPERTISE OF YOUR TARGET AUDIENCE.

How familiar is your target audience with your subject? Are they experts or novices, and how are you defining that? Generally, the higher the level of expertise, the higher the dependence on pattern recognition. Novices carefully and slowly compare details; experts read patterns quickly and act decisively.

• CHECK THE DATA DUMPSTER BEFORE EMPTYING.

No matter where your data comes from, think about what has been omitted. Was that distracting noise that was tossed, or crucial context?

By taking a look at the entire picture–instead of isolating a single data point–you open up opportunities for understanding the motivations, reasons, and outlying factors that impact data. Contrary to popular practice of stripping out noise, noise is in fact critical to the generation of deep insights that allow us to design better and more effective brands, products, and services.

[Image: Supermarket via Shutterstock]

The Latin American and Hispanic Digital Opportunity: Are You Prepared? -Via: Juan Martinez at http://www.outbrain.com

JUAN

June 11, 2012

The burgeoning Latin American digital media market represents an amazing opportunity for content creators. Representing more than 7% of global Internet users, Latin America is home to the emerging markets of Brazil and Argentina, where 79% and 28% of the population, respectively, consume content on the Web, a combined online population of more than 100 million. Add Mexico, where 30% of the country's 112 million people use the Internet, and the total grows to 130 million Internet users.

In Latin America, Facebook accounted for 25% of all time spent online and social networking in general accounted for nearly 30% of online minutes at the end of the year, an increase of 9.5% over the past year. In addition to social media usage, online video consumption increased more than 10% across Brazil, Mexico, Argentina and Chile, and online retail visits increased 30%. The number of searches in 2011 increased 38% to more than 21 billion and, with an average of 173 searches per searcher, Latin America leads the globe in search frequency.

The U.S. Hispanic market represents an equally important demographic. More than 33 million Hispanics were online in September of last year, representing 15% of the U.S. online market, a demographic that is growing three times faster than the general online market. Eighty percent of online Hispanics use a search engine each month, and the same share visits Facebook monthly.

Content creators must focus on the Latin American and U.S. Hispanic markets in order to maximize overall content viewership and engagement. Reaching English-speaking content consumers in the U.S. and south of the border has never been more important and will only become more important in the coming years. Moreover, creating and distributing Spanish-language content in the U.S. and Latin America is an equally important objective.

Understanding and working within these communities will enable brands and publishers to attract a portion of the world that will dominate digital content consumption in the coming years. The creation of relevant content and finding partners to help distribute that content must be among your top priorities.

With all of this in mind, Outbrain is honored to have been named the Top Digital Media Innovator in the Latin World at the 2012 Latin American Advertising and Media Awards at the Portada Hispanic Media Conference. The award honors companies in Latin America, the United States Hispanic market and Spain for excellence in media and digital advertising. We are particularly humbled to have been nominated alongside the following innovators:

  • Hunt Mobile Ads
  • Impaktu
  • Jumba Mobile Network
  • Kontextua
  • Matomy México
  • Netbangers
  • Premier Retail Networks México
  • Terra Live Music
  • Vostu

“So much of the Portada Conference focused on the power of storytelling and producing great content,” said Erik Cima, VP of Hispanic Markets at Outbrain. “Winning this award is satisfying because we’re playing a part in helping the Hispanic and Latin American markets surface and distribute that great content.”

Image via Captura Group

Link to original article: http://www.outbrain.com/blog/2012/06/the-latin-american-and-hispanic-digital-opportunity-are-you-prepared.html

Book review: “Thinking, fast and slow” by Daniel Kahneman via: http://backreaction.blogspot.mx

THURSDAY, AUGUST 09, 2012


Thinking, Fast and Slow
By Daniel Kahneman
Farrar, Straus and Giroux (October 25, 2011)

I am always on the lookout for ways to improve my scientific thinking. That’s why I have an interest in the areas of sociology concerned with decision making in groups and how the individual is influenced by this. And this is also why I have an interest in cognitive biases – intuitive judgments that we make without even noticing; judgments which are just fine most of the time but can be scientifically fallacious. Daniel Kahneman’s book “Thinking, fast and slow” is an excellent introduction to the topic.

Kahneman, winner of the Nobel Prize in Economics in 2002, focuses mostly on his own work, but that covers a lot of ground. He starts by distinguishing between two different modes in which we make decisions, a fast and intuitive one, and a slow, more deliberate one. Then he explains how fast intuitions lead us astray in certain circumstances.

The human brain does not make very accurate statistical computations without deliberate effort. But often we don’t make such an effort. Instead, we use shortcuts. We substitute questions, extrapolate from available memories, and try to construct plausible and coherent stories. We tend to underestimate uncertainty, are influenced by the way questions are framed, and our intuition is skewed by irrelevant details.

Kahneman quotes and summarizes a large number of studies, in most cases with sample questions. He offers explanations for the results when available, and also points out where the limits of present understanding are. In the later parts of the book he elaborates on the relevance of these findings about the way humans make decisions for economics. While I had previously come across a big part of the studies that he summarizes in the early chapters, the relation to economics had not been very clear to me, and I found this part enlightening. I now understand the trouble I've had trying to tell economists that humans do have inconsistent preferences.

The book introduces a lot of terminology, and at the end of each chapter the reader finds a few examples of how to use it in everyday situations. “He likes the project, so he thinks its costs are low and its benefits are high. Nice example of the affect heuristic.” “We are making an additional investment because we do not want to admit failure. This is an instance of the sunk-cost fallacy.” Initially, I found these examples somewhat awkward. But awkward or not, they serve very well for the purpose of putting the terminology in context.

The book is well written, reads smoothly, is well organized, and thoroughly referenced. As a bonus, the appendix contains reprints of Kahneman's two most influential papers, which offer somewhat more detail than the summaries in the text. He narrates the story of his own research projects and how they came into being, which I found a little tiresome after he elaborated on the third dramatic insight he had about his own cognitive biases. Or maybe I'm just jealous because a Nobel Prize-winning insight in theoretical physics isn't going to come about that way.

I have found this book very useful in my effort to understand myself and the world around me. I have only two complaints. One is that despite all the talk about the relevance of proper statistics, Kahneman does not mention the statistical significance of any of the results he discusses. Now, this is research that started two or three decades ago, so I have little doubt that the effects he talks about are by now well established, and, hey, he got a Nobel Prize after all. Yet if it weren't for that, I'd have to consider the possibility that some of these effects will vanish as statistical artifacts. Second, he does not at any point actually explain the basics of probability theory and Bayesian inference, though he uses them repeatedly. This, unfortunately, limits the usefulness of the book dramatically if you don't already know how to compute probabilities. It is particularly bad when he gives a terribly vague explanation of correlation. Really, the book would have been so much better with at least an appendix containing some of the relevant definitions and equations.
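For readers who do lack that background, here is the flavor of the computation in question, worked through for the taxicab problem Kahneman discusses in the book: 85% of a city's cabs are Green, 15% are Blue, and a witness who is right 80% of the time says the hit-and-run cab was Blue. The short Python sketch below is my own illustration, not the book's:

```python
# Bayes' rule for the taxicab problem.
prior_blue = 0.15        # 15% of cabs are Blue
prior_green = 0.85       # 85% are Green
witness_accuracy = 0.80  # the witness is right 80% of the time

# P(witness says "Blue" AND cab is Blue)
numerator = witness_accuracy * prior_blue
# Total probability the witness says "Blue":
# right about a Blue cab, or wrong about a Green one.
evidence = numerator + (1 - witness_accuracy) * prior_green

posterior_blue = numerator / evidence
print(round(posterior_blue, 2))  # 0.41 -- far below the intuitive 0.80
```

The gap between the intuitive answer (80%) and the Bayesian one (about 41%) is base-rate neglect, one of the biases the book discusses at length.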

That having been said, if you know a little about statistics you will probably find, like I did, that you've learned to avoid at least some of the cognitive biases that deal with explicit ratios and percentages, and the different ways of framing those questions. I've also found that when it comes to risks and losses, my tolerance apparently does not agree with that of the majority of participants in the studies he quotes. I'm not sure why that is. Either way, whether or not you are subject to any specific bias that Kahneman writes about, the frequency with which they appear makes them relevant to understanding the way human society works, and they also offer a way to improve our decision making.

In summary, it’s a well-written and thoroughly useful book that is interesting for everybody with an interest in human decision-making and its shortcomings. I’d give this book four out of five stars.

Below are some passages that I marked that gave me something to think about. They will give you a flavor of what the book is about.

“A reliable way of making people believe in falsehoods is frequent repetition because familiarity is not easily distinguished from truth.”

“[T]he confidence that people experience is determined by the coherence of the story they manage to construct from available information. It is the consistency of the information that matters for a good story, not its completeness.”

“The world in our heads is not a precise replica of reality; our expectations about the frequency of events are distorted by the prevalence and emotional intensity of the messages to which we are exposed.”

“It is useful to remember […] that neglecting valid stereotypes inevitably results in suboptimal judgments. Resistance to stereotyping is a laudable moral position, but the simplistic idea that the resistance is cost-less is wrong.”

“A general limitation of the human mind is its imperfect ability to reconstruct past states of knowledge, or beliefs that have changed. Once you adopt a new view of the world (or any part of it), you immediately lose much of your ability to recall what you used to believe before your mind changed.”

“I have always believed that scientific research is another domain where a form of optimism is essential to success: I have yet to meet a successful scientist who lacks the ability to exaggerate the importance of what he or she is doing, and I believe that someone who lacks a delusional sense of significance will wilt in the face of repeated experiences of multiple small failures and rare successes, the fate of most researchers.”

“The brains of humans and other animals contain a mechanism that is designed to give priority to bad news.”

“Loss aversion is a powerful conservative force that favors minimal changes from the status quo in the lives of both institutions and individuals.”

“When it comes to rare probabilities, our mind is not designed to get things quite right. For the residents of a planet that may be exposed to events no one has yet experienced, this is not good news.”

“We tend to make decisions as problems arise, even when we are specifically instructed to consider them jointly. We have neither the inclination nor the mental resources to enforce consistency on our preferences, and our preferences are not magically set to be coherent, as they are in the rational-agent model.”

“The sunk-cost fallacy keeps people for too long in poor jobs, unhappy marriages, and unpromising research projects. I have often observed young scientists struggling to salvage a doomed project when they would be better advised to drop it and start a new one.”

“Although Humans are not irrational, they often need help to make more accurate judgments and better decisions, and in some cases policies and institutions can provide that help.”

Here is a link to the original article: http://backreaction.blogspot.mx/2012/08/book-review-thinking-fast-and-slow-by.html