Just Enough Research, 2e - notes mostly, rather than a review

 Just Enough Research, 2e by Erika Hall

ISBN: 978-1-917557-89-8

Bottom line: Concise suggestions. Very pleasant. Just Enough Research can be a great starting point for an individual or team looking to incorporate design research into their workflows.

I work as a Software Engineer during the day and, recently, we've had some spare time between projects. Using this time, we purchased a book pack from the publisher A Book Apart. This'll be a little different from my normal book posts: instead of a review, I'll be posting my notes, with maybe a summary and review on top. Yeah, that sounds good.

Just Enough Research opens with a discussion of transportation options that are ubiquitous the world over: foot/wheelchair, bike, and automobile. Now, where does this leave the pinnacle of technology, the Segway? It doesn't. The Segway is an oft-used example of where research would have been helpful.

Now, Just Enough Research is a pretty quick read. It defines research and then jumps in with some quick crash-course suggestions to get you started. It claims, aptly, to give you just enough information to be dangerous. I'm down for that. This book is for non-designers who see the merit in design. Attitude is everything, they say. Hall also talks about where design can fit into an organization and the types of resistance you might face.

The resources that Hall includes at the end are extensive and solid. 

My one issue with the book is the section about Agile. The definition of Agile varies depending on where you go, but the Agile Manifesto (and who doesn't love a good manifesto) outlines four values and twelve principles for developing software effectively. Boiled down, it pushes against the traditional attitude of software development, with its heavy documentation and strict requirements, and refocuses on the people who will be using the software. I could devote a separate blog post (and perhaps I should) to how an Agile mindset (as they say) jibes perfectly with design work. Hall, however, doesn't seem sold on it. Maybe that's because she is a designer first and foremost, rather than a designer in the software space? It's a minor point and doesn't detract from the book overall.

Speaking of overall: if you are in a position where design would be helpful, I would recommend this book. It's concise but not too superficial and, coupled with a helpful team, it can be a good launching point for great, collaborative work.

Notes:

This book aims to:

  • determine whether you're solving the right problem
  • figure out who in an organization is likely to tank your project
  • discover your best competitive advantages
  • learn how to convince your customers to care about the same things you do
  • identify small changes with a huge potential influence
  • see where your own blind spots and biases are preventing you from doing your best work

What is Research? Simple, systematic inquiry.
Personal research begins with a Google query and ends with Wikipedia. The knowledge already exists; you just have to assess the credibility of your sources.

Pure Research is carried out to create new human knowledge, whether to uncover new facts or fundamental principles. "Why do humans sleep?" It's based on observations and what-not. It's science.

Applied Research borrows ideas and techniques from pure research to serve a specific real-world goal: improving hospital care, for example.

Design Research is a broad term. Focus on the people for whom you are designing a product.

What Research is not:
  • Research is not asking people what they like
    • "Like" is subjective and weak
    • Hate is like that too
  • Research is not about looking smart
  • Research is not about being proven right
  • Research is not better just because you have more data
    • Data is a tool for analysis, not an answer in itself
Everyone can and should do research.

Hall tells of her first design-agency job. The team was mixed and each member brought different skills: the content strategist noticed the vocabulary real people used; the developer had good questions about personal technology habits; the visual designer was into motorcycles. A research lead could be the one gathering lots of data for the team to analyze, or coordinating the team to do the research themselves.

There are a bunch of different ways you could classify the type of research you are doing but it might be more worthwhile to focus on your goals and questions.

--------

Roles - these can be shared or done as a team, etc.
  • Author - plans and writes the study. This includes the problem statement and questions, and the interview guide or test script. Ideally a team activity
  • Recruiter - screens potential participants and finds test subjects
  • Coordinator/Scheduler - plans meetings, etc.
  • Interviewer/Moderator - just what it says
  • Observer - useful for clients or available team members to watch the research in progress (but make sure that they do not influence the research itself)
  • Notetaker/Recorder - full-time person so the other people can focus on the research/interviewing
  • Analyst - reviews the gathered data to look for patterns/insights. This should be done by more than one person
  • Documenter - reports the findings once the study is complete
Objections:
Research doesn't "prove" anything; you aren't going to convince people, so work with their beliefs instead of against them.
  • "We don't have time/money/expertise/the infrastructure" - you don't need much of those things
  • "We need to be scientists" - you don't
  • "The CEO is going to dictate what we do anyway" - get a new job
  • "One research methodology is superior" - you need an appropriate method for your question
  • "We can find out everything in beta" - why wait until the time has already been sunk?
  • "We already know the issue/users/app/problem inside and out" - this attitude breeds blind spots

Research requires collaboration
4 guiding values:
  1. Clarity and Definition - expressing and articulating thoughts clearly
  2. Accountability and Ownership
  3. Awareness and Respect
  4. Openness and Honesty 
Cover your bias
Blend in - people might clam up cause they know they're being watched
Checklist:
  • Phrase questions clearly
  • Set realistic expectations
  • Be prepared
  • Allow sufficient time for analysis
  • Make it memorable and motivating

------- The Process

Follow these steps:
  1. Define the problem
    1. Base your statement on a verb that indicates an outcome, such as "describe," "evaluate," or "identify" - not open-ended words like "understand" or "explore"
    2. A good research question is specific, actionable, and practical
      1. it's possible to answer the question
      2. it's possible (but not guaranteed) to arrive at an answer with a sufficient degree of confidence that you can base a decision on what you've learned 
  2. Select the approach
  3. Plan and prepare for the research
    1. Recruiting is important - it's like fishing: you've got to prepare and find good, helpful target-audience representatives
  4. Collect the data
    1. Make sure you're organized
    2. Usability testing only gives you a portion of the story - it indicates whether you're on the right track, but it doesn't tell you whether the product will succeed in the marketplace
  5. Analyze the data
    1. There are lots of tools that can be used
    2. Get lots of people involved but only people who can contribute
    3. Guideline for a session:
      1. Summarize the goals and process of the research (What did you want to find out and what were the roles on your side?)
      2. Describe whom you spoke with and the circumstances
      3. Describe how you gathered the data
      4. Describe the types of analysis you will be doing
      5. Pull out quotes and observations
      6. Group them and check for themes
      7. Summarize the findings
      8. Document in a shareable format
    4. Don't "solution" during analysis
    5. Refrain from pulling out themes until all the data is spread out
  6. Report the results
--------Organizational Research

Stakeholders - those groups without whose support the organization would cease to exist:
  • Leaders - help you understand the overall company mission and vision and how the project fits into it
  • Managers - frequently concerned with resource allocation, how your project affects their incentives (monetary or otherwise), and their ability to do the work
  • SMEs - provide essential background info
Read Paul Ford's essay "The Web is a Customer Service Medium" (https://www.ftrain.com/wwic)

Asking someone for input before you get started is a great way to move things along. Inquiry is flattery.

Hall goes into how to conduct a good interview - but I don't think I'll go into detail on that. 

I can see how there could be tension between "Agile," as Hall describes it, and design/research, but given how she refers to it, I don't think she really gets it, and I really don't see why there has to be tension. If the focus is on doing Just Enough Research, it seems like it should fit nicely into something like Scrum.

-----User and Customer Research
"As a designer, you have an enormous, exciting responsibility. You define the human world, one object or system at a time. Every delightful and every frustrating artifact, every unseen algorithm that governs interactions, every policy constraining choices, exists because of a series of design decisions." uuuuggghhhh, puke.

Empathy! Yay empathy! 

User Research, as distinguished from Usability Testing: we're talking about the study of humans in their cultural and social contexts. We want to learn about our target users as people in their habitual environments. It's very different from gathering opinions.

Everything in Context:
  • Physical environment
  • Mental model
  • Habits
  • Relationships
Assumptions are insults

Dale Carnegie said, "You can close more business in two months by becoming interested in other people than you can in two years by trying to get people interested in you."

*** The first rule of user research: Never ask anyone what they want. ***

Your challenge as a researcher is to figure out how to get the information you need by asking the right questions and observing the right details. 

4 D's of Design Ethnography
  • Deep dive
  • Daily life
  • Data analysis
  • Drama! 
She goes more into how to do a good interview again.

Focus Groups: Just say No. They create an artificial environment that bears no resemblance to the context in which what you're designing would actually be used. It's research theater.

-----Competitive Research
Probably going to skip this cause we tend not to have to investigate competitors.

Brand Audit: 
  • Attributes - which characteristics do you want people inside and outside the company to associate with the brand or product? Which do you want to avoid?
  • Value proposition - what does your product or service offer that others do not? How does your brand communicate this?
  • Customer perspective - When you conduct ethnographic interviews with existing or potential customers, what associations do they have with your brand?
I suppose you could ask this question about yourself for the purposes of your Personal Brand

Huh, according to Hall, "the name is the single most important aspect of a brand." GaryVee, I think, disagrees. After all, Apple? What kind of name is that? The important part is the association people have with it.

-----Evaluative Research

Testing how your design holds up

Heuristic Analysis - experts walk through the design and judge it against a set of established usability rules of thumb (heuristics), noting where it falls short. See https://www.nngroup.com/articles/ten-usability-heuristics/ for further info

Usability testing - can save you from introducing unnecessary misery into the world (or have it associated with your brand) - https://www.nngroup.com/articles/usability-101-introduction-to-usability/

Cheap tests first, expensive tests later

Also, don't do it right before you're about to launch.

Hall then goes into describing how to conduct a usability test; just use the link above for more info, I think. It's essentially like an interview, but you have the user use the thing you made: document it, have a notetaker, and have specific "questions" or tasks

--------Analysis and models
Design truly starts at the stage where you have your million post-its. Humans are pattern-recognition machines, and if you work collaboratively, then the clarity you glean from those post-its will be shared

Affinity Diagrams
Write down observations
Create groups
Identify next steps
Create personas - https://www.youtube.com/watch?v=0jm8nnHqx80 
  • Design targets are not marketing targets. Market segments do not translate into archetypes
  • The user type with the highest value to your business may not be the one with the most value to the design process. If you design for the users with less expertise, you can often meet the needs of those with more
    • I don't think I agree with this; I mean, duh, but it sounds like wasted effort. 
  • A truly useful persona is the result of collaborative effort following firsthand user research.
  • You can create a vivid individual with just a few key details. 
Hall then goes on to describe the qualities of a Persona, which I'll list here without going into detail. They are self-explanatory, and the idea is that the Persona is a distinct, human-like image. "What would Beth do?" you can ask about a design question, referring to your persona named Beth. Personas are particularly helpful if you can't talk to your users.
  • Name
  • Photo
  • Demographics
  • Role
  • Quote - https://www.youtube.com/watch?v=kdemFfbS5H0&t=72s
  • Goals
  • Behaviors and habits
  • Skills and capabilities
  • Environment
  • Relationships
  • Scenarios
Mental Models - we all keep mental models to help describe the world. If a band stops playing and says, "Thank you and have a good night," it means they are done, for example. In design, "intuitive" is a synonym for "matches the user's mental model" - for more info check out https://rosenfeldmedia.com/books/mental-models/ (and by check out I mean from a library or pirate it - yeesh $40 for an ebook - what is this a videogame?!)

----- Surveys
Surveys are the most misunderstood and misused method. They frequently blend qualitative and quantitative questions, which shouldn't happen because those are different research methods and processes (see the earlier tip - pick your research method)
Ha! -> "If you ever think to yourself, 'Well, a survey isn't really the right way to make this critical decision, but the CEO really wants to run one. What's the worst that can happen?'" Brexit...

Surveys are tempting cause they are easy and talking to people is hard. A bad survey won't tell you it's bad; a bad interview will be hella uncomfortable. 

Oh dear, the image for the Sample Size Calculator doesn't show up in my versi---- oh it does, just not in nightmode cause the formula is in black. In any case, Google for a sample size calculator.
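The math behind those calculators is simple enough to sketch. This is the common normal-approximation formula with a finite-population correction, not anything specific from the book, and the default values (95% confidence, ±5% margin) are just the usual conventions:

```python
import math

def sample_size(population, confidence_z=1.96, margin_of_error=0.05, p=0.5):
    """Rough survey sample size: n0 = z^2 * p * (1 - p) / e^2,
    then a finite-population correction. p = 0.5 is the most
    conservative choice when you don't know the true proportion."""
    n0 = (confidence_z ** 2) * p * (1 - p) / (margin_of_error ** 2)
    # Correct for drawing from a finite population
    n = n0 / (1 + (n0 - 1) / population)
    return math.ceil(n)

print(sample_size(10_000))  # → 370
```

Note how little the answer grows with population size: a million users needs roughly the same sample as ten thousand, which is why the standard "~385 responses" figure comes up so often.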

Yadda yadda yadda: the more responses you get back, the better. If you don't get enough and base your decisions on them anyway, you run a serious risk of underrepresenting your audience.

If you still want to conduct a survey, treat it like any other design tool. Keep in mind your specific research question. Keep in check the biases that may reveal themselves in the questions and answers you provide. Analyze the data rather than blindly consuming it.

Check out the Likert Scale. I wonder if the Likert Scale would be a better tool when it comes to the Agile Health Assessments rather than a 5 point scale.

Net Promoter Score - "an operating management tool" that is coerced into being a "research" tool 
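For reference, the NPS arithmetic itself (this is the standard published formula, not something from the book): respondents answer "how likely are you to recommend us?" on a 0-10 scale, and the score is the percentage of promoters (9-10) minus the percentage of detractors (0-6):

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6).
    Passives (7-8) count toward the total but neither bucket."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

print(nps([10, 9, 9, 8, 7, 6, 3, 10]))  # → 25
```

The score ranges from -100 (all detractors) to 100 (all promoters), which is part of why it makes a tidy management dashboard number and a poor research instrument.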

Quantitative vs Qualitative surveys - "Unlike for quantitative surveys, qualitative survey metrics are rarely representative for the whole target audience; instead, they represent the opinions of the respondents....Unless you use sound statistics tools, you cannot say whether these results are the results of noise or sample selection, as opposed to truly reflecting the attitudes of your whole user population."  -Susan Farrell "28 Tips for Creating Great Qualitative Surveys"

-------------------Analytics 

You will miss things when you go live but that's ok.
A user is said to convert any time they take a measurable action you've defined as a goal of the site: sign up, buy now, make a reservation.

Some websites are focused solely on a single action.

Analytics refers to the collection and analysis of data on the actual usage of a website or application - or any quantifiable system - to understand how people are using it. Based on data from analytics, you can identify areas where your website is not as effective as you'd like it to be.
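A conversion rate, then, is just the fraction of sessions containing at least one goal event. A minimal sketch (the event names here are hypothetical, standing in for whatever goals you've defined):

```python
# Hypothetical goal events - whatever measurable actions you've defined
GOAL_EVENTS = {"signup", "purchase", "reservation"}

def conversion_rate(sessions):
    """Fraction of sessions that contain at least one goal event.
    Each session is a list of event names."""
    converted = sum(1 for events in sessions if GOAL_EVENTS & set(events))
    return converted / len(sessions)

sessions = [
    ["view", "view", "signup"],
    ["view"],
    ["view", "purchase"],
    ["view", "view"],
]
print(f"{conversion_rate(sessions):.0%}")  # → 50%
```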

See Clickstreams

Keep in mind two things when considering analytics: goals and learning. Without these, you might have a bunch of useless data or vanity metrics. Recall Crux's spiel about what they looked at. 

Hall recommends Google Analytics - GaryVee says GA are stupid.

Again, this is data. Data must be analyzed to be of any use.

Try split testing - patience and confidence are paramount here. Given a large enough sample size, you can (for the most part) trust the results.
Consider the side effects, though; if you aren't careful, you might be disrupting your site more than helping. Also, this is an incremental process, not a source of high-level strategic guidance: think search-engine landing page vs. global navigation.
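The "confidence" part above is where the statistics come in. A minimal sketch of how a split test is usually judged, using a standard two-proportion z-test (my example, not the book's; real tools like the one in your analytics suite do this for you):

```python
from math import sqrt, erf

def ab_significance(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test for a split test.
    conv_a/conv_b are conversion counts; n_a/n_b are visitors per variant.
    Returns (z statistic, two-sided p-value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis (no difference)
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

z, p = ab_significance(200, 10_000, 260, 10_000)
print(f"z={z:.2f}, p={p:.4f}")  # significant if p < 0.05
```

This is also why patience matters: with small samples the standard error dominates, and an apparent lift can be pure noise.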

Note, you can only do so much optimizing without an existing design system. 
