Lean on Research
Hustle

Full episode transcript -

0:06

Hey, welcome to Hustle, a podcast about products. I'm Rick Messer, and I'm here with my co-host, Anthony Armendariz. Thanks for having me. Yeah, my, my guest who was never here, Anthony. Just kidding. We both kind of do this thing together, and we're both here today with some interesting thoughts around research.

0:27

Yeah, if you hear sawing happening, it's renovation work that's happening downstairs. We decided to just go ahead and

0:36

record. Yeah, it'll just be like this natural part of this episode, right? So the thing that we've been thinking about lately, amongst many, many other things (it was actually really hard to choose a topic today, there's so much going on), but something that has been a subject at hand, I would say, with the whole Lean UX and agile development methodology and all these sorts of things, is that there's a portion of building digital products, and I suppose any type of product, that needs to go into the work that's done, and that is research. Research obviously validates and invalidates ideas. It lets people know,

like, if this is a good idea or not, so it's a great place to start. But I think the problem with research is knowing exactly how much of it to do or not do. Sometimes the problem with not having enough is that you're making a lot of assumptions. You're making decisions about a design or about a product that's being used by a lot of users, and you have no idea if that idea just came out of your brain, um, and is in a lot of other people's brains or not. The other side of that is you could pour a lot of time and resources into research and do a lot of theoretical stuff to insist, hey, users are going to think this, and then you build it and find out that's not the case. Yeah, um, or you just spend way too much money, time, and energy on the research process, and, like, sometimes you just gotta go for it. But, you know, how are you supposed to know?

2:23

It's hard. Um, I am no expert at research, but I think you and I would both agree that the most important kind of research is user research. I think it depends on who you're dealing with, though, you know. Um, about a third of our clients are companies that have a validated product in the marketplace. And to me, I think what really matters is, how are the people using it? What does the data show? Then, on the far other end of the spectrum, you have early-stage startups who are sometimes creating something because they want something, and there are no users yet. Yeah, so I don't know,

like on that side, I think it's important to get the ideas, whether they're, you know, conceptual designs or paper prototypes or functioning prototypes, in front of people as soon as possible. Because I don't really know if you really understand anything until you see someone using the product. At least, at

3:26

least that. Yeah, I totally agree with that, and this is just such a tough subject, because you want to just get something out there, you know? But at some point, if you don't do any research at all, I mean, that just sort of feels irresponsible or something, you know? It's hard being, like, a designer, somebody that is an expert in user experience or, you know, interaction design and that sort of thing,

but then, you know, sort of standing on top of your design decisions. And even if you have, like, a really good idea or reason for why you made that choice, you still could be proved wrong once the product is actually in the user's hands.

4:10

Um, yeah, they may even use the product completely differently.

4:13

Yeah, do something unexpected. Yeah, that's something I love about just getting a product out there with real users: when they, like, hack the system, you know, to get it to work the way that they want, in a way that you never even thought of. It's, like, really cool to see people just sort of lean into that. Like, okay, users are using it this way; instead of resisting users, you know, using your product for something, they're sort of creating a feature by hacking it. That's an interesting way to go.

4:42

I don't know, those of you that are listening, how many of you, you know, what environments you come from. But I think the hardest thing for me to navigate on the business side is how to answer questions when they come up from prospective clients or current clients, when they ask very direct questions about what does your research process look like, what can we expect. And, you know, that's from the client side. And then you have questions come up from the design team, like, oh, what are we expected to do? Like, what does the client expect? What should we run? There's no real solid answer to this. Um,

in my experience, before doing mobile products, usually someone would hire an agency and expect the agency to really drive everything: all the research, create the business and use cases, design the work, you know, code it, test it, ship it, hand it over. Um, those of you that are working with product companies like ours are probably working with product managers, which we discussed in a previous episode. A lot of these product managers come from UX backgrounds, and in their organizations they are responsible for the strategy, the user research, talking to customers, user testing, and all that. So it's hard for us, because one of the reasons why we're able to work with product companies so well is because we have a very slim,

lean design team. We have one or two designers and a design leader that's gonna work on your project, and we keep the price point in a way that makes sense for these companies. But we don't have, like, a user research discipline or a user researcher on that team. It would arguably be ideal to have it, but then it would put us in a position where these companies wouldn't be able to afford to hire us. So, to some extent, with some of these customers that we work with, we have to blindly trust that they're testing the work that we're doing, because we're getting the feedback from them. Now, with these earlier-stage startups, I think the need is quite different, because Joe Early-Stage Startup Guy, um, building a new business, needs a lot of help to make sure that he's building the right company, the right service, the right products, and in the right order. So, you know, I don't know. I, um, I don't know how to really tackle that, you know.

7:14

That's a good point. It's a good, uh, well, it's not good, it's just a tough spot to be in. Um, so another piece of this puzzle is, who's the one doing the research? Like, how involved, if at all, should design be in the research? Because I'm of the opinion, like, I'm almost of the opinion that design should have,

like, very little if nothing to do with research, because they sort of need to be abstracted from the process so that they can just, like, use the facts to judge what they learned. However, looking at, like, recorded user interviews as the users use the product is useful, to just sort of understand the emotions, if any, that the user... their reactions, I guess. Emotions, maybe

8:03

not. But I agree, you know; a lot of the recent articles, a lot of the things that I've been thinking about, go to your point exactly. I think maybe the most important thing for us to do as designers is actually watch someone use our work. Because, you know, we also talked about this last week: I think a lot of times you need to design from your gut, but you won't really understand how someone else is gonna react to it until you can see them use it, or see their facial reactions.

8:36

Or that might completely change your opinion of that design choice or whatever. Like, seeing that, you know, you could be really stoked about it or whatever, but then seeing it break, it's really, you know, not in, like, a bad way; it's sort of encouraging, like, oh, man, if we just put this button up at the top, people wouldn't have such a big problem finding it, whatever, you know? Yeah,

8:59

I think that's, um, maybe it's informally baked into what we do. I mean, we have lots of people on different teams looking at other teams' work. But even if we don't get to see our clients' user tests, you know, I think one thing that we could do right now that would make us all, you know, one to ten percent better would probably be to have some way of, you know, seeing strangers use our work.

9:26

You know, if they're strangers, right? Not the guy that sits next to you, right? I mean, that's what I mean. Okay, first of all, last episode, the one we had with Sam Kapila and, um, Jim Jordan, I thought it was pretty interesting that we made the point that any design decision that you make is actually based on having learned something about human behavior in the past.

And that is, in a sense, uh, making a design decision based on some sort of user research, albeit extremely informal. But then there's also, you know, just, like, a step above that, which is like, hey, man, what do you think of this? You know, somebody with fresh eyes, that sort of thing. That's not user research, but it's taking a step into, you know, what you're trying to accomplish with user research.

But then there's, you know, the stark contrast of that, which is like, look, we've conducted this research in labs, and, like, the mouse ate the cheese, and this one didn't eat the cheese, and this mouse got

10:35

chickenpox. For some reason, the data is showing that people are holding their

10:39

mobile phones with their hands, and we were confounded by this. Yeah, so, you know, research like that, the way it comes across, like, I don't know how I feel about it anymore. Like, when I get, like, a big research packet and stuff, it just sort of seems wasteful to me in some way. I could be wrong, but it just feels wasteful.

11:03

I by no means want to discount the work that researchers do. But I agree, I think. And that's also a reason why I think the expectations of agencies are changing. Because if you have a limited, you know, design runway, what do you want to spend it on? Do you want to spend, you know, 75% of your time on research and not really get anywhere with design? Or do you want to spend the majority of your time on design with just enough research and get something that can be validated?

11:28

Yeah, like "just enough research." That's like a book, a book title, right? So, yeah, it was, like, an A Book

11:33

Apart. Yeah, I haven't read it yet, but I bought it for the library. And it's, I mean, just the title of it alone, I think, aligns with the way that I think about it. You know, I think most of our customers would probably be really pissed off if we spent most of our time doing research instead of, like, getting to the work.

11:49

There are so many other aspects of the work that we need to do besides, like, doing a huge research portion of it. Like, I guess, I guess, you know, you need to sort of separate research from user testing, because research is sort of like a bigger umbrella and user testing sits beneath it, you know. Because, like, with research, you also have things like demographics, that's research, and geography goes into that. There will be some things like user personas that are constructed, and these user personas represent these, you know, five groups of people, the types of users that we're trying to build this product for.

And this one, Todd, you know, from accounting, is using this software, and that type of user persona doesn't really, you know, jive with this type of use case. And, like, how much does this user use this type of feature? Oh, they're low on this one: 80% this feature, 20% that. You know, it gets really broken down. And while that is,

it's actually kind of fascinating to look at, I just don't know that you can make a ton of product decisions that are, you know, ultimately valuable based on those, because it also, like, almost boxes you in from trying a lot of things, you know, like discovering something outside of those projected users that are going to be using the software. Yeah.

13:22

Think about it. I mean, think about how far you could get with design in four weeks versus spending four weeks on, you know, some documents that may never really be looked at. I mean, it sucks that some of this stuff doesn't get looked at, but I've definitely had clients in the past that told me, like, whoa, whoa, hold on now, we know our business. Don't go creating these, like, high-fidelity personas and research documents. Sure, like, do the research a bit, but don't make it so formal; we already know our business,

13:55

you know? Yeah, I would think, too, that possibly a client like that is a little bit smaller of a company, like, maybe not an enterprise company. Because when you have users in the millions and stuff, like, I think there's a lot at stake. And so they want to hire research to sort of, like, make them feel better about the decisions that they're making, so that if and when it wins, or if it fails or something: look, we made this decision based on this research, and that's, you know, that's valid. So, you know, there's research, and then there's, like, user testing specifically, which is under the umbrella of, you know, the umbrella of research. But I guess then there's

14:42

real data, right? Like, does that checkout flow work,

14:46

like, when it's actually launched and you're looking at the report of, like,

14:51

users. Does design option A improve conversion more than design option B?

14:57

The only thing that you can learn from that stuff, though, mostly, is that, like, it's working or it's not working. If it's not working, then the answer is still not clear. Like, why isn't it working? You know, you still have to, like, figure that part out. And you always hear, like, oh, you know, we just swapped the hero image with the signup form, and that solved it, like, you know, it changed our click-through rate from, like, you know,

3% to, like, 85% or something like that. And I think those things do happen. But day to day it's a little different, like, when you're faced with the decision to change something that's not working, I think it's less clear than that. So I think, you know, ultimately, you know, where we started and where we are now in this conversation is that it is valuable, but it's just not everything, you know.
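A minimal sketch of how the "does option A convert better than option B" question above could be checked in practice, using a standard two-proportion z-test. The counts, names, and numbers below are hypothetical placeholders, not figures from the episode.

```python
from statistics import NormalDist

def two_proportion_z_test(conversions_a, visitors_a, conversions_b, visitors_b):
    """Rough check of whether option B converts better than option A.

    Uses a pooled two-proportion z-test; all inputs are raw counts.
    """
    rate_a = conversions_a / visitors_a
    rate_b = conversions_b / visitors_b
    pooled = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    std_err = (pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b)) ** 0.5
    z = (rate_b - rate_a) / std_err
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))  # two-sided p-value
    return rate_a, rate_b, z, p_value

# Hypothetical numbers: A converts 30 of 1,000 visitors, B converts 45 of 1,000.
rate_a, rate_b, z, p = two_proportion_z_test(30, 1000, 45, 1000)
print(f"A: {rate_a:.1%}, B: {rate_b:.1%}, z = {z:.2f}, p = {p:.3f}")
# A small p-value suggests B really does convert better; as noted above,
# it still tells you only that something works, not why.
```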

15:52

Yeah, I mean, I think you first have to identify: is there someone on my team that is, or should be, responsible for doing this? If it's your client, then, you know, that client has a product manager or a design research team. Get the research that you need to, you know, to skim and make a decision on. Don't go doubling efforts just because you're an agency and you wanna, you know, prove you're an expert. I mean, the inspiration for this episode was really knowing where to lean.

I think, you know, it's not always a phase at the beginning of the project, right, that you do. It's like, all right, these two weeks, what do we need to learn, design, test, validate? Or maybe it happens sporadically. Knowing when to add value through design execution, knowing when to add value through some kind of, some form of research, knowing who on your team is responsible for it.

16:50

Yeah. It's weird, because it's, like, so important, but it's almost like a gut-level thing, like, I think I got enough here, you know, like, let's just dive into this. And, I don't know, maybe that's wrong. Um, but the other thing that I think is interesting, I mean, like, how many clients, like...

is that part of, like, the first few things that they ask, if they ask you at all? Like, what does user testing look like? Are they assuming that, like, we have this, like, super strict discipline that we, you know, plan down to a T? Or do they not care? Or do they, yeah, value it?

17:27

Most of the people that we talk with don't ask those questions. I think we tend to drive those questions, like, okay, how much exposure are we gonna have to your product team, your users? Will you let us talk to your users? Will you let us see data if we ask for it? Will you let us do our own tests? Some people are very open to that, especially when you're working with teams that really value that kind of work. Some teams aren't, and they, you know, tell us very directly, like, no, no, no, no.

We're gonna own that; we'll provide, you know, like, our product manager will drive the thing. There have been a couple of prospects recently, enterprise clients, that asked for very detailed responses about what our research process looks like in general, and honestly, the answer is we don't have one, you know. I mean, if you hire Fun Size, you're not getting a formal, like, UX research team. You're getting a bunch of people that have a buttload of experience with a wide range of, you know, hundreds of products with all different kinds of companies, that are gonna, you know, make smart decisions and collaborate on the,

you know, the decision points with the client. We're not a research team where, you know, we come in thinking we're gonna own everything and we're gonna provide you the research and the plan and then we're just gonna do it. It's more collaborative. So for those people that ask us that, we're not the right fit, because we're not gonna excel in that environment. I think, I mean, I think in those situations, if someone really needed to do formal research, we would work with a third-party vendor that

19:11

specializes in that. Yeah. And then we could judge, you know, how well that informs the work. I don't know, man. I'm split on it all the time. Like, I want it to inform our decisions a lot more. But then when it comes in, I feel like we're sort of like, ah, I start questioning the validity of it. I'm like, how, like,

impartial was this? Was this user led down this path or whatever, like, you know? And then I want to, like, test the objectivity of the test, sort of, to make sure. Because, I don't know, I have a hard time just going, oh, research said it's okay. You know, I just have a hard time believing it without questioning

19:58

it. I don't, I don't like those user research labs with a facilitator, because what I've seen more often than not is actually someone being led.

20:07

It's leading, right? Yeah. Um,

20:9

so I'd rather just get video feeds of someone using it without a facilitator,

20:13

and... Yeah, I always get really hung up on this. This is actually a good segue into something I wanted to bring up. Um, two things. So first off, I attended a webinar that Clark Wimberly and InVision kind of put on; it's like an InVision blog thing or whatever, and you could kind of watch it live, and the webinar, you can still find it on InVision's blog. It's called UX Playbook: Real World User Exercises, and he goes through, like, how he constructs user personas and how he uses that to inform the UX. It's pretty cool. I really liked it, because it was, like, a good balance of using informed decisions to make design decisions. But it's not, like, super,

you know, tedious or, like, too specific. It was a pretty good balance. I liked it. But, um, halfway through, you know, at the end, I kind of had this thing that had been bugging me about user research, user testing, and whatnot. Um, last year we went to a design conference in Utah called Front. And while we were there, there was a talk... I hate that I can't remember this guy's name right now,

but he did a talk on user testing, and he brought up this really interesting thing called the Hawthorne effect. So what this was, was, like, back in, like, I don't know, I want to say, like, the twenties or thirties, there were these factory workers, and they were trying to improve productivity. And, uh, they did all these tests: change the lighting, maybe that'll make them work better. Well, uh,

you know, we'll start the shift later, maybe that'll make them work better, or whatever, and they start doing all these things. And what they found was that no matter what they did, everybody worked better. They were trying to examine it, like, what? Like, wow, all of these things work, this is great. But then, eventually, through talking to one of the workers or whatever, they were asking, like, the productivity seems to be going up and up, what's the deal? And they're like, well,

there's all these people in lab coats, like, watching us and, like, writing stuff on a clipboard, so we just thought we were all about to be fired. So, like, no matter what they did with the lighting, the workers didn't care about that. They were just like, someone's watching us, I'm gonna behave differently. So the understanding is that the Hawthorne effect is that the very nature of doing a test on the user at all influences the results. Like, no matter what, you can't get real data until you have real users. Um, so, like, he used the idea of,

like, just the act of performing the test in any way sort of alters the results. Like, if you were to take a tire gauge to test the PSI in your tire, you have to let a little bit of air out in order to even test it. So it'll tell you what the PSI is now that you've let the air out a little bit, but it doesn't give you a true reading of what it was before you let that air out. Anyway, so it was just, like, basically, user testing affects the results of the user experience. So how do we even, like, I think, you know, like, how do we even use this in a valid way? And I actually asked that question in the webinar, because it was just killing me.

It's like, oh, my gosh, like, we can't trust anything, you know. And, um, you know, Clark had a really good answer for me. He's like, yeah, take it with a grain of salt. It's like, yeah, it's not, you know, it's not totally 100% conclusive,

you know, but you can learn some things from it. I thought that was a pretty good answer, and it really kind of gave me a good attitude for how to use it, because it is helpful and informative. But, yeah, sometimes it's too much

23:52

of that. Well, here's a story to add to that; this one's more funny. Um, when I went to go work at Evernote, my brother was already working there. He's like, dude, one of the first things that he showed me that first week was this video of a user test that they did. So the product was Skitch. And those of you that have used Skitch know that there's an iPad and an iPhone

24:13

version. Skitch

24:14

not Sketch. Yeah, Skitch by Evernote, the annotation tool. One of the features is the ability to take a photo of something and then annotate on top of it. So instead of the normal, like, online usertesting.com stuff they were doing at the time, previously they were actually using labs. And so they had iPads set up in these labs, mounted on stands, for the people that were coming in to test it, to use the app. And they wrote out the scripts, and there was no facilitator. These people were sent in there by themselves with these scripts, and, well, I guess they were trying to test time to execute tasks. So there was one guy, and he has a script in front of him. The script said, okay, take a photo of your chair,

then, ah, annotate an arrow on top of it and then email it to your friend. The guy was on the audio recording for 30 minutes and, like, could not figure out how to take a photo of the chair. And he started getting so angry that he was screaming,

25:19

Where's the chair? Where is the chair? I don't understand. It turns out the iPad was glued, or, like, fixed to the stand, so he couldn't pick the iPad up. Yeah, and so that test was sort of a loss. They

25:34

made all these memes.

25:35

Yeah, man, that's great. Uh, useless. Yeah, that guy probably felt like he was in this, like, social experiment, like where he's locked in this room and, like, it's a puzzle, man. Yeah, it's impossible. You have to ask yourself things like that. So yeah, I love it. Yeah, great.

26:00

Also, I think that's a, that's a good topic, I mean, a good point. And yeah, on the research: just, you know, I also just think, if you don't know how to do that stuff, don't pretend that you do. Just tell your client or stakeholder, I don't know how to do this, you should talk to someone that does.

26:15

Yeah, we should, you know, if that's important to you, yeah, we should hire it out or something. Yeah, cool. Well, I think we're about out of time for today. Um, Anthony, since we don't have a guest today, it's just you and me, how can people find you on Twitter and whatnot?

26:34

Oh, anyone can find me on Twitter; the handle is spelled M-E-N-T-W-A-N. I'd love to hear any thoughts you have about Fun Size or the podcast, or talk shit about Rick, whatever.

26:46

Yeah, definitely, definitely. Please, no more conversation about hot dogs and sandwiches. I can't. I just can't. Everybody knows it's not a sandwich. It's over. If you, if you want to tweet to me about any of that, I'm at Rick Messer, and we also have at Fun Size and at Hustle Cast. I want to do the tweets.

27:10

That would be cool. It would be great to get some local guests on the show, more local guests that could come into the studio and record with us. So if you have some interesting topic ideas, shoot them our way and let us know.

27:22

It's easier than Skype, buddies. Sounds, sounds better, too. Okay, cool. Thanks, everybody. This episode is brought to you by The Iron Yard. The Iron Yard in Austin is now offering a 12-week intensive program in user interface design. The Iron Yard will teach you the tools and skills you need to become a professional interface designer and then help you find a job. If you're interested in launching a new career in tech and design, visit their website, theironyard.com. Scholarships are available for the summer semester in user interface design at The Iron Yard. Life's too short for the wrong career. Hustle is brought to you by Fun Size, a digital product design agency in Austin, Texas, that creates delightful, innovative products for mobile and web. Visit us on Twitter at Fun Size or visit our website at funsize.co.
