Everyone Is Lying To You: The Value of User Testing (feat. Ajay Waghray)
Hustle

Full episode transcript -

0:09

Tell a story. Yeah. Wait. Welcome! This is Hustle, a podcast by Funsize about mobile product design. I'm your host, Rick Messer, product designer at Funsize. I'm also joined by Anthony Armendariz, co-founder and experience director at Funsize. Today's episode is sponsored by Bench. Bench is the online bookkeeping service that pairs you with software and a professional bookkeeper to manage your books. If you're looking for help with your books, check out bench.co. Oh,

that's bench dot c-o. So today as a guest we have Ajay Waghray. Is it Ajay Waghray? Yeah, man. Welcome. Thanks. So, Ajay is a product manager at HomeAway. Can you give us just a little bit of your background and what you do over at HomeAway?

1:06

Yes, I'm the product manager for our mobile applications, so the iOS application and the Android application that we have at HomeAway. I've been there for almost three years, and essentially what I do is create the experiences that you see in those mobile applications. In product management, you focus on essentially the why and the what, so you're really thinking about what the product is supposed to do, what it looks like, how it feels, and then why you would do that. So essentially figuring out why we would go build feature X, and why we would do that first over something else. That's good. Yeah. So,

and HomeAway, for those of you who don't know, is a marketplace where you can find homes for when you go on vacation. Not just one room, you get the whole house for the whole family, and they're really nice houses in places like Hawaii and Canada, places

2:00

like that. Yeah, that's awesome. We love HomeAway. They're right down the road, too. Is Austin like your total home base, or is it based out of another city and this is just a... Yeah, the headquarters are here in Austin, but we're

2:15

a global company. We've made acquisitions around the world, so, you know, we have offices everywhere,

2:20

that's for sure. I mean, I knew y'all were down the street, but I

2:23

didn't know if it was, like, the headquarters here in Austin. Yeah, we're right across from the Whole Foods, which is where I spend a good

2:29

portion. It's just barely out of walking distance

2:35

from our office over to that intersection. It's dangerous being that close. It's also awesome,

2:40

but it's really cool. Yeah, cool. So, man, there were just so many things we wanted to talk to you about. We had to kind of narrow it down and choose from a shorter list, otherwise we could do this for six hours. Oh, yeah, like a special six-hour episode.

3:00

I have a job where a lot of people joke that I just walk around and think. So, you know, I walk around the office, and it's like, do you do anything? Yeah, I do. But I'm just, like, trying to clear my

3:12

head. Like, I'm doing it right. I take the paperwork to the engineers. I have people skills, you know. They're nice.

3:22

There's a nice little graphic about product management, like, "what you wish you did" and "what your mom thinks you do." I think "what you actually do" is, like, run around like a chicken with your head cut off. And I can't say that it doesn't sometimes feel like that. Exactly. Sometimes you do feel like that guy: take this thing.

3:43

Yeah, cool. Yeah, I guess. You know, we had to pick one sort of vertical of topics, and we went with user testing, very much because we're interested to hear about the way HomeAway does it. Maybe you could give us a breakdown: what is HomeAway's approach to user testing, at least on the product

4:04

that you work on there? Yeah. So, you know, I think it helps to take a step back and look at how people are, how people think about using products and using software, and how people think about giving other entities their opinions. And the thing I always think about when it comes to this sort of stuff is that everybody, in some way, is lying to you. The reason I say that is because, you know, there are all these things that happen in modern-day life that you're having to kind of filter out, or essentially cut through, to get to the truth. And I'll give you an example. There's a show I watch, I don't know if you guys watch it, Comedians in Cars Getting Coffee. Jerry Seinfeld. Yeah, Seinfeld has succeeded in making another show about nothing where he gets to do exactly what he wants, which is amazing. And there's this episode where he had Aziz Ansari on the show.

5:00

I saw that. You saw that? The big bus? Yeah.

5:02

He drives around in the big bus, which is hilarious in itself. And then they get to the diner and they're sitting there, and at one point they start talking about, you know, how when the waiter or waitress comes over and they're like, "Hey, how's the food? How's everything tasting over here?" Everybody

5:19

always says, "Yeah, it's great." Like when you're checking out at the grocery store: "Did you find everything all right?" "Yeah." I didn't even think about it. Yeah. Yeah. And they're

5:27

like, "All right, well, let us know if you need anything else," right, and that's it. But nobody ever really says...

5:33

a lot, actually. Yeah, it's a lie, right? And

5:37

it's a straight-up lie. But the reality of it, at some point they start talking about, well, yeah, really, you know, there's usually some sort of problem with it, right? And so at some point in the episode the waiter comes over and they're like, "So, how's everything in here?" "Oh, good. Thanks."

And then Aziz Ansari is like, "I thought you were going to tell him what you actually think." And he was like, "Oh, I didn't even think about it." "Yeah, so what do you actually think about the food?" "Well, it feels a little runny, the blueberries are weird, the coffee's pretty good." And then he asked, "Why did you do that? Do you even think it's worth your time telling him that?" And it was like, "No, no, no." So there's the reality of things, and then there are the things that people tell you, and that's really, I think, what you run up against when you think about

6:23

user testing. Yeah, I worry, too, with user testing that, you know, they're worried about performing, kind of, and they don't want to hurt your feelings, or they want you to feel like you did a good job, or they want to feel like they did a good job. "Did you find that section?" "Uh, okay." And they're like, well, not really, but they say, "Yeah, I got there. It's fine." Especially,

6:50

I mean, it kind of depends in a lot of ways on what you're working on, too. If it's, like, your passion project, for example, or, you know, a project that's just starting to get off the ground and they really believe in you, they're going to tell you the things that you in a lot of ways want to hear. But the reality could be very different. It could be awesome. There are plenty of people out there who will tell you the truth, but more often than not you're still missing the full picture. So I think the way that we think about it is: you have both qualitative data, which comes from interviews and things like that, and quantitative data about how your experience is doing with those

7:28

users or your customers. With real users? Yeah,

7:31

yeah. I mean, ideally, if you can get that. Putting those two things together and understanding them together is really where you start to get at the truth. So if you can see in your data that, hey, something's not converting through a checkout flow, for example, or somebody's not quite engaging with our application as we'd like, especially in this particular area, then you can take that and say, okay, well, I need to be able to observe somebody actually using the application, whether it's more formal, like a focus group or a usability study, or it's more informal, where you invite them in and essentially watch over their shoulder.
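
To make the quantitative side of that concrete, here is a minimal sketch of the kind of funnel check being described. The event data and step names are invented for illustration, not any real product's pipeline; the point is that a sharp drop at one step tells you where to aim the qualitative sessions.

```python
# Minimal sketch: find where users drop out of a checkout-style flow.
# The events list and step names below are hypothetical.

# Each event is (user_id, step_reached).
events = [
    ("u1", "view_listing"), ("u1", "start_checkout"), ("u1", "enter_payment"),
    ("u2", "view_listing"), ("u2", "start_checkout"),
    ("u3", "view_listing"),
    ("u4", "view_listing"), ("u4", "start_checkout"),
    ("u4", "enter_payment"), ("u4", "confirm_booking"),
]

funnel = ["view_listing", "start_checkout", "enter_payment", "confirm_booking"]

# Count distinct users who reached each step.
reached = {step: set() for step in funnel}
for user, step in events:
    reached[step].add(user)

previous = None
for step in funnel:
    count = len(reached[step])
    if previous:
        print(f"{step}: {count} users ({count / previous:.0%} of previous step)")
    else:
        print(f"{step}: {count} users")
    previous = count
```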

8:12

Does HomeAway do that, too? Invite users into the actual HomeAway office and, like, kind of do a focus group and stuff?

8:18

Yeah, we've done that. That particular environment tends to be a little bit more informal, mostly because, you know, these are customers that you've never met before, or users that you have met before, and we'll bring them in and let them start using either a new flow that we're working on or an existing product. I actually did this with a payment flow that we had, where we couldn't figure out why people weren't using it. I mean, the usage rates were way lower than the actual number of people in the application. So we brought somebody in, and I was just kind of sitting in the back, asking them to do specific tasks and not really giving too much guidance, and just kind of wanting to see what actually happens when they use this product. So I was watching her use the application, and at one point she's calculating this quote, and then she switches over to a calculator, and I was like, uh,

9:16

why did you do that? Exactly. And she was like, "Well, this application doesn't

9:21

pull from my rates exactly. Like, it pulls from some of my rates, but not all of them." And that light bulb went off, like, oh, that's why people aren't using this thing. Because we thought we did the work, we thought we made these calculations right, we thought the interface was great. So, like, why weren't people using it? And it just turned out to be this one crazy thing that

9:45

So she was, like, what, rates? Like, her rates as the person who owns a home that was on HomeAway, and it was missing, like, one extra little variable that she normally puts into her pricing or something?

9:58

Right. This is somebody that owns a property on HomeAway, and their rates are essentially the rate for the rental of the property, like the amount

10:06

of the property. So how else would you have found that, other than

10:11

having someone in there? Yeah, and that's, like, a really nice combination of, well, okay, we knew from the data that this person had a problem with this particular part of the application, and that a number of people did, and now we can sit in front of somebody and say, okay, well, why do you have a problem with this particular piece?

10:29

Because the company could have spent money developing new prototypes and new designs, right? Yes. And wasted that effort, time, and money before really

10:37

getting at the core of it. Right, right. You can really waste a lot of time without that kind of information. And then you start thinking about, well, okay, so how do we fix this? And that, when you're actually looking at a solve, I think is where it starts to get a bit more interesting. So when you start looking at solving something product-wise, with a new user interface or a tweak to an existing one or whatever, a lot of people go to usability studies, or they put, like, a clickable prototype in front of somebody, or a prototype on somebody's device, and actually watch people use it, gather that feedback, tweak the interface, and go back.

11:20

I think, before you go on, I have a couple of questions about stuff you just said. The first one is: the way you're describing testing a user flow, in the context you described, is awesome. I'm really curious, how do you test the more subjective things, like which design is more successful, which shade of blue is more successful? Because, you know, having worked in a product company myself, I've known designers to just spin their wheels day after day, week after week, month after month, just turning designs around, and people will look at it internally and say, "I like that," or "it's not quite right," and then we keep iterating. We were making decisions based on how we felt about it, right?

So how do you remove that "I like this" and figure out what's right? And how do you balance it out? Because not everyone is obviously going to like the same design. Yeah, that's a

12:09

good question. You know, when we started, initially, in a lot of cases we were doing a lot of usability testing on new interfaces to get feedback from people pretty quickly. And that doesn't necessarily mean you have to pull somebody into the office and videotape them. There are some other really nice tools you can use, like UserTesting.com. Throw something into a test and you get results back an hour later. There's an ad for UserTesting.com: usertesting.com! So those types of tools do a pretty good job of getting you feedback pretty quickly, qualitative feedback. But what you really need, I think, to settle those debates is quantitative feedback, and that you get from A/B testing.

Yeah, and we've been doing a lot of A/B testing on the websites lately. We're starting to do more mobile A/B testing, so we're really curious as to how a lot of that's going to play out, but it's been wildly successful on our websites. We've been able to settle a lot of debates and really push up our conversion numbers. I think that's really the true way to start getting at

13:08

what the things are that you can't argue with: the results. Yeah. That's like a presentation that Mike Monteiro did. It was about, I think it was called, "13 Mistakes That Designers Make in Meetings," and one of these 13 things he talks about is that when a designer is presenting work, it usually feels to the client like it's an emotional thing. Like, this is my work that I did for you. Yeah, I love it. Please, won't you love it? Right? And so he's saying, never do that. Lead with the actual data before you present. Have the A/B testing results: here's the design,

we designed it and tested it, we may have had four other ones, but this is the one that was the most successful, and why. Right? And then you just take a one-hour meeting and make it a 15-

13:49

minute meeting. Correct. Yeah, absolutely. It settles all kinds of debates. In fact, one of our Android engineers, who I think you guys know, he and I have been talking a lot about the side menu, the drawer. Yeah. And there's a good video by Luke Wroblewski about the hamburger menu. There's a segment in one of the videos he did at Google where they were talking about it, and a series of companies did a whole bunch of tests around the hamburger menu,

and they did these small little variations to just kind of see how their users would react to that particular menu. And they found that these tiny little variations would get them 13% more usage or 22% more usage. It was small stuff, like just putting an outline around those three lines, that hamburger. That's fascinating. Right, and that outline got 20%. But that's one specific organization. They also showed Booking.com in that video, and they ran the same test: no impact. So with A/B testing, you're not just looking at, well, what does all of humanity think about this particular interface or this particular design? It's: what do my users, the ones that are using this thing on a pretty regular basis, whether they're occasional users or active users, what do they think about this thing?

And are those results statistically significant enough for us to be able to make a conclusion based on that information? That's really the power of it: you get to settle those things, but you get to settle those things within the context of whatever you specifically are doing, not just, "Oh yeah, Facebook did this, so we're going to do that."
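
For a sense of what "statistically significant enough" can mean in practice, here is a small sketch of a two-proportion z-test, a common way to check an A/B result like the menu-icon variants described above. The traffic and tap counts are made up for illustration; they are not figures from HomeAway, Booking.com, or the video.

```python
# Sketch: is variant B's tap-through rate meaningfully higher than variant A's?
# All counts are hypothetical; this is a generic two-proportion z-test,
# not any particular company's tooling.
from math import sqrt, erf

def two_proportion_z_test(successes_a, n_a, successes_b, n_b):
    p_a, p_b = successes_a / n_a, successes_b / n_b
    pooled = (successes_a + successes_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the normal distribution.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    relative_lift = (p_b - p_a) / p_a
    return z, p_value, relative_lift

# A: plain three-line icon, B: the same icon with an outline (invented numbers).
z, p, lift = two_proportion_z_test(successes_a=900, n_a=20_000,
                                    successes_b=1_080, n_b=20_000)
print(f"z = {z:.2f}, p-value = {p:.4f}, relative lift = {lift:.1%}")
# A small p-value (commonly below 0.05) is what lets you treat the lift as real
# for your users rather than noise.
```

The same relative lift on a much smaller sample would not clear that bar, which is part of why a change that moves the needle for one company can come back as no impact for another.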

15:34

That's how user testing goes at a lot of places, I feel like, and we're guilty of it, because you assume that a certain amount of research went into it. If it made it to Facebook, surely it went through user testing, so now our solution has all the benefit of Facebook's user research.

15:55

Yeah, I want to find that video, because I watched it and I was like, man, oh man, we made all these assumptions. Because I think one of the original stories about Facebook's little hamburger menu implementation is that they used it for a technical reason. Back then they were still doing a hybrid app, where it was a native shell with web views, and they couldn't get around this one technical problem, so they solved it by changing the interface. That's where it came from. That's where it came from, pretty much, and everybody started using it. And you're like,

16:23

oh yeah, it caught on. I mean, I think we have a hamburger navigation for our responsive website at Funsize, just like everybody else. So there was something that you said a while back, Rick, about when you have an extra line. What would you call that? The quarter pounder? The Big Mac? The Big Mac. I was tired of every app we did having, like, three lines as the standard navigation, so I'm putting forth one, and we'll call it the Big Mac.

16:53

Yeah. Yeah, and I mean,

16:55

I forgot about that. You should test that. You could make one of the lines squiggly so it looks like a piece of lettuce.

17:01

Yeah. I mean, we're testing ours, for sure, especially after watching that and getting some of our A/B testing in place. There are some nice solutions now, too, for mobile specifically, that you can utilize that end up working out pretty well. We're testing a few of them and we'll pick one soon. Oh, yeah. Yeah,

17:21

man. Did you feel like, I mean, just from what you said about the way HomeAway runs and uses user testing results, is it basically pretty conclusive most of the time? Like, A works way better, or B works better for conversions? Or is it ever, like, 50/50? And then

17:42

how do you make that choice? Yeah, it doesn't always end up being conclusive. In some cases, yeah, you run the test, you get pretty high significance, a nice low p-value, and you're done. I mean, that's going to be it, and you run with it. Some tests are 50/50, and that's when you have to start testing variations and start sussing out through your data what is actually happening.

So, for example, we were doing some variations on the travel sites; we were doing some booking conversion testing. And when you look at booking conversion testing, you're looking at things like your booking conversion rate, but you're also looking at inquiry conversion rate, because for our sites, you know, inquiries are really important. That's essentially how you start a conversation with the supplier, the customer, really the person who owns the property. So it's really important for us to understand: are they doing the inquiry part of it and the booking conversion part of it? Are there more bookings, or is it just converting better with the same number of bookings, or how does that work? So when you get, like, a 50/50 split test, the next step is essentially looking at that data to see what changed.
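
When the headline metric comes back roughly 50/50, the "dig into the data" step often starts with simply computing the secondary metrics per variant. Here is a minimal sketch with invented numbers and metric names, not HomeAway data:

```python
# Sketch: a test that looks flat on bookings can still differ on inquiries.
# Every number below is made up for illustration.
results = {
    "A": {"visitors": 50_000, "inquiries": 4_000, "bookings": 1_000},
    "B": {"visitors": 50_000, "inquiries": 4_600, "bookings": 1_010},
}

for variant, r in results.items():
    inquiry_rate = r["inquiries"] / r["visitors"]
    booking_rate = r["bookings"] / r["visitors"]
    inquiry_to_booking = r["bookings"] / r["inquiries"]
    print(f"Variant {variant}: "
          f"inquiry rate {inquiry_rate:.1%}, "
          f"booking rate {booking_rate:.2%}, "
          f"inquiry-to-booking {inquiry_to_booking:.1%}")

# In this made-up example, bookings are essentially a wash, but B starts about
# 15% more conversations. That difference is what you would carry into the next
# round of variations and qualitative sessions.
```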

18:47

Still further down, in A and B, right? And take a look, since

18:51

you've got to dig a little further. And then it's also helpful to get some more qualitative feedback. So then you start putting the interfaces in front of people and saying, okay, well, what do you like about this one? What do you like about that one? Pick which good qualities and bad qualities you would want to either eliminate or keep, and then feed that into your next test, and test

19:12

that again. So HomeAway has a big audience and a big user base already, and HomeAway can benefit from taking a look at bigger numbers for these things, which can help you make more intelligent decisions. But for those that are just kind of trying to get something started and don't have the benefit of the big audience, like, say, oh, I don't know, a product manager at some place around here who wanted to start their own thing with an engineer or something, and work with a really cool studio here in Austin, Texas, to build it, where would they start with user testing to make some decisions

19:49

about the experience? Yeah. I mean, I think in either case you still have to have a pretty good idea of, strategically, where you want to go and what you're trying to elicit out of users. And then, you know, when you have a smaller audience, you can sort of keep that in mind when you're putting it in front of users. And I think in that particular scenario you have to be pretty selective about who you're actually going to talk to about it. Do they actually meet the sort of profile that you have in your mind for the ideal user? So,

20:24

yeah, I think that sort of leads me to another question that I have for you about user testing, and that is: can it go by the rule that the squeaky wheel gets the grease? Because your power users will be the ones that reach out to you, like, "Hey, this little thing doesn't work," and stuff like that, and they'll actually volunteer this information to you. But it doesn't mean that's really where the majority of your users are going. It also doesn't mean that they know the vision of the product, right? You know, I've done that before, too.

I'm guilty of that. Like, say I'm using a CRM and I'm looking for a way to keep track of contacts and deal flow and all that, and I reach out to this one company and I'm like, "You know, you're really missing these features," but that's not the product they're building, right? The problem was, I didn't understand it. It's kind of their fault for not clearly explaining that somehow within the app or the experience, you know? But yeah. I mean, I'm sure that, you know,

for some products that happens quite often. For HomeAway, maybe not so much, because it's clearly... right? Yeah, it's

21:32

more clear than others. You know, I think one really good resource for this particular type of discussion is The Lean Startup by Eric Ries. Have you guys read that?

21:42

I have conversations

21:45

about it all the time. Yeah. So, there are a couple of things to talk about. One is conversion A/B testing, which is a big part of this, so you can iterate and change direction if you need to. But the other thing is really another part about cohort analysis. If you're thinking about what a cohort analysis is, basically what you're trying to do is create a profile for the different user types that you have and the different engagement levels of those users. So somebody might have downloaded the app, but did they sign up for an account? Somebody that signed up for an account, well, did they actually log in? Did they log in once a year, once a month, once a week, once a day? So you can kind of understand who the people are that you're reaching.
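
A cohort or engagement breakdown in that sense can start very small: bucket users by whether they signed up and how often they come back. Here is a minimal sketch; the user records and thresholds are hypothetical, not a description of HomeAway's actual segments.

```python
# Sketch: bucket users by engagement level, in the spirit of the cohort
# analysis described above. Records and thresholds are hypothetical.
from collections import Counter

users = [
    {"id": "u1", "signed_up": True,  "logins_last_90_days": 0},
    {"id": "u2", "signed_up": True,  "logins_last_90_days": 2},
    {"id": "u3", "signed_up": False, "logins_last_90_days": 0},
    {"id": "u4", "signed_up": True,  "logins_last_90_days": 40},
]

def engagement_bucket(user):
    if not user["signed_up"]:
        return "downloaded_only"
    logins = user["logins_last_90_days"]
    if logins == 0:
        return "signed_up_never_logged_in"
    if logins >= 13:  # roughly weekly or better over 90 days
        return "weekly_or_more"
    return "occasional"

# Count how many users fall into each bucket.
print(Counter(engagement_bucket(u) for u in users))
```

From there you can compare how each bucket behaves, which is the "different engagement levels" point above.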

22:25

So do you, like, make user stories, and then sort of say, "We've got to do this for Karen" or something? You know, basically personify them? I've seen companies do that before, you know? Yeah, like, "She's a stay-at-home mom, she uses it for this." And I

22:39

think, I think, if things are small and you have a pretty good sense of at least the vision of who that person is, that's a good starting point. And sure, you can assign names to them, and personas and that kind of stuff. It's helpful to really understand which ones are your customer types, obviously, because then you can start, you know, understanding who those users are.

23:03

Especially when you've got a team of people working on the same problem, and you have to have that common area of understanding of

23:08

the problem, right? But I think the one thing with that is that you have to be pretty careful about designing for those people, or the personas that you've created for those types of people. Because if you have, like, ten different types of users that you're trying to serve, that's a lot. And what can end up happening in that scenario, even though they're all different, is you can end up fracturing your product to the point where it's unusable for anybody. Yeah, right. And that's kind of the risk you run with those personas. You know, even if you have, like, 15 or 20 of them, it's good that you did your research and you understand the different flavors that can happen,

but you shouldn't be ripping your product apart in order to make sure that, okay, Karen, John, Mike, Susie, and James are all happy. Yeah, because inevitably your product has to do a few things really, really well, and if it doesn't do those things really well, you're done. Like, there's nothing else to talk about after that. That's

24:11

a really interesting topic, too, especially since, you know, you're coming from a product company lens and we're on the service side. Back in the day, say the early two thousands, mid two thousands, agencies used to go deep in discovery; the process of creating personas would take weeks and cost lots of money, right? Yeah. I think there's been a big movement, at least in my circles, away from that, toward getting some stuff done and testing it instead of assuming, and also away from all those heavy deliverables and all that stuff, you know? Yeah. It doesn't always work. No, it doesn't.

You know, I think maybe we're a little bit different because we're used to working with product companies. But I still think there are a lot of designers, either in a company or brought into a company, or maybe working with a product company as a client, who start going down that path, you know? And so that really leads me to my main question. Because you're a product manager at HomeAway, you're the one that's involved in all this, right? You're the owner of a lot of this thinking, from all these different angles, and the decision making. So when you're working with a freelancer, a vendor, or something like that, say a company like Funsize,

we're, you know, we're designers too. We care about the same things and we're coming from our own lens. Whose responsibility is it to test, right? I mean, is it our responsibility to test our own work before we show it to you, kind of like the example I was giving earlier? Or do we just partner really closely with the product managers that we're working with, assuming that they've already done that? Yeah, I think it

25:47

has to be highly collaborative in that particular case. Like, you know, with vendors that we worked with in the past, we kind of made this mistake sometimes of holding it at sort of arm's length: we know everything, it'll be fine, we're the experts in that area, and it'll work. Okay, you'll get a good portion of it right. But if you really want to try to do something properly,

I think you do have to involve the stakeholders in that testing and iterating process. Because ultimately, whether you're starting small or you're big like us, you're probably going to have to change at some point or another. It could be a small change, it could be a huge diverging change, but things are going to change, and you're going to have to do that pretty quickly. So you write, like, a ten-page deliverable or whatever, and that might be right for the time being, but if you want to go get more people, or your business model changes or whatever, all of that becomes invalid, or maybe 30% of it stays, or whatever it is. You have to be able to change. So, you know, in that scenario,

when you're working with an external vendor in particular, if things change, you have to change pretty fast, right? And you're going to have to make sure that everybody on board understands what that change is. And it's really tough to do that as the only expert in the room, you know, because then you miss all of the subtleties, you miss all the conversations. I mean, there are going to be areas where you just can't do that, like you've got deadlines you've got to hit, or you've got to deliver whatever. But trying to keep everybody involved, internal or external, is,

I think, pretty important to making a good product that people are really going to enjoy. So it's something that you just kind of have to do. It's a pain in the ass, but

27:42

it's got to happen. Time and time again it's come down to basically the aha moment of: oh, if we're just transparent and respect what each other does, it usually yields a better result, a better product. Well, and then

28:01

that, you know, I think, I feel like that kind of starts at the beginning, right? You start with hiring people that you trust and respect and, like, you know they're going to do really good work. And so if you start there, then it's a lot easier to bring them into the process, because you're like, oh, well, I know these guys are going to do a really great job; bring them in and start iterating. But if you don't pick somebody that you can trust, or you're just trying to get something knocked out or whatever, then you end up in this really sort of weird no man's land of, like, oh, well, I'll just tell him what I think I know. And that information always ends up being incomplete. Interesting.

28:42

Yeah, I think that's probably a good way of putting it. And that pretty much rounds us out. Well, really? Oh, man. We blasted through that, man. Did you have more?

28:53

Oh, you know, I could talk for days

28:56

if you want. Yeah, this was going to be a six-hour talk, over smooth jazz. Oh, yeah? Yeah, Ken Burns style. I will ask you a question that piggybacks off what you were just talking about. You know, this may be slightly unrelated, but I think it's really important. This is how product companies work, and if you've worked in one you know that this is how it happens, you know? Like, things can change

in an hour, or a day, or a week, or a month. And for a lot of designers who have worked at agencies, that is scary, right? Right. Because if you're a designer who is working at a product design company for the first time, and every three months someone's ripping apart your design or you're changing it, you can feel like there's something wrong with your work. You shouldn't feel that way, because you're evolving it, right? That's one thing. But the other side is the service side.

If things are changing that rapidly, and you are testing and you are learning and you are iterating, how does an agency with these old-school models of fixed price and fixed deliverables create a product that is actually usable to you? I mean, we don't do that at Funsize. I know the answer from my perspective, but I'm kind of curious how you perceive that. Yeah,

30:22

I think it's kind of twofold. It's a really good question. I think in my head it's twofold. One is understanding that things are going to change. That's just the way it is, and it's not even just products, it's life, man. Things are just going to change, and you kind of have to be able to roll with the punches. Like, this idea that, you know, a ten-page deliverable with, like, a full mock-up set and a style guide, no less, you know, it's going to change. Sorry.

30:53

Like, that's what it is. It's like designers are compelled to want to have these things written in stone, you know, like the style guide, and just have all of that. It's how we know when we're done, almost, you know? And it's just not realistic. You're not going to think of every possible scenario in one go, you know? That's why it's important to just iterate, and plan to iterate.

31:20

And I think the other half of that answer, though, is also on the party that's asking for the deliverable. The party that's asking for the deliverable should understand that they should be getting a deliverable that changes with them. So if things change, you have a framework under which you can change also. And if you have that framework as part of your deliverable, then when things change or whatever, you can use that framework to change things. But if you don't have that... you know, Bootstrap is a pretty popular thing on the web, and we're kind of getting more toward that sort of Bootstrap idea for mobile, too. If you have this idea of a bootstrap and you're designing for that sort of paradigm,

you can make those changes a lot faster, and you can change with the environment and change with the business a lot quicker and provide a lot more value. And so the onus is kind of on both parties, I think, to think that way: okay, well, here's the screen that we need the most help with, but if things change, we've got this set of UI assets and elements that we can use to change if we need to. And, you know, if the other party is around in a consultative capacity, that would be awesome, too. But if you don't have the money, at least you have that, and that makes it a lot easier. Absolutely.

32:46

Cool, cool. Yeah, thanks. That's a good question, and thanks for throwing that in. Yeah, it's like, this guy's thinking over here. So, Ajay, how can people follow you? Should they, on Twitter,

32:59

or... Yeah, sure, you can follow me on Twitter. My handle on Twitter is my full name, Ajay Waghray: A-J-A-Y-W-A-G-H-R-A-Y. I've also been posting on Medium

33:11

recently. Yeah, some good stuff, even about product management, on Medium. Yeah, that'll

33:16

be cool. Yeah, that's a pretty decent follow. I'll be trying to write on that account probably at least once a week or so.

33:25

Once a week? Wow. You heard it here. I'm trying.

33:31

I'm trying. I'm trying. But, I mean, the nice thing about Medium, we were talking about this earlier, the nice thing about Medium is that I don't want to write a novel every time, right? So it's nice and digestible,

33:40

right. That's how I like to write them when I do write them. And that's honestly how I'd like to read them anyway.

33:47

Yeah, little chunks, like the short articles, for sure.

33:53

And around town, so you can follow me in person. I'm six foot three. Hey, you're that guy! Awesome. Okay, so, guys, request topics for this podcast or just respond to this episode on Twitter. You can tweet to @funsize. Rate this podcast on iTunes, and please do subscribe. That's it for now. Thanks very much, guys. Today's episode is sponsored by bench.co. Let's face it, bookkeeping is never fun, but it's something you can't escape. Bench is

the online bookkeeping service that does your bookkeeping for you. When you sign up for Bench, you're paired with your own dedicated human bookkeeper, and you collaborate with that bookkeeper using the Bench app. It's everything you need to cross bookkeeping off your list forever. If bookkeeping is taking too much of your time and you just want it done, check out www.bench.co. That's bench dot c-o. They've got you covered. Hustle is brought to you by Funsize, a digital product design agency in Austin, Texas, that creates delightful, innovative products for mobile, web, and beyond.
