Transcript 0:00 [upbeat music] Welcome back to Tasteland. I am your co-host, Francis Zierer. And I'm Daisy Alioto. And we should address the music you would have just heard. Um, it... If you're a longtime listener, 0:18 it changed, uh, I think like three episodes ago. Mm. Um, we got priced out of- Yeah... our lender network. As it turns out, we don't have MJ Lenderman money- Mm... post MJ Lenderman, uh, fame. 0:32 So- Our new song, arguably, arguably better, some would say, uh- Um... it is Nightlife- It's different... by... It's different. Uh, it is... This is... 0:41 Don't, don't, don't insult my friend, Nikiso, um, who- I'm not insulting him. I think it's... I think... I'm not saying it's better or worse. I, I actually really like it. And I'll tell you- It's good. No... 0:53 why in a second. Okay. Anyways, um, Nightlife by Chillers, which is the project of my friend, Nikiso Peralta. Um, so thank you, Nikiso, for- Yeah... 1:03 making such a great song. We're putting you on something new and psychedelic, and it has resonance with the Nightlife review, which is one of Dirt's- Mm... pop-up franchises that we do. 1:11 And that really endeared me to this song. And our first episode was with Josh of Night Gallery, the first episode we used it. Yeah. That's also true. So- Yeah... I've been listening to it. 1:22 I added it to my Spotify favorites. Me too. It's a great song. It's great. Mm-hmm. Uh, you can listen to the full thing on Spotify or Apple Music or wherever- Wherever books are sold... you get your music. Yeah. 1:32 Well, we will be talking about books and music on this episode because we are talking to my friend, Jad Esber, and, um, I actually had to check whether I... 1:43 You know, with Francis, whether I should disclose that Jad is actually an angel investor in Dirt. He was one of our earliest supporters. 1:49 Um, he's also the co-founder and CEO of koodos, and he's affiliated with the Berkman Klein Center for Internet and Society at Harvard- And-... 
which I did not say with a Boston accent. Yeah. You... 2:02 The Bo- [laughs] The, the Boston accent really came out for one word. College. College. [laughs] It was bad. I, I listened to it back and I- [laughs] I, I don't have an excuse. What can I say? 2:17 There, there's no need for an excuse. You're, you're an asshole. Um- You know who else is an asshole? Tom Brady. Jeremy Strong. Oh, true. That was... Yeah. That was, that was one memorable moment. I did... 2:30 You know, I don't, I don't watch football but I, I did... I, I, you know, I'll do the Super Bowl- I don't watch football. I'm not like other guys. Look, no, like, yeah, I watch European f- um... 2:39 Anyways, no, we don't need to get annoying there. Mm. But I did, I did watch it. It was, it was enjoyable. Exciting game. Um, I feel like I've never been... I don't wa- I, I don't wanna... 2:47 This is like a dead horse I don't wanna beat, but I've never been less interested in the ads. But more than, like, like recent years or whatever you wanna, 2:56 however you wanna frame it there, I was thinking how, like maybe I'm just... Like 10 years ago, when I was like, 3:02 you know, a bright-eyed, bushy-tailed college student who, like, wanted to, like, analyze ads because I was... That's what I did in class or whatever, um, I was more interested. 3:14 Maybe that's part of it, but I think now it's just there's more constant ads, and, like, entertainment is advertising and it's, it's all too blurry. 3:22 So, like, the f- like, it's just the Super Bowl ad as a concept is all the less interesting because we're just ad-ed out. You've also seen everything. True. Um, 3:35 Ben worked on an email for one of the accounts at his agency. That's not his primary account, but he helped them out- Mm... like one time, and they had a- Like a pitch... Super Bowl ad. 3:44 So he was like, "Am I allowed to say ad one?" I'm like, "Yeah, you're, you're [laughs] you're clearly an MVP." Um- You rearranged some punctuation. So on... 
Oh, so the Jeremy Strong thing. Okay. So he- Yeah, Dunkin... 3:56 was in a Dunkin' Donuts ad. So I saw a, a snippet- Isn't it just Dunkin now? Question. Actual question. Um, well, I don't know. Anyways, okay, sorry. Anyways. My Constantinople. Um, 4:10 so okay, so I saw a clip from an interview that he did where he talks about, um, turning his father's, uh, Dunkin order into a "one if by land, two if by sea" reference. 4:22 And, um, similar to my epigenetic- 'Cause it was one if by cream, two if by sugar. Was it? Something like that. Yeah. 4:28 Like similar to my epigenetic trauma from the Big Dig, that did activate something very deep in my brain. Um, we also used to on Patriots' Day, my mom would make us drive around and roll- On what?... down the windows. 4:42 Patriots' Day. What's that? Um, Patriots' Day is, I believe, the- Is it related to the football team? No. Well, everything's related to the football team. [laughs] Commemorates the day of Paul Revere's ride. 4:55 My mom would make us drive around and roll down the windows and yell, "The British are coming," out the windows. That's kinda fun. That- Just-... you know, just excited my deep love of history. Yeah. 5:06 Now is that- And making a public sp- spectacle... the Massachusetts state holiday? Yeah. Mm-hmm. Okay. Did you have a favorite Super Bowl ad or was it just all- No... a total wash? No, no. No. I... Not at all. 5:18 Th- you know, the one... You know what was memorable? Um, I had to look up what the term pick six means, but when, uh... I don't even know. 5:27 Cooper DeJean, Cooper DeJean, whatever his name is, um, that was, that was beau- that was my, [laughs] that was my Super Bowl ad. This is, this is 30. This is 30: not having a favorite 5:39 Super Bowl ad but having a favorite Super Bowl play. Much more, much more interesting. Well, okay, speaking of Dunkin' Donuts, a consumer experience that I had, uh, recently is I went to Walmart and- Mm... 
5:54 I saw the Native, uh, Dunkin' Donuts collaboration aisle. Did you buy it? You did buy it. I literally gasped out loud when I saw it. I did a [gasps]. And I bought the Boston Cream body wash. 6:09 We're gonna leave that there because Jad is here. So you can live with that. I can live with that. Hello, Jad. For sure. Let him in. [upbeat music] Thank you for joining us. Yeah, no, thanks for having me. 6:25 Sorry I'm a couple minutes late. Oh, no, not at all. Um, Daisy was just telling me about how she bought Dunkin' Donuts Native body wash at Walmart, so... Yep. I did that, and I smell like a donut right now. 6:39 [laughs] So I have no regrets. Does it smell-- Does this... Is it donut sm- smell? [chuckles] I mean, it smells about, like, what you would expect. Like- Sugar? 6:47 I mean, is it, like, really sweet or like what's the- Yeah, I mean, it's a Boston cream, but it kinda just smells like the filling. Mm-hmm. You know what I mean? 6:53 I feel like they didn't fully capture the fact that there's, like, a little chocolate involved. Um, it was that or, like, strawberry sprinkles, but as we know, I'm from Massachusetts, so- [chuckles]... 7:04 I had to go with my roots there. Their Super Bowl ad was, uh, was aggressive and interesting. I don't know if you guys saw it. That's what we were talking about. Yeah. Yeah. It was one of the more memorable ones. 7:14 A giant restaurant. Yeah. Yeah. No, I loved it. I loved the, like, little Starbucks, like, skirmish going on. Um- Jad, how much time... 7:22 I know that you're affiliated with Harvard, but how much time have you spent, like, did you ever s- live in Cambridge? I did. I was there for three years, yeah. Okay. Yeah, yeah. 7:31 So you know your way around a Dunkin' Donuts. Yeah, I do. Um- [laughs] I do, yeah. [laughs] Um, I think I've retired from, like, Dunkin' Donuts coffee days, but, um- Mm... but yeah, my, my student days definitely, yeah. 
7:45 I mean, I did-- I was looking at your website where you have, like, your top three coffee shops in, like, 10 different cities listed, so you know, more, you're more Devocion than, than Dunkin'. Yes. 7:56 I think these days, for sure. [chuckles] Yeah, I try to be. It's, uh... Yeah, I'm cultivating my coffee tastes still, but yeah, trying. 8:02 I just redid my personal website, and now I feel like I have to redo it again because I didn't even think- To-... of putting my top 10 coffee shops. 8:10 [laughs] Um, well, the big joke about Dunkin' Donuts is it, like, comes out different every time. Like, there's no consistency- Yeah... in the quality of the product that you're getting. 8:20 But, you know, we all are just rats in a Skinner box. And I think now that everything in society is gambling, maybe that's part of the appeal for people. Totally. You ne- you never know what you're getting. Yeah. Never. 8:32 Exactly. Surprise every time. Um, okay, so I do wanna jump right into it now. Yeah, please. 8:36 So, uh, to quote your profile on the Berkman Klein Center website, uh, it says that your research focuses on the evolution of internet platforms and inverting the internet's personal data model to place individuals in control of their digital identities, which will be, uh, probably the major theme, um, of this conversation. 8:56 Wow. But can you, like... From the point of, like, what do you mean by placing, um, personal, the person- like the individual data model as- Yeah... 9:06 opposed to, like, the aggregate and putting it- Yeah, Jad, what do you- Yeah... what do you mean by that? What do you mean by that? [laughs] What does that mean? Yeah. 9:11 [laughs] I guess, like, you can frame it in two ways. So one is, like, if we think about the internet today, it's a place where we take our data and we give it to lots of apps. 9:19 Um, and an inversion of that would be that apps, services, agents come to us and our data. 
9:25 Um, you know, another way of framing it is, like, we go to lots of different apps and sign their privacy policy, and an inversion of that would be that apps, services, agents come to us and sign our own personal privacy policy. 9:38 Um, and so that's, that's essentially what I mean by inverting the internet's personal data model. 9:43 It's, uh, it's sort of a concept of just the individual being, uh, the custodian, um, of their data, but also sort of like, um, being the one that's sort of establishing the grounds of their existence online, um, and that things are adapting around them, their preferences, their needs, um, and, uh, and yeah, we are kind of back to being in the middle of our existence digitally. 10:10 Um- Yeah. Yeah, that's sort of- This is like... Oh, go ahead, Daisy. Well, I was gonna say, one of the things that rankles me about the conversation around AI is people who are really 10:20 interested in taking an academic approach to artificial intelligence and the degree to which it can mimic human behavior will talk about having, like, an AI bill of rights, like, for the rights of the AI. 10:35 And I'm like- Yeah... we don't even have a consumer bill of rights- Mm-hmm... for people that use the internet. Um, you know, it's a really nice thought, um- Yeah... that we should be, be kind to our 10:47 AI, uh, companions, but, like, we're not even kind to each other. [chuckles] Yeah. Totally. Totally. 10:53 Well, okay, so my first thought when I read this, and, like, the more I read of s- of some of your writing preparing for this- Mm... 11:00 uh, is that, like, I love this idea of, like, centering the individual and individual data rights. Um- Yeah... and it's like, it's so radical because I- Mm... don't really see how it's possible. Like, there's- Mm... 11:11 so many tubes of toothpaste have been- Mm-hmm... spilled that we can't really put back in, right? 
Like, I probably gave away my data, like, 100 times this morning, you know, to new, to new websites and old. 11:24 New apps and old. 200 for me. Yeah. Okay. Well, not, not a competition. I wake up earlier than Francis. It is. [chuckles] But, um, but, like, yeah, how is... 11:32 Like, this, this is a great idea, and it's, like, this ideal that I feel like is so pro-human, and, like- Yeah... you know, support it. How, how do we actually make this happen? 11:44 Like, how do we even start to think about making this happen? Yeah. No, I mean, totally. 11:49 It's, it's very idealistic and, you know, so idealistic that it's like, oh, is this even possible, is sort of a very, very valid follow-up question. 11:56 And I think, um, you know, I think the first question is, well, how do we get people to re-aggregate their data in one place? Because you gave it out 100 times this morning- Mm-hmm... um, or 200 times if you're Daisy. 12:06 Um- [chuckles] And so, like, how do we get all of our data in one place? And that place being a place that you have control over. 12:13 And so that then leads to the question of like what are the incentives or user stories that will get people to re-aggregate their data in one place? 12:22 And the answer isn't people caring about privacy because unfortunately most people don't care about privacy. 12:29 Um, and so we need to lead with sort of a utility or fun or something where the outcome is privacy or agency or control. And so that's one thing. 12:40 I think the other big thing is, you know, a lot of people have tried thinking about like, okay, well, one way of getting my stuff in one place is by building a second brain or- Mm... 12:50 tools for thought type stuff which, you know, is, is, is really cool for like all the nerds out there like myself. 12:56 Um, but most people aren't thinking about I need a second brain, I need like this tool to help me remember, you know, all my conversations. 
Um, [lip smacks] 13:07 so many of those ideas, you know, the Obsidians of the world do exist and they do a great job at like retaining, you know, a lot of your information in, in, in a place that you have control over, but they're not really like for the masses. 13:18 Um, they're not going to spread like wildfire. They're kind of arcane. They are. They're... Yeah. They're, they're for the nerds which is, which is fine. Um- Is that what you use? 13:27 Like wh- how do you personally control your data? Um, I don't use Obsidian. I'm too lazy. Um, I, [chuckles] I just use Notion. Um, and honestly I don't control my data too much yet. 13:38 And, um, so, so those are, so those are two things that I think have been blockers: like, well, what sort of a mass, like, utility or fun can we lead with where 13:50 the outcome of it is, well, all my stuff is now in one place and I have control over it. Um, and if we get there, which is an if, if we get there- Big if... 14:01 um, then we have enough data gravity, and that's a concept I really like. 14:07 We have enough data gravity where we've started to kind of invert the Internet's personal data model, where now apps, services, agents want to come to you and all of your stuff, pulled through your gravity, um, to your stuff. 14:22 And at that point then we can talk about, well, users can like... You know, what are the design patterns for people managing access to their data? 14:31 How do we move data to these other apps or services, um, in a way where, you know, we can make that other service adapt to you or do things for you? Um, and so that-- the luxury of that conversation comes- Yeah... 14:44 once we've solved the problem of putting all your data in one place. 'Cause there's almost like this radical 14:50 like rejiggering of what software development is too if it's like if the software has to plug into like, you know- Mm... 300 million different like privacy contracts, one for each individual that are slightly different. 
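Francis's image of software plugging into millions of slightly different individual privacy contracts can be made concrete with a small sketch. This is purely illustrative, not anything the guest's company has built; every class, method, and name below is invented. The inversion Jad describes is that an app's request is evaluated against the person's own policy, rather than the person clicking through the app's terms:

```python
# Hypothetical sketch of a "personal privacy policy": the individual owns a
# policy, and apps/services/agents that come to them get back only what the
# policy grants. All names here are invented for illustration.
from dataclasses import dataclass, field


@dataclass
class PersonalPrivacyPolicy:
    owner: str
    # data scope (e.g. "photos", "listening-history") -> apps allowed to read it
    grants: dict = field(default_factory=dict)

    def allow(self, app: str, scope: str) -> None:
        """Record a user-authored rule: `app` may access `scope`."""
        self.grants.setdefault(scope, set()).add(app)

    def request(self, app: str, scopes: list) -> dict:
        """An app asks the *person* for scopes; the person's policy answers.
        Anything not explicitly granted is denied by default."""
        return {s: app in self.grants.get(s, set()) for s in scopes}


policy = PersonalPrivacyPolicy(owner="daisy")
policy.allow("shelf-like-app", "listening-history")
decision = policy.request("shelf-like-app", ["listening-history", "contacts"])
# decision == {"listening-history": True, "contacts": False}
```

The design choice doing the work is the default-deny: the service adapts to whatever subset the policy returns, instead of the person signing whatever the service demands.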
15:04 I'm su- obviously, like in this world they'd probably be a little more uniform than, um- Yeah... wouldn't be 300 different ones for 300 people. Um, but like that's kind of... It kind of is this total disruption of like 15:20 the entire [chuckles] software business model, right? Yeah. I mean- Well, last year... Oh, sorry. Go ahead. No, no. You, you go first. Well, last year we talked about bespoke software and I feel like- Mm... 15:30 um, that's a conversation that sort of like sur- surfaced on my Twitter. 15:35 Um, but at least in my world, and probably maybe this is different in your world because of who you are sharing ideas with, but you don't hear as much about bespoke privacy. No, I think, um, 15:49 to kind of address both of those points, I mean, I think what you described was sort of this idea of data going to code, whereas what we are describing is code coming to data or adapting to data. Mm. 16:01 Um, and I'm not entirely sure what bespoke privacy means, so maybe you'll have to explain that to me. Um, but the idea- Me either. Jad, me either. [laughs] But the idea of like- Sounds nice though, right? Sounds nice. 16:13 It does. It sounds- Yeah... yeah, it sounds bespoke. I love it. 16:16 Um, but um, but yeah, the idea of like an individual being able to like, you know, through some like, you know, rule set or concierge service, like set rules around what access a service can have so it can do what it needs to do without overexposing, you know, too much about myself and... 16:36 Now those types of like questions of what, what, what does that design pattern look like is unknown. Like, no one's gonna be sitting there with a dashboard being like, "Yes, you know, share this with, you know- Mm... 16:48 Dart- Dunkin' Donuts," um, and, "Don't share this with Dunkin' Donuts." 
16:51 It'll sort of have to happen, you know, in the background with some view of like what's happening and I can go in and like adapt and adjust and control and, um, yeah, that's been sort of a big topic of like research that I've been really interested in- Mm... 17:07 more from like a design perspective, and it's more forward looking. Like we don't have the luxury of thinking about that yet because we're not quite there. Yeah. But, um- I'm, I'm thinking of this- Yeah... 17:15 tweet I saw just the other day. Uh, you know Nikita Bier, the app developer? Yeah. He's big on Twitter. Mm-hmm. He had some tweet about like, you know, the audience was, was other developers. 17:25 He's like, um, "Any user who has turned off push notifications is dead to you." Like they- Mm... they, they don't, they shouldn't exist. They, they aren't in your priorities. Th- you've already lost them. 17:36 Um, which I thought was funny as somebody who has turned off push notifications for pretty much everything except for, uh, messages and, and phone calls, right? And even then most like most work days I've... 17:46 I'm on do not disturb. Um, which is like, like do not disturb is almost... Do not disturb and like customizing your push notifications on your phone is maybe one ver- like a rudimentary version of bespoke privacy, right? 18:00 Mm-hmm. Like that is you- Mm-hmm... having some control. 18:03 And then I'm thinking of, uh, was it two years ago now, uh, now that like Apple, um, made it so like you were auto-- The, the death of the cookie, whatever it was, Apple privacy where you're not automatically opted into, to data sharing. 18:15 Um- These are these kind of versions of it that like aren't a total flip and disruption to the model, um- Yeah... but or like g- uh GDPR, C- CCPA or whatever it is, the California one too- Yeah... like these- Yeah... 18:28 the you know, it kind of is happening, but it's these piecemeal things and still the data flows from us- Yeah... 
to companies and to aggregators like constantly. Yeah. But there are these like small steps. 100%. 18:44 I think you brought up Apple which I think is probably the biggest player that's inverting the Internet's personal data model today. 18:50 'Cause if you think about kind of on device storage, like your iPhone right now is a personal data store. 18:57 You have a lot of stuff on there, and the design patterns around when you download an app and it requests access to photos and you're like, "Yep, share these photos, but don't share these," or, "Share these contacts and don't share these 'cause now we have control over what contacts we're sharing," that is a version of my data being on device and me provisioning access to that data at will to apps, services that come to me and my device. 19:22 And so I think in many ways Apple is slowly unseating, you know, the big powerful data aggregators and giving users control over access to their data, but then it's Apple that's sort of the intermediary. Yeah. 19:38 Do we want Apple to have more control? [laughs] Like, uh, probably not. Um, and so um, you know, I think you bring up a really good point. 19:46 I think Apple is starting to establish privacy design patterns that are intuitive, um, and that's really exciting 'cause then they can be replicated. So okay, you use the term data gravity, and maybe- Yeah... 19:59 I'm misinterpreting what that means, but I'm curious like when, when the gravity of data started to flow into these aggregators, like when would you say- Yeah... is it like 20 years ago? What are kind of... 20:11 I- I'm assuming that you know the history of this- Mm... more than I do. Um, may- maybe, maybe not, but like when, when would you mark as like a watershed moment of like when the data floodgates really start- Mm... 20:22 to open and aggregation really scales up? 'Cause I'm assuming- Yeah... it's like 20, 25 years ago around the time that like social media apps, um, come into play. Yeah. 
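The Apple pattern Jad describes above, data living on the device and the user provisioning per-item access ("share these photos, but don't share these"), can be sketched in the same spirit. This is a hypothetical illustration, not Apple's actual API; every name below is invented:

```python
# Hypothetical on-device personal data store: the data stays with the user,
# and each app is provisioned only the specific items the user selects,
# in the spirit of Apple's limited photo/contact access prompts.
class PersonalDataStore:
    def __init__(self):
        self._items = {}    # item_id -> data, kept on the user's device
        self._access = {}   # app name -> set of item_ids it may read

    def put(self, item_id: str, data) -> None:
        """Store an item locally; nothing is shared by default."""
        self._items[item_id] = data

    def provision(self, app: str, item_ids: list) -> None:
        """User-driven grant: the app will only ever see the selected items."""
        self._access.setdefault(app, set()).update(item_ids)

    def read(self, app: str, item_id: str):
        """Apps come to the store; anything not provisioned raises."""
        if item_id not in self._access.get(app, set()):
            raise PermissionError(f"{app} was not granted access to {item_id}")
        return self._items[item_id]


store = PersonalDataStore()
store.put("photo-1", "beach.jpg")
store.put("photo-2", "license.jpg")
store.provision("editing-app", ["photo-1"])  # share this one, not that one
```

The intermediary question the hosts raise still applies: whoever operates the store (Apple, in the real-world version) sits between the user and every app.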
20:32 I don't know if there was like a specific moment in time. It's sort of like, you know, all like, um, what's the term? Like, a little bit and then all at once. Mm. Um- Mm-hmm... but it's, um... 20:45 Yeah, I think when, when we started seeing like massive, massive shifts in power where you can't really do anything without connecting your Google account- Mm... 20:56 or like, um, you know, if you're a small business and you want to, you, you want to understand users, well you need to connect to one of these services that people are aggregating data on. 21:08 When the balance of power really truly shifted exactly, I don't know if I can pinpoint. 21:13 But I do know that data gravity will shift again when we start seeing, you know, users being able to just show up somewhere and there being enough information for a service to adapt to you or to do things on your behalf. 21:30 Mm. And um, that's when we know we ha- we have data gravity again as individuals. Um- I think Facebook was definitely the tipping point for me- Yeah... 21:39 and because it took things that people were doing automatically and replaced them, like knowing your friend's birthday- Mm... 21:47 um, where people are now like, "Oh, there should be an app for that so everyone doesn't have to use Facebook for it," and it's like what do you think people were doing before? 21:56 Um, and that sort of thing, um, Dirt just did a series on gamification. I think that sort of thing, uh, becomes a method of behavioral design. Yeah. 22:07 Um, I know that like Meta is still tracking things that I do around the web because I was doing a bunch of advertiser research the other day, and I reopened Instagram and it was exclusively companies that I had been googling and looking at their LinkedIn, um, in the ads, and that doesn't... 22:27 It really doesn't like faze me anymore, but there are times when, uh, it becomes a form of interpersonal behavior design. 
22:36 Like a few summers ago I was at a party and I was talking to somebody else about a problem I was having with my friend, let's call her X. I had been trying to get in touch with X, X wouldn't... 22:46 X wasn't responding- Not Elon's X... and... Not Elon's X, and I, I reopened Instagram and X's most re- recent Instagram post was at the top of my feed in the sense that I feel like my phone was listening to me. 23:01 I was next to my phone when I was having this conversation. [laughs] I was saying this person's name repeatedly. They're at the top of my Instagram, and that's not that crazy. That happens to me a lot. 23:11 Actually, when I have these conversations with friends about other people, they come to the top of my Instagram feed. Um, what was crazy about it was within, [sighs] 23:22 I think, you know, the next 24 hours I got a text from X saying, "I'm really sorry I haven't responded to you," da da da da. And I suspect what happened was that I went on the top of their feed. Mm. 23:40 They felt guilty that so much time had passed and they reached out. Like there was no part of me that was like, "Wow, what a crazy cosmic coincidence." Mm-hmm. 23:50 I actually think what happened is I came up on their Instagram and they realized- Yeah... we hadn't talked, and that's, uh... It's weird to feel like you're part of a matrix in that way. Mm. I was happy, um, 24:07 but I also felt like, "Oh, how is this, how is this feed acting on me- Mm... and my interpersonal relationships?" And um, I feel that I'm in control of these dynamics, but-Maybe I'm not. [laughs] Totally. 24:25 But I think, I think you just described sort of like, you know, the structure is serving you in some way. Like, you're... You, you got value out of the ambient tracking of your stuff and it surfacing. 24:38 Um, you know, I, I don't know if that's necessarily like... 
24:41 There's probably a part of it that feels weird because of a lack of transparency and a lack of control, where maybe that can be alleviated if a user had a little bit more of a view of how their stuff is being used or why something was happening. 24:55 But I think that same level of, like, spontaneity and value should continue to happen because that's, like, a beautiful moment, like what you just described. 100%. Yeah. 25:05 But if I could set my own privacy contract- Right... 25:08 if there was a clause that was like, "Would you like your phone's microphone to listen to you gossip about your other friends and then put that person at the top of your Instagram feed?" Right. 25:15 To the extent where you, like, finish the conversation, you turn around the phone and show the person that you're talking to and you're like, "Classic." 'Cause it's, it's a lack of- "Tom. No, I would not like that." 25:22 [laughs] It's a lack of agency. Yeah. Yeah. And also- You-... especially if I, that, like, shows me to them, um, 'cause there is, like, a weird reciprocity also that I think- Totally... happens in these. Um... Yeah. 25:33 You should be able to go back to the system and be like, "This was weird, system." Well, and the other thing is- Like, "Why did this happen?" And yeah... 25:39 if you always see somebody, there's so much conversation, I don't know if you've been privy to this, if this is, like, a female thing, which is, like, there's so much conversation about, like, who appears earliest- Hmm... 25:48 in your stories. Yeah. And- Hmm... part of it is like, well, if somebody appears earliest, that means they're getting served your story very- Yeah... soon after you post it. And 26:01 for a long time, it h- was like, okay, these must be people that you interact with a lot or look at your profile a lot. But sometimes there's somebody truly random in there. Hmm. 
And 26:10 set aside this idea that this person has, like, some sort of parasocial relationship with you that you're not aware of. Like, if you repeatedly see somebody watching your story- Hmm... 26:19 you start to believe that that person is maybe closer to you than you, you already- Yeah... knew. Like, you might- Hmm. Yeah... 26:26 start to think of them as a close friend, a visible friend, somebody that you're in touch with, when really you're not in touch with them at all. Yeah. 26:34 Um, and there, most of the conversation is around, like, oh, does this person I have a crush on or has a crush on me? Like, what does it mean? But, like, what does it mean for these weird long-term platonic- Hmm... 26:44 relationships or acquaintances where, like, I start to think, "I've really actually kept pretty good touch with this person from college." My... Meanwhile, I haven't talked to them in two years. 26:53 We just, like, watch each other's stories pretty regularly. I haven't looked at that in, in years. I actually forgot that that was a, a thing you could do. Well, I don't wanna gender this, but... 27:05 [laughs] No, I mean, I used to, like, when I was- Yeah... when I was younger and I like w- and I did wanna know, like, oh, who was, who was watching my story or whatever. Hmm. 27:12 But, like, that is something that I have unlearned, that I just don't need to do. Um, and it's... 27:19 [laughs] I, I almost think that, uh, it's, as a feature, it's kind of a harmful, antisocial feature when we bring it back up because what is it, what is it doing but giving you information to read into, uh, without really knowing the truth, and again, like, without knowing, like, how this person was, was shown your story or why they were shown it early. 27:38 Like, it... All it, all it is is, like, f- conspiracy fodder if for y- for private conspiracies about your life. Um, that's like- Private conspiracies about your life, that is a rich topic. 
27:53 [laughs] I actually really love that phrasing- Hmm... because I have a lot of private conspiracies about my life. And I would say I only... If you're checking who viewed your story- Mm-hmm... 28:05 you're already concerned about something. Yeah. You might not be acknowledging it to yourself, but- Mm-hmm... there's no truly un- It's a symptom of something bad. 28:12 [laughs] There's no truly unbothered person that's looking at that. And it could be something as simple as, like, is this person mad at me? Mm-hmm. But I don't know. Jad, we- No, I mean, I think- You are the guest. 28:23 You haven't talked in like an orbit. [laughs] No, no, no, no, no. I... This is, this is fun. I wanna hear about all the private conspiracies now. Um, but no, I think the, um... 28:31 Like, people say, like, having a view of someone's For You page is like staring into their soul, right? Mm-hmm. Like, you can see- [gasps] I wanna look at mine. Wait, should we all look at ours right now?... 28:40 deep into someone's soul. Like, um, mine's gonna be mainly business-y stuff, like, kind of like work-related stuff. So you say. So you say. But let's see. Let's see. I'll pull it up. 28:49 I've taken, I've recently taken to maintaining mine more and, like, like mass selecting. Like, um, c- you c- if you hold down something and you click, like, um, Not Interested in Seeing just on the main feed- Yeah... 28:59 then you can go and you can select. So I've been trying to curate it more. That's something too that, like, I can't even remember when that became a feature, and it feels like it wasn't that long ago. 29:10 Like, the spec- the Reels-based Explore page in particular, um- Yeah... it must've been, like, four years ago now or three years ago. Maybe only two. Yeah. 29:19 I mean, there's this really big trend around being, people being able to, like, control their feed. I mean, if you look at Bluesky now with, like- Mm-hmm... the, the marketplace of algorithms, right? 
29:29 This idea that, like, I want to subscribe to the Dirt feed algorithm, right? Hmm. Yeah. That sort of, like, will tailor my feed to whatever Dirt things- Are you talking about Graze?... interesting. 29:42 I tried to set one up and [laughs] it was too stupid. 29:44 [laughs] Well, I did- I don't know if it- When I, when I joined Bluesky or when you, when I followed Dirt on Bluesky or something like that, I remember there was, like, the pa- I could follow the package of, like, Dirt contributors or whatever, right? 29:53 I did make a pack- You did... but there's a way to do an algorithm too, and I couldn't [laughs] figure it out. Oh, okay. And I was like, you know what? 29:58 I think the pack is enough 'cause my algorithm would be, like, follow these people. I don't feel I need- Yeah... to set a lot of rules around it. 30:05 So, um, supporting my, my phone is listening to me story, um, my Explore page is, like, people who, who've been on Tasteland. [laughs] Nice. Um, yeah, and Kylie Jenner. Nice. 30:17 But if you wanna- I mean, there's this, this, there's- Yeah... there's this interesting concept of being able to like, can I get a high-level summary of Daisy's FYP regularly? 30:27 Like, can I get a high-level summary of, like, I don't know, someone's FYP regularly where I can, like- Can I jump into your FYP and, like, go get into your, you know, your rabbit hole? Right. Um- Use Instagram as Daisy. 30:41 Yeah, exactly. Well, I feel like that's a pretty good transition into [chuckles] you know- Yes... do you wanna do the transition, Francis? The, well, I, the one thing I was gonna say is- Basic stuff... 30:49 well, okay, I do wanna do it. You're right. Um, is that, um, no, the, the, the Explore page is, is your revealed preferences. Mm-hmm. 30:57 But then Shelf is, is your kind of le- it's less revealed preferences than it is chosen- Mm-hmm... preferences- Yeah... of, like, what you're consuming. Whereas, like- Yeah... the Explore page is the unconscious. Yeah. 
31:07 Shelf is the conscious. Um, [chuckles] sorry. Thank you for giving me back that thunder, Daisy. [chuckles] Thank you, Carl Jung. Um, is it Young or Jung? I say Young. No idea. 31:19 I think it wa- You would- I think it's- It's German, right? Yeah. Carl Jung. Is that how you would say it in German? Jung. Yeah. I would say, like, Jung. Um, anyways. 31:26 It's funny, though, I, I always, I think that's right, but then when I see, like, Jungian, I wanna pronounce it Jungian. Okay. Well, it's definitely not that. [chuckles] All right. Well, I'm glad that we cleared that up. 31:38 So anyway, um- Shelf. [chuckles] Jad, tell us what Shelf is. Yeah, sure. So I, you know, I mentioned earlier- [laughs] He's so great. The look of regret on your face, I'm like, "Why did I do this podcast?" 31:54 [laughs] No, this is fun. I like this. Um, it feels like we're just hanging out, which is, which is great. Let's hope. We are. Yeah. Um, so yeah. 32:02 So Shelf, you know, I talked a little earlier about, like, well, what are the incentives or user stories that will get someone to aggregate their stuff in one place that they eventually have control over? 32:12 And, you know, Shelf was sort of the answer to that question. 32:15 We actually built like 40 things before Shelf, and most of th- those things didn't work, and they were all us, like, trying to figure out, like, how do we get people to put their stuff in one place? 32:26 And maybe- And this was all built through Kudos? Yes, this was all built through Kudos. Uh- And how much of the back end was shared? Great question. Mm. Yeah. Quite a bit was reused. Um- Okay... 32:38 but we ended up, like, as we started validating things, we were rebuilding, you know, things from scratch. But, um, but yeah, the kind of like main flash 32:47 light bulb moment was essentially a metaphor, and that metaphor was the bookshelf.
32:51 Like, if you think of a bookshelf, it's like a store of all of your stuff, and, you know, there's sort of a sense of peace of looking over and being like, "Oh, my stuff is here. I know where it is." 33:00 Um, but also, like, it doubles as a display of your taste. Mm. Like, if you walked into my living room and looked at my bookshelf- A mirror... yeah, it's a mirror of myself. Shelf, self, one letter apart. 33:10 Um, and so, um, and if you picked up a, you know, picked a book off my shelf, and we had a, you, you know, we'd have a conversation about it. 33:16 And so the kind of like context around one's bookshelf became the inspiration for Shelf. And, um, and yeah. 33:24 Like, the idea is essentially it's a place for you to track everything you're consuming, what you're watching, what you're reading, what you're listening to, the games you're playing, and sort of puts it all in one place, so that you can, like, have a reference for yourself of everything you've consumed. 33:37 But also, it's sort of self-reflective. You can learn about yourself. Mm. And then you can also curate a public display of your taste. 33:44 And so you can, like, go to my shelf today, see what I'm watching, what episode I'm on of "Black Doves," how many times I've listened to Chappell Roan this week, and you can choose to, like, get updates. 33:53 So, like, if I started a new book, you could find out, you know, what book I'm reading. If we're both playing the same game, you'd get notified, so we can, like, you know, go play the game together. 34:03 Um, and that became, like, the core utility or fun that we've led with, with the eventual goal of allowing users to aggregate all their consumption, all their ta- like, insights, interests, um, in one place. 34:17 And to get that data gravity that will, over time, allow us to kind of make Shelf more like a really sexy data manager- Mm...
34:26 so that we can get to a point where you can provision access to your data at will to all these services and, you know, have truly personalized experiences, truly agentic experiences. But that's, that's downstream. 34:39 And, um, and yeah. Like, that, that's a little bit of what, what Shelf is. 34:42 And, you know, you mentioned maybe just to end, and I'll pause, is, you know, th-this kind of core concept of, like, how curated is one's taste or identity, and how authentic should we make the experience so that you can... 34:56 You know, Daisy might say she's watching "Black Doves," but actually she's watching "Real Housewives." Um, like, should we allow Daisy to say that, um, or should, you know- To protect her s- her, her taste, as it were. 35:09 Right. Exactly. Um, and so that's been, like, a really nuanced question. I stand behind my taste. 35:13 [chuckles] Um- That's, so a couple things that reminds me of, one is, like, my first thought when I started looking into it was it's like, uh, you know, I used to do Last.fm scrobbling in college. Mm-hmm. Yeah. 35:23 And it's like that, where it's kind of aggregating multiple sources into the one. Yeah. Or also, like, early Facebook, when I remember- Mm... 35:31 when I was in, like, ninth grade making my account, and it's like, "What's your favorite book?" And you can put all those things. Um- Yeah... but then that's just part, one part of a much larger thing. 35:40 Um, and the last- Yeah... the last thing, though, kind of where you were going at the end there and, like, how, you know, s- another thing could plug into it and access this data- Mm... 35:49 that you've, that you've aggregated about yourself for yourself, is those terrible Salesforce commercials that, they were at the Super Bowl. I've been seeing them for- Mm... 35:58 a couple months now too, where it's, like, Matthew McConaughey and Woody Harrelson. Matthew McConaughey is, like, the... In one, he's at some shop in, like, some small town. Mm. 
36:06 And this, the, the shopkeep is dressing him up in these, you know, this outlandish outfit, and he's like, "Oh, yeah, the, this shop isn't connected with Salesforce's AI, so they didn't know what I wanted." Mm. 36:19 But that's, like, a use case I'm getting from Shelf is, like, a, a possible future is, like, y- any, any kind of, like, 36:27 thing that, any app or product that takes in personal data to act on it and, like, give you an experience, then that's, like, your... It's like your, your data credit card that you can plug into to them. Yeah. 36:39 That's, that's a great, that, exactly right. And yeah. It's like your- Yeah, there's so many, um, analogies here, like your digital passport, your- Yeah... USB with all of your stuff that you can plug into things. 36:51 It's like exactly- Your, all the data that Google has on you, except you own it. Exactly. 36:55 It's your personal context and, um, and I think more than just the personal context, it's also, um, the access to the platforms and services. 37:04 'Cause a lot of these like, you know, agentic use cases are like, "We're gonna go order sneakers for you. Like, just tell us when you wanna order sneakers." 37:11 Well, you need to know my size, you need to know where I live, you need to know my preferences for sneakers, but you also need access to my Amazon account so you can go actually purchase the sneaker. 37:20 Like, if you tried playing around with Operator, like OpenAI's, um, uh, one of their latest releases, like they don't necessarily have access to a lot of these apps or services or platforms where you, you know, you do things. 37:34 Um, and so, you know, part of what we're doing with Shelf is we're also helping you just store their credentials, um, in one place. 37:41 'Cause when you connect to Amazon, you're storing, you know, we're storing your Amazon credentials for you.
37:45 Um, and that allows you to also not just like get personal context to like serve things on, you know, the end of the service, but allow the service to actually go do things for you, which, you know, I would describe as like, you know, even bigger of an opportunity. 38:00 Um, and so yeah, that's, uh, that's the downstream idea, but like we try not to get ahead of ourselves. 38:07 Like, you know, I think, um, it's fun to talk about that, and we, we certainly are, you know, um, researching that and engaging with that on an intellectual level and also on like a technical level, 'cause there's a lot of big open questions still around what does personal data storage architecture look like? 38:26 Um, how does one, um, transport data between two services and maintain privacy? Is it at the level of inference, or is it at the level of the data? People think it's at the level of the inference. 38:37 Um, and like trying to stay on top of things there. But what we're trying to solve for today is how do we just make Shelf part of culture? 38:43 Like, how can you have as much of yourself on your shelf, and how do we, um, just serve you and make it a really valuable utility for you? Um, and that's the kind of main thing that we've been focused on to date. 39:00 That makes sense. Okay, one, okay, one thing I wanted to say about, about Shelf and like how... 'Cause so I downloaded it this morning, and I started- Yeah... 39:07 um, you know, I'm like, okay, I put my song in, um, and then I d- I didn't get too far, though, but I was like, okay, here's how I can like track my things. But I'm thinking of- Yeah... 39:14 my own personal, like, media consumption tracking, right? Yeah. I use Letterboxd. Um, I don't use Goodreads. I have, like, an Apple note where I- Mm-hmm... write down the books I've read. 
Um, and then, like, 39:28 TV, [sighs] like, looking at it, I know, like, you suggest using, using Netflix to track, but that is such a fraught one, because, like, I live with my fiancée, and we both have-- we, you know, on the TV we've got our shared Netflix. 39:40 We've got, like, you know, I think we use, like, a Disney+ that's, like, my sister's and, like, all that kind of thing, and then I'm also watching... 39:48 I'm on, like, YouTube TV watching soccer games, and that's probably my blah, blah, blah. Like, long way of saying, like, it's so convoluted and so cross-platform and sometimes manual and, like, I don't use Kindle. 39:58 I don't read e-books. I, I read physical books, and I write it in a note. Um, so I think to me that's the challenge as, like, a potential user is, like, how do I... 40:08 how does this make my life better to, like, consolidate my media consumption tracking and do more manual work- Yeah... around that, right? Yeah. Like, that's the challenge I'm curious about as a user. Yeah. 40:21 No, I mean, it's, it's a, it's a big, it's a big question. I think, um, you know, the balance for us has been, like, how much do we scrobble, and then how much do we allow users to add? 40:30 'Cause there's no API to, like, the s- the movies you watch in the cinema, right? Like, there's no, like, unfortunately no API to physical books yet. 40:39 Um, and so, like, to what degree can we support users in, in, in helping with some of the tracking? Yeah. Um, and then how can they correct and fill the, you know, fill out what's missing? 40:51 And so I think getting as much coverage as possible of, like, the places where you do consume. You know, Netflix is one of many streaming platforms. Um, and so, like, can we get as much coverage as possible there? 41:04 But then also can we round the corners with, like, you being able to, um, you know, to specify, like, the profile, like your profile versus your- Yeah... partner's profile, which we do today.
41:15 Um, so you, you know, you don't have to... it's not all of your... I use, like, I think my, my sister's Netflix account, but I have my own profile. Like, I'm not, you know, tracking my sister's stuff. 41:25 Um, and then, yeah, like, you know, you might be watching... You know, we have, we have par- parents who use their Spotify account on Shelf, so it's like, you know, Peppa Pig soundtrack. 41:36 Um, and obviously that's not what they're into, so can we, can we, like, correct for that? 41:41 Um, so, so there's definitely some complexity there with, like, getting as close as possible to allowing you to passively track, um, while giving you the tool set to, you know, actively add and curate and, and do all of that. 41:56 And that really, you know, what the friction... We need to minimize the friction, and we need to maximize the value you're getting out of doing that. 42:03 And so is that value for you to learn about yourself, to have a reference? Is that value so that Daisy can keep up with you effectively? Mm-hmm. And, like, you care about letting Daisy in on what you're up to? 42:14 Um, is it so that you can- He does. He does? Okay, cool. Um- In a book clubbing type of function. Right. Yeah. Maybe. Maybe there's sort of a book clubbing type of function. 42:23 And so it's the balance of minimizing friction and maximizing incentive or utility on the other end. Yeah. Are there any numbers that you can share about how people are using Shelf or anything that surprised you? Yeah. 42:37 Um, so we have, like, one point five billion data points now, where a data point is something like Daisy watched "Severance" Season 2, Episode 1 on Monday. Um- I did do that. Not on Monday, though. 42:50 [chuckles] I was stalking you on Shelf. No, I'm joking. That would be so bad. Um [laughs], um, and yeah, I, I think that- Number is very large, um, and that just sort of speaks to how much people consume.
43:05 Like, uh, on average, I think per user we have, like, thousands and thousands of, like, rows of, like, consumption. Um, so I think just the f- sheer amount of consumption people do is, like, surprising to me. Mm-hmm. 43:19 Um, I think the other big thing is just, like, people really care about, like, proving how big of a fan they are of something or how early they are to something. Mm-hmm. 43:30 Uh, and that's been, like, a really interesting kind of, like, incentive to work with. Like- Yeah... you know, can we help users prove that they were really early to an artist or prove- It's like in a Web3 type of way. 43:44 [lip smacks] Yeah, but not, not, not in the, like, get a POAP or NFT type of thing. What's that? [chuckles] Um- Um, well, what's the most popular integration, outside integration into Shelf? 43:57 I mean, Spotify is, is by far the biggest one so far- Wow, okay... today. Yeah. 44:01 I'm thinking of, like, the w- the way I use Letterboxd or my, my notes app of books is, like, when I'm at a party and I'm talking to someone, and it's like, "What have you been, what have you been up to recently? 44:12 Like, what have you been..." And I, like, I don't remember, right? And I used to... 44:14 10 years ago, I would've remembered exactly the last three books I've read and the last 10 albums I've been obsessed with, but now I don't 'cause I don't have to. So I, like, I pull up my phone- Yeah... 44:22 and I'm like, "Well, what have I been listening to? What have I been watching?" Yeah. 44:25 I think, like, Shelf makes that more interesting because instead of pulling up Letterboxd and then notes, and et cetera, et cetera, it's like you just have it all right there, and then you're able to talk to these things, like the different themes you've been consuming- Yeah... 44:40 um, more specifically. I'm also... Another... Did either of you watch Didi, that movie? No. Um, so there's a scene where w- it's set in, like, this is like 2008 or '7 or whatever. 
The kid would be, like, 44:55 about 13 or something. Um, [lip smacks] and he has this crush, and he is talking to her on AIM, and he, like, goes on her, her AIM profile or her My sp- her Myspace or whatever. Like, what are her favorite movies? 45:07 'Cause she asks, like, "Oh, what movies are you into, into?" And he picks one of her favorite movies that he's never seen before, but he, uh, you know, he's like, "Oh, I, I like this movie." 45:16 Um, and I'm like t- that's one way I'm thinking of Shelf. Or even, like, in... A couple weeks ago we interviewed somebody, and I, like, was like, "Oh, what, what can I find on this person?" 45:27 She'd written about Goodreads, and I found her Goodreads, and I, like... Uh, we asked a couple questions about, like- Mm-hmm... specific books she read. Yeah. So that's like, to me, like, it becomes 45:38 not necessarily a social media in that way, but, like, a primary source for information- Totally... about a person that you can use to communicate with them- Absolutely... and, like, learn more about them going into- 45:48 Yeah... a conversation. I don't know. That is, that is one of our goals. I mean, we see people using it kind of like a Linktree for their interests. Yeah. Like, they'll link to their shelf. Oh, that's good. 45:56 And, um, you know, it's sort of a dynamic view of what they're into right now. Like, it's similar to how I have my coffee list on my website. Lots of people have a book list on their website. 46:05 Um, and so we do see, like, podcasters kind of start using it, starting to use it as a way of, like, they'll talk about a book on their podcast, and they'll link to their shelf so that their audience can go and, like, keep up with what else they're doing, where they're at in the book, et cetera. 46:20 Um, and, like, one thing I, I... Kind of a dream feature of mine is me being able to, like, tap phones with Daisy and, like, immediately see what we have in common- Mm-hmm... 
46:29 where it's like, damn, you're both really into this genre, this niche genre. You both listen to this podcast, and you've, you know, recently watched this movie. Well, that allows us to- Here's how many... 46:38 You've read 300 books. She's read 700 books. Here's how many of them overlap. Like, yeah. Exactly, yeah. And, like, you can just jump straight into that, so. Mm-hmm. Do you know Winnie at Chipt? Chipt is a- Don't. 46:50 A- I don't either... acrylic nails company that puts an NFC chip into one of the nails, and you tap it to somebody's phone, and it gives you their, your social profiles. Mm-hmm. Wow. 47:00 Um, so you should do a, a Shelf x Chipt integration. That's so cool. [chuckles] But, uh- That'd be so cool... 47:05 yeah, I've been, I've been, I've been watching Winnie, um, in Chip- like, from afar, being like, "Hey, this is pretty cool." Um, but, uh, I guess, like, one thing that came to mind when you were talking, Francis, um, is 47:22 before the internet, there would be, like, two phases of conversation of getting to know somebody and their, their media consumption habits, which is like, "What do you like?" is the first phase, and then- Mm-hmm... 47:33 "Why?" is the second phase. Yeah. Mm-hmm. Yeah. And now we've made all of these ways for people to signal what they like. But part of the problem is you would think that would get easier to skip to- Mm... the why. Mm. 47:47 But so much of the focus is on the curation that actually- Mm... we don't get to the why. Mm. And because it's just I have 10 different platforms to say what I like, and by the time I'm done doing that- Mm... 48:03 I'll have nothing else left to say. Like, I'm looking at your shelf right now to- Yeah... see that, okay, your top genre right now is Bedroom Soul. Um, you're listening to Lenny's podcast. Mm-hmm. Watching Severance.
48:16 And it's like those are all then kind of data points that, like, theoretically you could, or, like, you know, an AI, whatever, could extrapolate what are the ideas behind these. 48:30 What is, what are the, you know, connected to like what are the people who made these things, their accounts on social media, what are they engaging with? 48:39 And like, you know, a million data points you could put in- Yeah... about, like, who are these people? What are their ideologies? 48:46 And then, like, how do they relate, how does, how do the, how does the, the, how do the ideologies behind your favorite musicians- Yeah... relate to your favorite shows? Yeah. 48:54 Then you start to get into some why, where it could, like- Yeah... spit out what's your worldview, but that's also antithetical to this idea- Mm... of, of data privacy and something- Totally... 49:03 I think that, um- Shouldn't happen. Like, that would be a massive invasion of privacy for on, for so many parties on so many levels. Yeah. 49:12 I think the kind of what you're referencing here is like inferences made based on the what- Yeah... should be made transparently. 49:21 Like, what I really love about, like, some of the more recent developments in, like, you know, AI bots, like DeepSeek describing why it's making, you know, the decision it's making or the judgment it's making, kind of the, 49:36 you know, the, the, the, like, stream of logic is it's, it's transparent. Like, I can see the biases more clearly, et cetera. Mm-hmm. Um, so I think there's value in that. Um, 49:49 and, uh, and I think the other cool thing is that because there's so many models, like you can, you can funnel, you know, the what into different models and see the whys, like the inferences and their differences kind of across them as well.
50:01 Um, but I think what you're hitting on is, like, and, and maybe this is what, where we can take the conversation is, like, you know, there's the flat view of, like, what you're consuming and what's missing is the context around it. 50:14 I think that's what you were getting to, Daisy, is, like, what's the context? 50:19 Um, and how do we start to sort of unravel the context so that that's part of the picture and the narrative and the story, and I don't think we do a good job of that today with Shelf. 50:30 Um, and the big part of the reason for that is we haven't cracked how to allow people to add that context in a fr- in a way that's low friction- Mm... enough, you know? Well, this is... Wait, sorry. 50:43 This is actually, this is a theme on this podcast- Yeah... is low friction culture. 50:47 [laughs] And in this, in this situation, I actually think, like, for, for an app developer, you do want to create that low friction, but for the actual function and the social connection- Yeah... 50:57 maybe the high friction is actually good, and it's in the figuring out- Agreed... and, like, having to have conversations with these people about why you like this thing and, and make those connections yourself. 51:07 Like, maybe that is, like, actually the spark, and that's what's valuable. Um, and to- Yeah... to automate that and productize that actually degrades what's, what's, like, good and magical about these conversations. 51:22 Like, the why is something you have to arrive to together in conversation with a person. I agree. 51:27 If you skip to it through a product, then, like, you're not, it, you're just drinking Soylent and not even getting any of the nutrients. Totally agree. 51:35 I think, like, the why, the high friction why allows you to then associate more meaning with the thing, right? Yeah. 51:42 'Cause, like, it, you put in time and effort to generate the insight and to put it down and to express it. 
51:49 Um, but I think that associating that only with, like, really momentous things, like, if this was, like, you just discovered this artist and she made you cry, and, like, it was, like, it was a thing that's worth adding context to, then sure, that's high friction. 52:02 You know, it's probably, like, you can make, you can make, um... There's a reason for doing that. 52:08 But, you know, the, the YouTube video you just watched and, like, you know, whatever, it's, like, on, you know, it's the algorithm just surfacing things to you, um, you're probably not gonna invest a lot of time adding the context. 52:21 Like, maybe there's a reason where you, we can prompt you in a way to reflect and put something down. But the context maybe was so algorithmic too and, like, you weren't even thinking. 52:30 It was just delivered to you- Right... and it's the next thing you click on. Yeah. 52:32 So I think playing with friction is definitely a big theme for us too, is, like, what's the right level of friction to unlock meaning and, like, a sense of accomplishment, but not to burden the user. 52:44 Like, how do we balance burden with, like, sense of, like, yeah, meaning, meaning making? Um, and it's, it's a very, very hard problem, and, like, it depends on the context of the various things we're working on. 52:59 Um, so yeah. Um, yeah. To be, to be, to be unraveled. [laughs] Yeah. That's great. Yeah, I think a higher level of experience in something is not only to log it or even rate it- Yeah... 53:10 'cause there's a lot of people who use Letterboxd where, like, they'll give something a star rating or they'll give so- a book a Goodreads rating- Mm-hmm... but they won't necessarily write anything about it. Mm-hmm. 53:19 Yeah. Like, wanting, feeling compelled to write something about it is a whole other influence. Totally. Um, and- Totally...
53:27 I'm sure as you build out Shelf, like right now, like I can add as many things as I want, but only six will display at a time, right? Right. Yeah. So there is already sort of a hierarchy to it. Yeah. 53:38 And as you go along, you could add a hierarchy where things that are highly valued with meaning- Yeah... um, require additional context. But I also- Yeah... 53:47 like, by having an editorial arm of Shelf as well, and I was recently- Mm... profiled- Oh, yeah. [laughs]... um, that's also a way to add context and like meta context- Gotcha... um, for the project like as a whole. 54:00 Exactly. And we talk about that a lot on this podcast too, that that's kind of the function of media. Were there other things- Yeah... Oh, sorry, real quick. 54:07 Are there things that, like, aren't in Shelf right now that you would wanna add? Like, we were talking about YouTube for a second. 54:11 Like, is, uh, y- we're almost implying, like, do you add, like, your, your most recent YouTube video you watched, your favorite YouTube, and then it's like- Yeah... 54:18 your favorite Instagram account that, or like the Instagram account- Yeah... you've... I don't know. Th- th- so many other things you can add where it's like where do you stop- Yeah... and why? Totally, yeah. 54:28 I mean, um, the, the, the main framework that we use today is there's two main frameworks. The first is we're focused heavily on the present self- Mm... 54:36 not necessarily on all time faves, 'cause the present self evolves and becomes past. Uh, and so, like, when you go into someone's shelf, it's not necessarily their favorites. 54:45 It's, like, what they're up to or into right now. But obviously, if you want sort of a picture of someone, it's not just what they're into right now. 54:53 You also need that higher level view of, you know, what is really meaningful to them, and that's missing today from Shelf. 
54:59 I think the other big framework is, like, we think in verbs, so, like, watching, reading, listening, playing are the main verbs that we support today, and watching includes YouTube. I mean, YouTube's also listening 'cause a lot of people, you know, listen to stuff on YouTube. 55:12 Um, and, you know, you can actually add YouTube stuff today on Shelf. Uh, there's no YouTube connection, but you can add things manually. Mm. 55:19 Um, and really the hope is that we can extend beyond those verbs, and we don't restrict what links you can add manually. 55:25 So we've seen people add everything from, like, the hot sauce that they're into this week to, like, you know, the mascara they bought, um, and have been wearing. Um, and so, you know, I think there's... 55:38 the concept of a shelf is extensible. Um, the concept of one's taste and what they're into, and consumption and activity is also extensible. 55:47 And for us to really get to a point where it's a holistic view of the self, for y- for Shelf to be a view of one's self, we need to expand beyond, um, the current verbs. 55:57 And so that is very much within kind of what we're trying to build. Um- I tried to pitch Jad on doing a Fragrantica integration, but I think I'm just gonna start manually adding my perfume. Yeah. 56:09 See if anyone catches on. Yeah. [laughs] Do 56:18 you use the app, though? They have an app? Yeah. [laughs] No. Yeah. 56:35 I have not used the app version. I noticed that yesterday. Oh. Love- I love Perfectly Imperfect. I think there's... 56:42 I think they've, they've really been excellent at being very highbrow in the sort of approach that they've taken and really targeted the sort of elite tastemakers, if you will. Yeah. 56:53 Um- Yeah, but if everyone's a tastemaker, then nobody is. Right. Yeah. Word. [laughs] On that note... No, I'm just kidding. 57:01 Um- No, that's, uh, that's- Francis and I are sort of in, like, an unstated competition to end the, like, end the podcast.
You just lost the compe- you lost the competition by not ending it there. [laughs] No. Well, no. 57:16 [laughs] Anyway, this is Tasteland. We'll see you next week. [laughs] Peace. Thanks again for that. Thanks for having me. Bye. [outro music]