This paper is included in the Proceedings of the 32nd USENIX Security Symposium (August 9–11, 2023, Anaheim, CA, USA). ISBN 978-1-939133-37-3. Open access to the Proceedings of the 32nd USENIX Security Symposium is sponsored by USENIX.
https://www.usenix.org/conference/usenixsecurity23/presentation/guo-wentao
The Role of Professional Product Reviewers in Evaluating Security and Privacy

Wentao Guo, Jason Walter, and Michelle L. Mazurek
University of Maryland

Abstract

Consumers who use Internet-connected products are often exposed to security and privacy vulnerabilities that they lack time or expertise to evaluate themselves. Can professional product reviewers help by evaluating security and privacy on their behalf? We conducted 17 interviews with product reviewers about their procedures, incentives, and assumptions regarding security and privacy. We find that reviewers have some incentives to evaluate security and privacy, but they also face substantial disincentives and challenges, leading them to consider a limited set of relevant criteria and threat models. We recommend future work to help product reviewers provide useful advice to consumers in ways that align with reviewers’ business models and incentives. These include developing usable resources and tools, as well as validating the heuristics they use to judge security and privacy expediently.

1 Introduction

Many Internet-connected devices and software have security and privacy vulnerabilities [6, 63, 75, 85]. This endangers consumers, who often lack the time, expertise, or motivation to evaluate security and privacy themselves across a staggering array of options [66, 100, 104].

Efforts such as security and privacy labels [24, 39, 73] are underway to shift this burden toward professionals and institutions. Professional product reviewers, who publish information to help consumers decide what products to use [13], represent another potential path forward. Product reviews often distill the results of extended research and hands-on testing by independent experts. They influence consumers’ perceptions and choices about products [64, 109]; like entertainment media [37] and VPN ads [1], they may also shape consumers’ mental models of security and privacy regardless of the reviewer’s expertise or intent. While not all product reviewers will conduct in-depth technical analyses of security and privacy, we hypothesize that with appropriate support, they are well positioned to help consumers choose Internet-connected products with better security and privacy.

Some advocacy groups seek to influence product reviewers’ coverage of security and privacy. The Digital Standard is a framework intended to guide rigorous evaluation of the security and privacy of Internet-connected products [86]. Civil rights groups have called on product reviewers to stop recommending Ring doorbell cameras in light of the company’s partnerships with police departments for surveillance [28] and are tracking which organizations have done so [30]. However, there is currently no work systematically investigating how product reviewers evaluate security and privacy, and to what extent this role suits the business models and incentives involved in their work.

We fill this gap by conducting 17 interviews with professional product reviewers who evaluate Internet-connected devices and software, to understand whether and how they evaluate security and privacy, as well as what incentives, assumptions, and challenges they have. To prepare, we also analyzed security and privacy content in a small sample of 71 published reviews. We find that product reviewers consider a variety of security and privacy criteria and threat models, but we identify areas where consumers could benefit from more information. Reviewers use some techniques and tools to evaluate these criteria, but they are limited in time and expertise.
While they have some incentives to evaluate security and privacy, reviewers face substantial disincentives and challenges that must be overcome if efforts to assist them are to be successful. Given limited resources, reviewers’ assumptions—about products and about their audiences—inform what they prioritize. Based on our findings, we make recommendations for future research and for resource and tool development to support product reviewers in evaluating security and privacy.

2 Background

Here we provide background on professional product reviewers, defining the scope of this work. We then discuss existing research on the impact of professional product reviews, as well as existing security and privacy resources to help consumers choose safe products.
2.1 Professional product reviewers

Within the scope of professional product reviewers, we include reviewers at media companies (e.g., CNET [fn 1]), nonprofits (e.g., Consumer Reports), and YouTube channels [fn 2] (e.g., Marques Brownlee). As we are interested in shifting the burden of evaluating security and privacy to professionals, we exclude authors of user reviews, such as those aggregated on Amazon, and we focus on reviewers who interact with products beyond first impressions (as opposed to summarizing publicly available information or producing unboxing videos). Reviewers may primarily associate with domains outside of technology but still have expertise in certain products (e.g., a journalist who writes about parenting and also reviews baby monitors).

[fn 1] Examples of reviewer organizations in this paper should not be taken to imply anything about whether they were involved in interviews or not.
[fn 2] In classifying some YouTubers and other creators as professional, we note that many produce video content requiring significant time and skill, and that they often share the same revenue sources as reviewers at more traditional organizations.

Product reviews are funded in many ways, but affiliate marketing has recently become especially impactful and nearly ubiquitous [13, 91, 98]. Affiliate reviews typically include a purchase link identifying the product reviewer, who earns a commission from each sale. Sponsorships are also a common source of compensation for independent reviewers in particular [4]; reviewers may be paid directly by companies for reviews, or they may be sent free or discounted products. By the policy of the U.S. Federal Trade Commission (FTC), reviewers must disclose certain compensation [20], although compliance on social media has historically been low [65].

While these business models raise legitimate concerns about trustworthiness, some reviewers take steps to mitigate bias, such as delegating business decisions to non-review staff. However, there is evidence of crooked business practices: e.g., some VPN review sites allegedly auction the top spot to the highest bidder [83]. Understanding how (dis)incentives affect evaluation of security and privacy across a wider range of Internet-connected products is an aim of this work.

2.2 Impact of professional product reviews

Website rankings indicate that many consumers consult product reviews about technology. A Tranco [57] list of the most popular websites between July 8, 2021, and July 7, 2022 [fn 3] includes at least three websites focused on technology reviews in the top 1,000 globally, ranking CNET at 172, PCMag at 645, and TechRadar at 764. Similarweb [90] estimates that these three sites received 52, 23, and 33 million visits per month on average, respectively, between December 2021 and May 2022 [fn 4].

[fn 3] This list is available at https://tranco-list.eu/list/Z2GLG
[fn 4] We note that the Tranco and Similarweb estimates are not consistent with one another; we provide them simply to indicate rough orders of magnitude.
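As a concrete illustration of this kind of popularity check, the short Python sketch below looks up review sites in a Tranco list. It is a minimal sketch under stated assumptions, not the tooling used for this paper: it assumes the list has already been downloaded (e.g., via the URL in footnote 3) as a CSV of rank,domain rows saved under the hypothetical name tranco_Z2GLG.csv.

```python
import csv

# Hypothetical local copy of the Tranco list (rank,domain per row),
# downloaded ahead of time from https://tranco-list.eu/list/Z2GLG.
TRANCO_CSV = "tranco_Z2GLG.csv"

REVIEW_SITES = {"cnet.com", "pcmag.com", "techradar.com"}

def tranco_ranks(path, domains):
    """Return {domain: rank} for the requested domains."""
    ranks = {}
    with open(path, newline="") as f:
        for rank, domain in csv.reader(f):
            if domain in domains:
                ranks[domain] = int(rank)
            if len(ranks) == len(domains):
                break  # stop early once every domain has been found
    return ranks

if __name__ == "__main__":
    for domain, rank in sorted(tranco_ranks(TRANCO_CSV, REVIEW_SITES).items(),
                               key=lambda kv: kv[1]):
        print(f"{domain}: rank {rank}")
```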
Previous work suggests that professional product reviews do influence consumer behavior. Luo et al. found that the sentiment and volume of technically focused “expert blogs” (prominently featuring product reviews) are correlated with consumer perceptions of PC brands [64]. Analyses of download.com, which features CNET’s professional review alongside user reviews, found that higher professional ratings lead to more user reviews and downloads of software [109], and that positive (but not neutral or negative) professional ratings lead to more downloads of software free trials [58]. Ramesh et al. found that 57% of surveyed VPN users had used recommendation websites to discover and choose among different VPNs, and 94% of these respondents considered these websites trustworthy [83].

Recent lab experiments, despite limited external validity, suggest that reviews attributed to an expert professional may have greater impact than user reviews in certain circumstances [49, 79, 80]. While some older work found that consumers considered reviews less useful when attributed to experts [60], we hypothesize that consumers’ perceptions have evolved with increased awareness of fraudulent user reviews that are paid for secretly by product sellers [42, 47], written by online trolls for political reasons [10], or part of extortion schemes [70]. Indeed, two identical surveys conducted in 2011 and 2016 found that, over time, perceived source credibility and factual basis became more important in consumers’ perceptions of review usefulness [31]. The impact of professional product reviews will likely continue to evolve, as existing technology review sites grow in traffic and revenue [78] and more major media organizations create their own product review operations [33].

2.3 Security and privacy consumer resources

Previous work suggests that consumers value security and privacy information if it is provided when choosing an Internet-connected product, but they do not often prioritize these factors in practice, especially as information is scarce. Emami-Naeini et al. found that some interviewees had considered security and privacy when purchasing an IoT device [25], while Zhang et al. found that few had considered data privacy when installing mobile apps [108]; participants in both studies said relevant information was difficult to find. Few participants surveyed by Ho-Sam-Sooi et al. mentioned security or privacy as a factor when deciding whether to buy a smart thermostat [43]. On the other hand, almost all participants interviewed by Emami-Naeini et al. also said they would pay more for a device if security and privacy information were provided [25]. Supporting this, various experiments have found that when people are given relevant information in accessible formats such as security labels and privacy checklists, they choose products with better security and privacy [51] and may be willing to pay more [43, 46, 99]. More specifically, Emami-Naeini et al. measured the differing impacts of individual security and privacy attributes on people’s perception of risk and their willingness to purchase IoT devices [23].
Some resources and tools are designed to help consumers learn more about the security and privacy of connected products. Mozilla’s free online guide, *Privacy Not Included, covers publicly available information about connected consumer products [34]. Other media organizations consolidate discrete reviews, advice, and news into online security and privacy guides for consumers [26, 87]. Researchers have designed labels to communicate security and privacy information about privacy policies [50], Android apps [51], and Internet of Things (IoT) devices [24]. Similar privacy labels were recently adopted in the Apple [7] and Google Play [35] app stores. Various countries [21, 32, 39, 73] and private organizations [19, 44, 95] are developing programs to provide security and privacy information about IoT devices to consumers.

Some researchers have developed partially or fully automated frameworks for evaluating the security and privacy of Android apps [3, 85] and IoT devices [2]. They occasionally collaborate with journalists and professional product reviewers to inform consumers [18, 38].

Recent work suggests that Apple privacy labels do not yet inform consumers effectively. Li et al. reported that app developers make mistakes when creating Apple privacy labels and find the process time-consuming and overwhelming [61]. In large-scale analyses, Li et al. found that developers rarely create or update privacy labels unless forced [62], and Kollnig et al. found that most apps labeled as not collecting user data actually did so (perhaps unintentionally) through third-party tracking libraries [53]. Lay iPhone users interviewed by Zhang et al. found privacy labels useful but misunderstood them in many ways; most also had not heard of them [108]. While new resources like privacy labels can empower consumers to make better security and privacy decisions, ongoing barriers to effective implementation suggest that oversight by third-party experts, such as professional product reviewers, is complementarily important.

3 Analyzing a snapshot of product reviews

To inform our interviews, we analyzed a small sample of 71 published product reviews, focusing on what security- and privacy-relevant criteria they cover and what techniques and tools they use to evaluate them. As this was an exploratory activity, our findings are limited in scope and depth; however, we describe them here to give readers an impression of how security and privacy are covered in some reviews.

3.1 Review analysis method

For our dataset of product reviews, we focused on three common IoT devices: thermostats, locks (overtly security-related), and doorbell cameras (overtly security-related, with privacy controversies). For each, we devised one search string for list-style reviews and one for reviews of a specific, popular product; e.g., for thermostats, we used “best smart thermostats review” and “Nest thermostat review.” Using private browsing mode, we downloaded the top five relevant results on Google, Bing, and YouTube for each search string, skipping reviews written by users (not professionals) or published before December 2017 (we collected reviews primarily in November and December 2021). Factoring in repeats across Google and Bing, our dataset contains 41 text and 30 video reviews. Table A1 in Appendix A counts the reviews in our dataset by source; we note that the text reviews are clustered in fewer sources, while the video reviews are more diffuse.
Videos had 177,723 views on average, with a median of 88,816.

To analyze the security and privacy content of these reviews, we developed an initial qualitative codebook from other reviews of Internet-connected devices and software, also incorporating concepts from the Digital Standard [86]. Two researchers refined the codebook while collaboratively coding 21 reviews from our dataset; they then coded 9 reviews independently, achieving a Krippendorff’s α of 0.83 averaged across each code that exhibited variation, which indicates good inter-rater reliability [55]. The two researchers then split all remaining reviews. Our codebook is in Appendix D of the supplementary materials [fn 5].

[fn 5] Supplementary materials are located at https://osf.io/m2pe7/?view_only=e6a8443956704fe2b380cfce1def1204.

In general, we coded security- and privacy-relevant criteria even if they were not presented explicitly in that context (e.g., describing how multi-user access works for a smart lock). We reasoned that we were not equipped to judge whether the security and privacy implications would be apparent to consumers—this is a question for future work.
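To make the inter-rater reliability check above concrete, the following is a minimal sketch of how per-code Krippendorff’s α could be computed and averaged, assuming the krippendorff package from PyPI and binary (present/absent) labels from two coders. The variable names and sample data are illustrative only, not the paper’s actual coding pipeline.

```python
import numpy as np
import krippendorff  # pip install krippendorff

# Illustrative binary labels (1 = code applied, 0 = not) from two coders
# over the 9 independently coded reviews; real data would come from the
# qualitative coding spreadsheets.
labels = {
    "encryption":       (np.array([1, 0, 0, 1, 0, 0, 1, 0, 0]),
                         np.array([1, 0, 0, 1, 0, 1, 1, 0, 0])),
    "software_updates": (np.array([0, 1, 1, 0, 0, 1, 0, 1, 0]),
                         np.array([0, 1, 1, 0, 0, 1, 0, 1, 0])),
}

alphas = []
for code, (coder_a, coder_b) in labels.items():
    # Skip codes with no variation: alpha is undefined when every unit
    # receives the same label from both coders.
    if len(set(coder_a) | set(coder_b)) < 2:
        continue
    alpha = krippendorff.alpha(
        reliability_data=np.vstack([coder_a, coder_b]),  # coders x units
        level_of_measurement="nominal",
    )
    alphas.append(alpha)
    print(f"{code}: alpha = {alpha:.2f}")

print(f"mean alpha across codes with variation: {np.mean(alphas):.2f}")
```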
3.2 Review analysis results

Table 1 lists the most common security- and privacy-relevant criteria included in our dataset. More complete results are in Appendix E of the supplementary materials.

Table 1: All criteria included in more than 10% of the reviews for any product type. The table lists the percent of reviews with each criterion, per product type.

Criterion                                Therm.    Lock      Camera
                                         (N = 23)  (N = 24)  (N = 24)
Human/AI processing                      100       83        88
Audit log/notifications                  9         79        79
Limiting/controlling data handling       22        4         92
Multi-user access control                9         83        12
Functional bugs                          35        38        29
Locale of data storage/processing        4         12        83
Software updates                         26        29        29
Can withhold action/capabilities         57        17        12
Data retention                           0         0         67
Encryption                               4         21        29
Multi-factor authentication              4         21        21
Password sec/priv                        0         4         20
Sec/priv reputation of company           0         0         38
Data minimization/justification          4         4         25
Other data sharing                       4         4         25
Sec/priv for special classes of data     9         8         17
Usability/accessibility for sec/priv     0         8         12
Recovery                                 0         17        0

In our dataset, many criteria are included mainly in the context of a device’s functionality: e.g., human/AI processing is usually present because we counted any mention of voice recognition, a way to control IoT devices, and software updates are often described as part of the setup process. Reviews for different product types cover different criteria corresponding to their core functionality: e.g., audit log/notifications is in most reviews of locks and doorbell cameras because notifying owners of people at the door is a popular feature, but it is rarely in reviews of thermostats. When mentioned at all, threat models are also typically tied to functionality (e.g., an intruder bypassing a smart lock to break into the house), although one review mentions botnets, describing them as a “larger societal problem” causing harm to banks and other institutions [81].

Some criteria unrelated to core functionality arise occasionally, with encryption and multi-factor authentication mentioned in 20–30% of lock and camera reviews: e.g., one review lists “no end-to-end encryption” as a reason to avoid a camera [82]. Security/privacy reputation of company, data minimization/justification, and other data sharing are mentioned in 25–40% of camera reviews. This mainly relates to criticism of Ring cameras for sharing data with police [71]: one review questions “whether a company with both financial and operational ties to law enforcement” should be trusted with sensitive personal data [92].

Few reviews mention techniques and tools used to evaluate devices. 56% give some indication of “living with” a device for an extended period of time or in a realistic environment. Fewer than 10% each report checking customer feedback, sharing a device with multiple simultaneous users, communicating with the company, or reading policies/documents. Rarely, reviews mention challenges: one reviewer explains that they do not cover privacy policies or user agreements in detail because “it’s impossible for us to read and analyze every single one of these agreements” [89].

Throughout this analysis, we remained unsure whether the limited discussion of security and privacy was because these aspects were not evaluated at all or because they were a part of the review process that was simply not prioritized for communicating to the audience. Thus, in our interviews we included a section focusing on how product reviewers decide what to communicate in a review and what to leave out.

4 Product reviewer interviews

To dig deeper into how and why product reviewers evaluate security and privacy for Internet-connected products, we conducted 17 semi-structured interviews with 18 product reviewers over video call between February and May 2022, with these research questions:

1. What security and privacy criteria do product reviewers evaluate?
2. How do incentives and assumptions influence their approach?
3. What techniques and tools do they use?
4. What challenges do they face? What resources and tools do they need to be more effective?

4.1 Interview method

Recruitment. We recruited participants who were 18 years or older, spoke English, and had published at least ten reviews of Internet-connected devices and software [fn 6]. We used our best judgment to only include reviews where the reviewer demonstrated they had interacted with products themselves beyond first impressions. In two instances we had two participants from the same organization: P1 and P2, and P9a and P9b. P9a and P9b review as a pair and were interviewed together.

[fn 6] P2 did not satisfy the publication requirement but had significant reviewing experience as a program manager.

To carry out recruitment, we compiled a list of eligible product reviewers from media companies, nonprofits, and YouTube channels that we were familiar with.
We also used search engines extensively to find new reviewers and organizations, focusing on three types of Internet-connected products: smart home devices (e.g., thermostats and security cameras), wearables (e.g., watches and sleep trackers), and software (e.g., tax filing programs and photo storage services). We reached out directly to 144 individuals and organizations in this dataset using publicly provided contact information, yielding 15 participants. We also recruited 2 participants via an industry contact and 1 through snowball sampling. We continued recruiting until we reached saturation [40].

During recruitment, we avoided mentioning security and privacy, in order to reduce sample bias and avoid priming; we framed the interview as a study of how reviewers evaluate connected devices and software in general. During the interview, we did not bring up security and privacy until after asking participants generally what kinds of criteria they consider, to see whether they would be mentioned unprompted. While participants could have learned that we were security and privacy researchers from information online, there was no indication that any knew this other than P1 and P2, who are in security- and privacy-focused roles and were recruited via our industry contact.

Interview design. Our interviews were semi-structured, meaning that we broadly followed a protocol but adapted it and asked follow-up questions as appropriate for each participant. Interviews included four main parts.
First, we asked background questions: e.g., about the participant’s business model and experience. Second, we asked what criteria (first general, then security- and privacy-specific) they evaluate, as well as security and privacy criteria they consider important but do not evaluate. Third, we asked about techniques and tools they use to evaluate security and privacy, as well as challenges they had encountered. We also prompted them to imagine any hypothetical tools, resources, regulations, or industry norms they would want to help them evaluate security and privacy. Fourth, we asked participants how they communicate security and privacy information and how they approach negative reviews. Our interview protocol is in Appendix B. The recorded portion lasted 48 minutes on average.

At the end of the interview, participants completed a four-minute survey on their security and privacy knowledge, organization, and demographics (Appendix C). Participants received a $50 Amazon gift certificate as compensation; three refused compensation due to their organizations’ policies.

Analysis. We recorded audio of our interviews, which was transcribed automatically and corrected manually. The first two authors coded four transcripts collaboratively, developing a qualitative codebook from scratch guided by the research questions. After four interviews, the high-level structure of the codebook was largely stable, so the first two authors coded the rest of the interviews separately, meeting after every two or three to resolve differences and update the codebook. Our goal was to identify and discuss qualitative themes pertaining to our research questions [12], not make quantitative claims, so we did not calculate inter-rater reliability [67]. Our codebook is in Appendix F of the supplementary materials.

Ethics. This study was reviewed and approved by the University of Maryland Institutional Review Board. We obtained informed consent, including for automatic transcription, and we told participants that they could skip any question they were uncomfortable with. We have taken care to not release identifying data about participants or their organizations.

Limitations. As this is a qualitative study with 17 interviews, our findings may not generalize to all product reviewers. To our knowledge, all participants have a primarily English-speaking, U.S. or European audience. There may be self-selection bias, as monetary compensation may provide limited incentive for busy professionals; compounding this, some organizations prohibit compensation and even participation. As a result, participants may disproportionately be enthusiastic about product reviewing, feel comfortable talking about their work because they believe it meets a high standard for quality or ethics, or want to help improve the state of the field; each of these sentiments was expressed by multiple participants. Some participants may have over-emphasized their evaluation of security and privacy out of social desirability bias, especially if they believed this reflected the quality of their work.
To mitigate this, we stressed at the beginning of interviews that there were no right or wrong answers. Throughout the interview, we repeatedly reassured participants that it was fine not to have substantive answers to questions about how they evaluate security and privacy, encouraging them to speak about barriers preventing them from doing so.

4.2 Participant information

Table 2 contains self-reported information about participants’ product reviewing experience and professional circumstances. For privacy, we report only aggregate statistics on demographics in Table A2 of Appendix A.

Table 2: Information on the number of years participants have been reviewing, the number of views for a typical review, and the number of reviewers at their primary organization.

       Years   Views           Team size
P1     3–5     —               30+
P2     1–2     1,000–9,999     30+
P3     10–14   10,000–99,999   4–9
P4     3–5     —               4–9
P5     15–19   10,000–99,999   20–29
P6     3–5     —               20–29
P7     3–5     1,000–9,999     1
P8     1–2     10,000–99,999   1
P9a    6–9     1,000–9,999     2–3
P9b    6–9     1,000–9,999     2–3
P10    40+     —               1
P11    15–19   10,000–99,999   2–3
P12    1–2     1,000–9,999     1
P13    1–2     1,000–9,999     1
P14    3–5     1,000–9,999     1
P15    10–14   —               4–9
P16    3–5     10,000–99,999   4–9
P17    15–19   —               2–3

In the post-interview survey, for some additional context, participants self-reported their knowledge about online security and privacy (via a question developed by Faklaris et al. [27]). They consider themselves generally knowledgeable; 6 strongly agreed that “I am extremely knowledgeable about all the steps needed to keep my online data and accounts safe,” 11 agreed, and 1 answered neutrally. We then measured participants’ online security and privacy knowledge using six questions from a 2019 Pew Research Center poll [104]; responses are summarized in Table 3. Participants are generally knowledgeable, answering each question with greater accuracy than a representative sample of Americans in 2019. The only question answered correctly by fewer than 16 out of 18 participants was about the purpose of private browsing mode; 10 answered correctly, while 6 held the common misconception [106] that online activities would be hidden from visited websites, and 2 chose other incorrect answers.
Table 3: Summary of participants’ responses to security and privacy knowledge questions, compared to responses given by a representative sample of Americans in 2019. No participant answered “Not sure” to any question. Each row gives the question answer (rephrased from the multiple-choice answer), the percent (and number) of the 18 participants answering correctly, and the percent of the U.S. public [104] answering correctly or unsure.

Cookies allow websites to track user visits and site activity — participants: 94% (17); U.S. public: 63% correct, 27% unsure
Advertising is the largest source of revenue for most major social media platforms — participants: 94% (17); U.S. public: 59% correct, 32% unsure
Privacy policies are contracts between websites and users about how a site will use user data — participants: 94% (17); U.S. public: 48% correct, 27% unsure
“https://” in a URL means that information entered into the site is encrypted — participants: 89% (16); U.S. public: 30% correct, 53% unsure
Phishing scams can be encountered through social media, websites, emails, and text messages — participants: 94% (17); U.S. public: 67% correct, 15% unsure
Private browsing mode stops someone on the same computer from seeing one’s online activity — participants: 56% (10); U.S. public: 24% correct, 49% unsure

4.3 General product reviewing practices

In the next several sections, we describe our interview findings. For context, we report the number of interviews in which a theme or point appeared, out of a total of N = 17 (i.e., we do not double-count P9a and P9b, who review and interviewed together). As participants answered questions focusing on different aspects of their work that they personally found most relevant, counts should not be taken as measuring the true prevalence of these themes among our participants.

All participants had reviewed smart home or wearable devices; some also cover associated apps, standalone software, and more traditional devices including computers, phones, tablets, and routers. Five focus specifically on Apple or HomeKit products. Our participants publish video reviews (N = 4), written reviews (N = 3), or both (N = 10); some also mentioned other formats, such as social media or podcasts. They generally described their audiences as consumers or tech enthusiasts, while P10’s audience also includes corporate buyers.

Participants evaluate products through day-to-day use in realistic settings (N = 11) and through testing in a lab or home studio (N = 8). Two do not evaluate products directly: P17 has experience testing products but currently conducts online research and interviews users, while P2 is a product manager overseeing review processes. Seven participants mentioned following a standard evaluation framework. The length of testing before publishing varies greatly by reviewer and by product, ranging from hours to a month or more. Many have updated reviews, generally by revising them directly (N = 10), although on YouTube reviews must be updated in other ways, such as adding a pinned comment or updating the description (N = 6). When asked how they determine what product to review next, participants gave a variety of factors, including picking products that are popular (N = 9), are interesting (N = 7), or have brand presence (N = 6).

Business model. Affiliate marketing (N = 13) was the most common source of compensation reported, followed by ads (N = 9), sponsorships (N = 6), free products (N = 4), and subscriptions (N = 3).
While we did not prompt participants for measures they take to reduce potential resultant bias, nine mentioned these organically, including maintaining separation between product reviewers and people who make business decisions (N = 5), avoiding sponsored reviews (N = 4), avoiding ads (N = 2), buying all products themselves (N = 2), and not using products outside of reviewing work (N = 2).

4.4 Security and privacy criteria

Only four participants mentioned security or privacy when asked what criteria they evaluate. Of these, P1 and P2, who focus on reviewing security and privacy, listed many relevant criteria; P9a mentioned checking for local and cloud storage because that was a privacy concern their audience cares about; and P8 mentioned security only to say that they do not cover it much unless there is a glaring issue.

After prompting, all participants reported at least sometimes evaluating security and privacy. Some always do: for smart home devices, P1’s organization follows a testing framework covering authentication, encryption, security over time, and more, and P6’s organization always considers two-factor authentication and whether data storage and processing are on the device or in the cloud. However, others rarely do: P8 said, “If I see something [security- or privacy-related] that seems glaring, I’ll call it out, but my audience really isn’t that type.” Some organizations have separate security and privacy experts who help with reviews (N = 6). Relatedly, five participants saw security as someone else’s job: P5 said, “Those kinds of vulnerabilities [‘gaping security flaws’] are things that security experts find later on or get publicized as an exploit, which isn’t something I can necessarily test.” All criteria mentioned by at least two participants are listed in Table 4.

Table 4: All security and privacy criteria evaluated by at least two participants.

Count  Criterion
11     What data is collected, and how is it shared/used?
7      Reputation of company/product (e.g., breaches)
6      Locale of data storage/processing (local or cloud)
5      Encryption
5      Known vulnerabilities
5      Multi-factor authentication
5      Physical shutters and shut-off switches
4      Data controls
4      Measures to secure data against hacking
4      Transparency about data handling
3      Authentication
3      Data deletion
3      Geographic locale of data storage/processing
3      Justification of data handling
3      Length of data retention
2      Full platform compatibility
2      Security over time
2      Software updates
2      Usable security and privacy features

Reviewers prioritize criteria differently. With many possible security and privacy criteria, reviewers must prioritize. Eight participants mentioned tailoring priorities to the type of product. Sometimes, this means prioritizing different criteria:
P1 said, “The most important thing for IP cameras is the audio data and video data. But for smart TVs, people are more caring about if they get monitored while they’re watching TV.” Other times, this means elevating security and privacy for more concerning products: P6 said, “Cameras are scrutinized a lot harder than other devices, because that’s actually a gateway into your home. And like how terrifying was it two years ago when Ring cameras were getting hacked and actually like communicating with children in their homes?” And sometimes, this means deprioritizing security and privacy if they were perceived as forgone: P10 said, “If I’m reviewing security cameras, it’s a big deal. If I’m reviewing a social media app—you’ve thrown away your privacy.”

Participants differed in the overall weight given to security and privacy. P6 said their organization would not recommend products that violate a “baseline” of security, such as transmitting maps of a user’s home insecurely. Similarly, P17 said that at a previous organization, they would not recommend a product with missing security and privacy information, even if it scored well on all other criteria. In contrast, other participants viewed security and privacy as just two of many factors: P10 mentioned a formula with various weighted criteria that includes security for certain products. Still others generally did not consider security and privacy: P15 said that given a limited word count, privacy is one of the first aspects their organization will cut, because functionality is more important.

Reviewers sometimes recommend how to configure or use products. P1 guides audiences through data controls such as opting out of data sharing, and P3 teaches audiences to prevent devices from communicating with the Internet using a HomeKit router. P11 provides general security advice in their reviews, emphasizing the importance of installing updates, using password managers, and enabling multi-factor authentication. Given limited space, however, some create or link to separate security and privacy content: P6’s organization provides how-to guides on deleting recordings, resetting devices, and more. P6 views providing this advice as part of responsible reviewing.

4.5 Incentives and responsibilities for covering security and privacy

The extent to which product reviewers cover security and privacy, as well as what they prioritize, is influenced by perceived incentives, disincentives, and responsibilities.

Protecting their reputation is an incentive for reviewers to evaluate security and privacy. Ten participants said their reputation is tied to their reviews, meaning that recommending insecure or invasive products could hurt their credibility.
P8 said, “I would hate to have a glowing review about a product, and then two weeks later, they have a data breach because they did something stupid—they didn’t encrypt something...[it] makes you look bad.” Some described this as a responsibility: P10 said not evaluating important security and privacy criteria would be cheating their readers. However, one participant came to the opposite conclusion: P12 said concern for their reputation makes them hesitant to comment on security and privacy without more expertise, because making a wrong assertion could discredit them and endanger their audience.

Some see audiences as uninterested in security and privacy. Six participants cited lack of audience interest as a disincentive for including security and privacy in their reviews. P8 said, based on the questions their audience asks, “There are definitely people out there that want to see [security-related content], but there’s a lot of people that don’t care.” P11 said, “If I do a video about [security vulnerabilities]...it doesn’t get searched for; it doesn’t get watched; nobody cares about it,” and P6 recalled creating security- and privacy-focused content that received “horrible traffic.” Even when manufacturers highlight security and privacy features in reviewer guides, P4 said their organization often skips them because they’re not “sexy,” adding that “No one’s really complained.”

Even when reviewers do evaluate security and privacy, they may avoid reporting their findings to their audiences due to lack of interest. As mentioned in Section 4.4, P6’s organization examines all smart home devices for a security baseline. However, they may not write about this, because their audience may not understand or care. If a product does not meet the baseline, “We wouldn’t even be recommending the device in the first place.” When asked, P6 guessed that some but not all audience members are aware of this rule.

Some audiences are interested in security and privacy. On the other hand, audience interest is sometimes an incentive to evaluate security and privacy.
P14 said, “Sometimes the audience will be wondering, what if I don’t want this camera on, or how can I protect myself?” More narrowly, P17 said, “A lot of commenters and people emailing you will be like, ‘This thing’s dialing into China.’” And particular audiences may care: P2 said their organization increased their coverage of security and privacy in order to appeal to younger audiences and people with children.

Incentives against negative reviews impede publication of security and privacy concerns. Though reviewers have some incentives to consider security and privacy, these do not necessarily translate into incentives to publish reviews of products with major concerns. Many participants said they avoid publishing negative reviews (N = 10) or picking bad products to review in the first place (N = 8): P11 said, “I’m not comfortable right now with the security of [a particular IoT device], so I just won’t even [review] it.” Thus, consumers may not receive information about security and privacy risks.

Participants gave different justifications for avoiding negative reviews. P16 said reviewing bad products doesn’t prevent people from buying them and may even be counterproductive by giving them more attention: “there’s still going to be people that buy a bad product because it’s cheap....I don’t want to give it air time.” P7 suggested consumers will avoid bad products even without the help of negative reviews, citing a poorly received product where “nobody reviewed it, and it’s not selling.” And multiple participants said consumers are less interested in negative content: P11 recalled posting a video highlighting bad products that weren’t reviewed, but they said those types of videos did not attract much traffic. Similarly, P4 said their organization avoids publishing negative reviews because people will get the message from the headline without reading the review or buying the product, depriving them of revenue from ads and affiliate marketing.

Negative reviews are important to some. Four participants mentioned that avoiding negative reviews would decrease their credibility, and three felt it was their duty to publish negative reviews that were in consumers’ best interest. For example, P13 said when audiences question their integrity based on the fact that they receive affiliate revenue, they “often point to other videos where I would make more money if I’d pushed that product, and I don’t.”

Reviewers who do not publish negative reviews may still mention security and privacy concerns for products that receive a neutral or positive review. In a review of a device that shared sensitive data with servers by default, P3 warned their audience, “If you aren’t comfortable with that, don’t get this product, or here’s a way that you can limit that.” However, overall, we find that incentives against publishing negative reviews more likely dissuade reviewers from informing consumers of security and privacy concerns.

Thorough evaluation is limited by finances. Across different kinds of organizations, financial constraints limit security and privacy evaluation.
P7 said creating content on platforms like YouTube takes time and often doesn’t pay well, which means that “putting something through its paces, like legitimately testing something...is just not possible.” And P1 pointed out that searching for security and privacy issues often leads to “finding nothing in the end,” meaning that “the constraints of the budget really make a huge difference” to whether they can investigate a potential issue.

Some aim to protect consumers by helping companies detect issues. Given incentives against negative reviews, some participants expressed a duty to protect consumers in other ways. Eight mentioned disclosing issues or vulnerabilities they discover to companies. P8, who tries to avoid publishing negative reviews, identifies as a “de facto beta tester” and has delayed publishing reviews until bugs they reported are fixed. They said, “If I do have a closer relationship with a company, I can maybe help drive them in a more pro-consumer direction.” P7, who avoids publishing negative reviews altogether, wanted to be more like a beta tester, with earlier access to products so their feedback could be taken into account before release. In some cases, companies may not be receptive to this feedback: P1 said, “Sometimes, when we find out some issue on the device—say, your device security is not up to date...the company will say, ‘No, it’s good!’” To address this, they suggested a law to intervene when companies “ignore security researchers who are just being kind and trying to help.”

4.6 Assumptions guiding prioritization

Along with incentives, product reviewers’ assumptions about security and privacy play an important role in their choice of priorities. (We defer comment on the reliability of these beliefs to Section 5.)

Apple, HomeKit, and other intranet-based products are seen by some as secure and private. Eight participants believe Apple and HomeKit-certified products to be inherently secure and private, including all five who focus on those brands. For example, P16 said they did not think much about security, because they “just implicitly trust” HomeKit-certified products, and P7 trusted HomeKit-certified products despite the concerns of their audience: “You got a lot of people who don’t trust, like, new, unknown Chinese brands. And I say, well, Apple has obviously verified, because they have access to HomeKit. So, if Apple trusts them, I trust them.”

Participants trust HomeKit devices in part because they are configurable to allow only local storage and processing, which significantly limits potential attacks. P3 said, “With HomeKit, everything is like inherently secure. Stuff runs local; there’s no external server calls. If someone was trying to hack into your smart home devices, they’re gonna have to basically be in your house, on your home Wi-Fi network.”
Relatedly, especially for locks, P8 said they encourage their audience to use local mesh networking protocols, such as Zigbee, Z-Wave, and Thread, explaining that they are “effectively 100% secure” because a compromised device “can’t hop over to the Internet and go out and get instructions.”

Participants cited a variety of other reasons for trusting Apple and HomeKit. P3 trusts Apple’s certification process for HomeKit devices, describing it as “the gold standard in testing.” They added, “While we’ve heard a multitude of stories of security and privacy issues with Amazon assistant and Google Assistant products, that has never been the case for a HomeKit one.” P7 trusted HomeKit devices because of the end-to-end encryption used to transmit and store data, as well as the “kind of ridiculous” certification process. They specifically called out the comparatively low number of HomeKit-certified devices, unlike other smart home platforms, where a multitude of devices implies “super low standards.” Participants also mentioned trusting Apple, as a company with a reputation for security and privacy to uphold: P12 said, “Amazon and Google are data companies, and fundamentally that makes me not trust them as much....You are the product. Whereas with Apple, they have a long-standing reputation that actually privacy is at the center of what they do.”

Due to their trust in HomeKit, some participants do not cover security and privacy unless issues arise: P7 said, “People who are watching my content understand the level of security and privacy that comes along with HomeKit in general. So, I used to talk about it a lot, but I stopped, because there’s no need to—because everybody understands that it’s just as secure and as private as can be....Unless there’s an issue. If there is an issue, I’ll talk about it.” Two participants perceive unusual HomeKit integration as a security and privacy issue unto itself: P16 said products that require an app other than Apple Home raise red flags.

Participants sometimes explain to their audience or remind them that HomeKit devices are safe (N = 2), but they also try not to repeat this too much across reviews (N = 2): P12 described a “tacit assumption or...maybe false hope” that audiences would already know this from watching their other videos. In trying not to bore their dedicated audiences, though, reviewers risk omitting important security and privacy context for others who happen upon their reviews and may not share the same expectations.

Reviewers make assumptions about price and prominence. Six participants said free or cheap products typically have privacy trade-offs: P3 said, “If something is really affordable or has some sort of monthly free cloud option...you’re the product.” Relatedly, P1 said their organization associates investment in security and privacy features, such as encryption and vulnerability disclosure programs, with competence: “If companies are able to put money in those areas, then we trust that the companies are able to make the device well.”

Three participants believe that prominent brands are less likely to have security and privacy issues; conversely, P13 tells their audience that bigger companies are bigger targets. These assumptions may influence the level of scrutiny reviewers give to different products.

Threat models inform how reviewers evaluate and communicate security and privacy. Participants expressed many beliefs about what threats are realistic.
Three asserted that simple attacks, such as dictionary attacks on passwords or simply picking a lock, are more common; this was given as a rationale for prioritizing simple threats over complex ones when evaluating security and privacy. P17 said the app for a smart device is often more concerning than the device itself, adding that they wished they were better at packet sniffing in order to investigate these concerns. P11 argued that by making products easy to use, companies can leave them vulnerable; in reviews, they recommend against connecting network-attached storage devices to the Internet, in order to prevent ransomware and other problems. P11 also argued that attacks by third parties (e.g., hackers distributing ransomware) are more concerning than what companies do with user data; as such, they may focus on different aspects of security and privacy than reviewers such as P4, who only mentioned companies that “try to figure out ways to sneakily do things” with their own products as a threat.

Some believe security and privacy are impossible or impractical. Five participants said all Internet-connected products are fundamentally insecure. Most simultaneously acknowledged that protective measures are still valuable: P10 added, “We can put layers of trying to protect ourselves,...and I expect that from the products.” However, P5 was more fatalistic, saying anything that connects to the Internet is a security hole: their organization doesn’t prioritize security for most products, because vulnerabilities are “a fundamental of the [IoT] category and just of all network technology.”

Similarly, seven participants expressed hopelessness about privacy. Some argued privacy erosion is inescapable in modern life generally: P16 said, “We’ve got Amazon spying on us, Google spying on us as well—it’s a global thing,” which contributes to their belief that it’s “not worth the hassle” to focus on privacy in reviews. Others argued that Internet-connected products inherently sacrifice privacy: P9a said, “If you’re someone that’s starting to put smart home stuff in your house, you’ve already given up on the privacy thing.”

Three participants argued that protecting security and privacy is technically possible but often inaccessible to most consumers. P8 personally believes it is important for a home network to isolate IoT devices, but they said doing so is beyond most people’s capability: “You have to go buy an enterprise-level firewall; you have to segment out your Wi-Fi; you’ve gotta figure out which ports you have to open; you gotta test it...it’s not easy.” Overall, these beliefs have clear potential to shape reviewers’ judgment about the extent to which security and privacy are worth evaluating.

Since total security and privacy is impossible, it makes sense that six participants emphasized tradeoffs as personal decisions for consumers to make for themselves. They often saw their role as informing these decisions without making strong recommendations: P13 said, “I just try to lay all that out there and then try to give my opinion, which generally errs on the side of, like, ‘Hey, if you want to survive in this world, this day and age, you’re going to probably use some products that are connected.’” Similarly, P14 said, “You have to figure out, how much privacy do you want to give up for the features that you want to get? So, yeah, that’s what I would say. Not so much as like, ‘Don’t buy this just because of that.’”
4.7 Techniques and tools

Participants reported using 14 different types of techniques and tools to evaluate security and privacy, listed in Table 5.

Table 5: Techniques and tools used by participants to evaluate security and privacy.

Count  Technique or tool
5      Ask company questions
4      Examine data in transit
3      Reading privacy policies and other docs
2      Static and dynamic analysis
1      Automated privacy policy analyzer
1      Check customer feedback
1      Document manager for privacy policies
1      Evaluate code and libraries
1      Intuition
1      Monitor network security using firewall
1      Security and privacy label
1      Tool to visualize network traffic
1      Track privacy policy updates via hashes
1      Verify effectiveness of security and privacy features

Seven participants reported techniques and tools that involved interacting directly with products: e.g., using Wireshark or routers with network monitoring capabilities to examine data in transit, or using static analysis tools to evaluate code. In addition to technical tools, some participants apply intuition from years of experience: P6 said, “After using a lot of smart home apps and services, you start to develop a little bit of a spidey sense of what looks like something you want to connect to your home network or not.” Five reported asking manufacturers questions about products. Four said they review privacy policies and other written documentation, with two using custom tools to automatically distill important components of privacy policies or track their changes over time. On the other hand, four participants explicitly mentioned not conducting penetration testing out of practicality.
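To ground the “examine data in transit” technique, here is a minimal sketch of the kind of summary a reviewer might pull from a packet capture of an IoT device. It is illustrative only, not a tool any participant described: it assumes scapy is installed, a hypothetical capture file named device.pcap recorded on the reviewer’s network, and the device’s local IP address.

```python
from collections import Counter

from scapy.all import rdpcap, IP, TCP, UDP  # pip install scapy

PCAP_FILE = "device.pcap"   # hypothetical capture of the device's traffic
DEVICE_IP = "192.168.1.42"  # hypothetical local address of the device

# Ports whose traffic is typically encrypted (HTTPS, DNS-over-TLS, MQTT
# over TLS); anything else outbound deserves a closer look in Wireshark.
LIKELY_ENCRYPTED_PORTS = {443, 853, 8883}

destinations = Counter()
cleartext_candidates = set()

for pkt in rdpcap(PCAP_FILE):
    if IP not in pkt or pkt[IP].src != DEVICE_IP:
        continue  # only consider traffic the device itself sends out
    layer4 = pkt[TCP] if TCP in pkt else pkt[UDP] if UDP in pkt else None
    if layer4 is None:
        continue
    dst = (pkt[IP].dst, layer4.dport)
    destinations[dst] += 1
    if layer4.dport not in LIKELY_ENCRYPTED_PORTS:
        cleartext_candidates.add(dst)

print("Servers contacted by the device (IP, port, packet count):")
for (ip, port), count in destinations.most_common():
    flag = "  <-- possibly unencrypted" if (ip, port) in cleartext_candidates else ""
    print(f"  {ip}:{port}  {count}{flag}")
```

A port-based heuristic like this is crude, and each flagged destination would still need manual follow-up in Wireshark, which is part of why participants wanted the richer tools described in Section 4.8.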
Limited expertise and time impede reviewers’ use of techniques and tools. Participants reported numerous challenges, both to evaluating security and privacy and to reviewing Internet-connected products in general. Limited security and privacy expertise was a common barrier (N = 11), as was limited time and concentration (N = 6). These were often described as reasons for not evaluating security and privacy in the first place: P17 said, “If a company tells you that something is two-way encrypted, I don’t have the skillset to prove it one way or the other.” P8 does not use tools such as Wireshark and syslog, which can be used to monitor relevant device and program behavior, because “it takes knowledge and time.” However, some participants did mention experiencing these challenges while actually attempting to evaluate security and privacy: P6 recalled their “eyes glossing over a little bit during privacy policy reading” early in their career, due to unfamiliarity with “jargon.”

Lack of transparency is a common barrier. Nine participants cited lack of transparency or honesty from companies as a challenge, especially for techniques that rely on asking questions or reading documents. P3 was reluctant to rely on Apple privacy labels, noting instances where labels did not match behavior. P8 said many small companies are particularly opaque about backend practices; conversely, P17 pointed out that large companies have the resources to fight transparency. Citing Apple’s history of litigation to prevent jailbreaking, they said, “You have to take them at their word and their reputation...they don’t have any open-source code; they don’t disclose a lot more than is necessary governmentally about their products and how the security works on them.” P17 also gave an example from their own experience of a device made by “one of those like shell companies inside of a shell company inside of a shell company,” which would not answer questions about its data model.

P1 said vague or contradictory privacy policies are difficult to evaluate: “We do see a lot of companies using really vague language and kinda talking good things in the first paragraph; then in the second paragraph they basically just contradict themselves and then leave an open-ended loophole at the end for themselves to do whatever they want....I feel powerless when we evaluate it.”
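One lightweight response to opaque, shifting policies appears in Table 5: tracking privacy policy updates via hashes. The sketch below shows one way such a tracker could work; it is a speculative illustration assuming the requests package and a hypothetical policy URL, not a reconstruction of any participant’s actual tool.

```python
import hashlib
import json
import pathlib

import requests  # pip install requests

# Hypothetical policies to watch; a real tracker would also strip
# boilerplate HTML so cosmetic site changes don't trigger false alerts.
POLICIES = {
    "example-camera": "https://example.com/privacy-policy",
}
STATE_FILE = pathlib.Path("policy_hashes.json")

def check_for_updates():
    """Alert when a watched policy's content hash changes between runs."""
    state = json.loads(STATE_FILE.read_text()) if STATE_FILE.exists() else {}
    for name, url in POLICIES.items():
        text = requests.get(url, timeout=30).text
        digest = hashlib.sha256(text.encode("utf-8")).hexdigest()
        if state.get(name) not in (None, digest):
            print(f"{name}: policy changed since last check ({url})")
        state[name] = digest
    STATE_FILE.write_text(json.dumps(state, indent=2))

if __name__ == "__main__":
    check_for_updates()  # run periodically, e.g., from cron
```

A hash only says that something changed, not what; pairing it with an archived copy of each version would let a reviewer diff the actual wording.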
4.8 Desired tools and resources

Participants suggested tools that would aid in security and privacy evaluation. While we prompted participants to imagine tools they would personally use, we note their responses are nonetheless hypothetical.

Better tools to inspect network traffic (N = 10). P17 described a tool to connect with a device and report what servers and IP addresses it contacts, what data it sends and receives, and more, comparing this to a hacker’s blog post breaking down flaws in an IoT device. They expounded, “Here’s everything about this device that this little magic mouse can figure out. ...What do you think about the fact that it’s sending unencrypted traffic to Turkey—or to Indiana? What do you think about the fact that it’s running a Linux kernel that’s like five years out of date?” Other participants described more narrowly scoped tools: P16 wanted a tool to determine if devices were “dialing home” to an external server, especially in another country, and P1 wanted to decrypt data between a device and the Internet to see what data is sent to whom.

While most did not reference existing tools, P13 wanted “a tool like Wireshark that I actually understood better.” As a “wannabe network person” and not a security expert, they consider Wireshark “cool to know but not worth my time to learn.” Ideally, they would like information to be collected automatically and presented in a dashboard. Similarly, P3 said they currently use routers to monitor which IP addresses a device connects to and how frequently, but they also want to know what data is being transmitted.

Resources and tools to verify encryption (N = 3). P10 described a third-party organization to evaluate encryption: “I don’t want to have to just trust the website saying this is an encrypted site. I want a certification, from my trusted overseeing organization.” P7 wanted a tool that would not only verify encryption but also facilitate understanding; they called out end-to-end encryption as “marketing fluff” that left them asking, “What does that mean?”

Automated monitoring of network traffic for suspicious behavior (N = 4). P9a described a “firewall on steroids” that would use AI to detect unexpected behavior. P8 described using machine learning to detect concerning traffic, such as “weird traffic going to Russia” or a light switch transmitting “three gigs of data in the last 24 hours.”
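Participants framed these as hypothetical, but even a crude volume heuristic can approximate P8’s example. The sketch below, which assumes a log of per-device daily byte counts rather than any specific router or firewall product, flags devices whose latest 24-hour upload volume far exceeds their own historical baseline.

```python
from collections import defaultdict
from statistics import mean, stdev

# Hypothetical log entries: (device_name, day_index, bytes_uploaded),
# e.g., exported from a router that tracks per-device usage.
usage_log = [
    ("light-switch", 0, 2_000_000), ("light-switch", 1, 1_500_000),
    ("light-switch", 2, 1_800_000), ("light-switch", 3, 3_000_000_000),
    ("thermostat",   0, 5_000_000), ("thermostat",   1, 5_500_000),
    ("thermostat",   2, 4_800_000), ("thermostat",   3, 5_200_000),
]

def flag_anomalies(log, sigma=3.0):
    """Flag devices whose latest day is far above their own baseline."""
    by_device = defaultdict(list)
    for device, day, nbytes in sorted(log, key=lambda row: row[1]):
        by_device[device].append(nbytes)
    for device, history in by_device.items():
        *baseline, latest = history
        if len(baseline) < 2:
            continue  # not enough history to establish a baseline
        threshold = mean(baseline) + sigma * stdev(baseline)
        if latest > threshold:
            print(f"{device}: {latest / 1e9:.1f} GB uploaded in the last "
                  f"day (baseline ~{mean(baseline) / 1e6:.0f} MB/day)")

flag_anomalies(usage_log)
```

A production version would operate on live flow data and account for legitimate spikes such as firmware downloads, which is exactly the kind of tuning participants said they lack the time and expertise to do.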
Assistance evaluating the full life cycle of data (N = 7). Participants wanted to know what happens to personal data over the course of its existence, beyond network traffic. P4 wanted a tool to reveal not only what data is sent where, but also where it is stored and how it is used. Similarly, P8 wanted an "x-ray" into "everything that the device is doing, every stage of its operation," including how data is stored and consumed on the backend.

Rather than a tool, four participants wanted companies to disclose this information. This drew comparison to existing disclosure and transparency requirements: P13 suggested the FTC might implement such a program, P8 compared it to SEC filings designed to prevent insider trading, and P9a pointed to the GDPR as an example.

Some participants mentioned using privacy policies to understand the data life cycle. P15 wanted a tool to analyze privacy policies and other documents and produce an accessible summary of data collection, use, and controls, similar to Apple's privacy labels. P9b and P15 also wanted a requirement that privacy policies be written in plain, accessible language.

Labels (N = 7). Participants expressed interest in security and privacy labels or ratings, citing as analogues the French government's repairability index and UL standards for electronics products. Some envisioned a single numeric score, sometimes with a link to a website with more information: P7 and P17 both described a color and a number from 1 to 10 indicating general security and privacy risk, and P8 wanted a number from 1 to 4 indicating how data is stored and shared. On the other hand, one participant whose organization already creates security and privacy ratings said they look forward to providing information for labels.

Automated detection of vulnerabilities (N = 3). P3 described a hypothetical tool that would produce a rating from 1 to 10 on how "hackable" a product is, while P12 described a tool that would show them how an attacker might try to break into a system and what the attacker might be able to do. P1 wanted extensions to existing static analysis tools to better analyze firmware and iOS specifically.

5 Discussion

In this section, we characterize the kind of security and privacy advice that product reviewers provide, including suggestions for future research and outreach. We also recommend technical tools, resources, and other changes to support product reviewers in evaluating security and privacy without requiring them to become experts or devote significantly more time.

Reviewers have mixed incentives to evaluate security and privacy, but they are uniquely positioned to guide consumers to safe choices. Product reviewers are incentivized to evaluate security and privacy in order to protect their reputation. However, they are disincentivized in other ways: many do not review bad products or publish negative reviews, they may leave out information because audiences are not interested, and evaluating security and privacy is plain difficult. While our participants are generally well positioned to understand their current audiences, future research could explore consumers' knowledge and interests more rigorously and at scale, in an effort to find more effective ways to communicate security and privacy information. Some reviewers assume that negative reviews would have little impact or even do harm by providing "air time" to bad products; prior work suggests this may not be true, finding that negative user reviews can have large dissuasive effects [74, 105, 110]. Further data specifically in the contexts of security and privacy and of professional product reviews could motivate reviewers to change their approach, although financial incentives to avoid negative reviews would still remain.

Nevertheless, product reviewers are very diverse: for every disincentive that several participants mentioned, someone else reported an opposite incentive. Despite the disincentives,
reviewers are technical professionals who currently provide important security and privacy advice to consumers, even if it is infrequent or skewed. They fill a crucial gap: given the sheer quantity of Internet-connected products, they are arguably the only people evaluating the security and privacy of popular products at all with regularity and timeliness. We see great potential in efforts by the research community to help product reviewers guide consumers, in ways that mesh with their existing processes, business incentives, and resources.

Reviewers cover meaningful criteria, but there are gaps. Participants reported evaluating a wide variety of security and privacy criteria. Overall, they tend to prioritize protection of user data, which can be difficult for reviewers to evaluate, and to deprioritize indicators that a product can maintain security and privacy over the long term, such as audits, update support, and presence of bug bounties; these were neglected by the published reviews we analyzed as well. These long-term indicators, however, can often be evaluated without new techniques or tools: e.g., by looking online or asking a company directly. We recommend that more product reviews include these longer-term criteria, so that consumers know which products are likely to remain functional and safe. Various groups could play a role in encouraging product reviewers to incorporate this information, including the FTC, which regulates monetized content; the Digital Standard, which could provide information on suggested priority and ease of evaluation in its list of criteria; and companies themselves, who could highlight security and privacy features that distinguish their products from others. Importantly, reviewers may need to explain to consumers why these features are protective; Emami-Naeini et al. found that some people believe security audits and updates indicate poor security [23]. Product reviewers themselves may also benefit from this messaging, as we found that some may share these misconceptions: P9a described firmware updates as a sign of "crappy engineering" indicating that products were released before they were ready.

Threat models seldom include botnets or misuse by legitimate users. Our participants discussed potential adversaries including manufacturers, hackers, and governments, overwhelmingly with the goal of accessing user data. No participants, and only one review we analyzed, mentioned botnets or any other threat primarily targeting someone other than the product owner. However, botnets are among the attacks IoT device users are most likely to experience, as they are employed at scale to conduct distributed denial-of-service attacks [6], spread spam and ransomware [68], and mine cryptocurrency [54].

It is perhaps unsurprising that reviewers rarely consider botnets, as their audiences may not care; these attacks often cause little direct, observable harm to the end user. However, they may have important secondary effects, such as disabling crucial protections in Windows Defender [68], and can be detrimental to others and to society at large. Future research could measure how botnet infections affect end users, from collateral security and privacy harms to increased energy costs and device wear, as well as consider the effect of altruistic messaging, in order to motivate product reviewers and end users to consider this threat.

Our participants also did not discuss preventing misuse by physically co-located users, such as guests using smart devices to access private information without permission.
Accordingly, no interviewees and few analyzed reviews mentioned multi-user access control or parental controls as a security or privacy criterion. Again, this is unsurprising, as people often deprioritize these concerns and put their trust in social norms [107]. However, work by Moh et al. indicates this kind of everyday misuse is widespread for smart home devices [69]. Product reviewers could help inform potential new users about these considerations, and they could elevate products with more usable and secure access controls.

Defeatist attitudes toward security and privacy do not benefit consumers. While understandable, reviewers' beliefs that security and privacy are impossible or impractical obscure the more complicated reality: there are meaningful, if imperfect, steps consumers can take to reduce risk. Systematic evaluations of IoT devices have found, for example, that while many devices rate poorly on security and privacy, some do significantly better [2, 63]. Product reviewers could help empower consumers—who themselves frequently also believe that unwanted data collection [8] and hacking [37] are inevitable and uncontrollable—by providing security and privacy advice while acknowledging the undue burden that falls on consumers to protect themselves. Real-world success stories should be highlighted as evidence that security and privacy are worth caring about; in addition to outreach from advocates and companies, trade shows could feature these stories, as many product reviewers regularly attend events such as CES, which brings together people interested in consumer electronics.

Security heuristics can be useful but may not be valid. Our participants used heuristics, such as a manufacturer's reputation or whether a product connects to the Internet, to quickly evaluate security and privacy concerns. These assumptions often have some basis in fact and allow reviewers to provide some kind of security and privacy guidance without complex tools and analysis. However, they are not universally reliable and could lead to misinformed recommendations.

Ideally, security and privacy labels generated by a trustworthy third party might supplant these heuristics and provide accessible, reliable information for reviewers who face barriers doing their own evaluation. App privacy labels are a step in the right direction, but our own study and related work show that more work is needed to earn the trust of product reviewers and consumers. Many efforts to design IoT labels are
underway [24, 39, 73]; we recommend that product reviewers be considered important stakeholders who could digest information from an eventual label on behalf of consumers.

Specifically, several participants expressed strong trust in HomeKit-certified devices. This seems partially justified: certified devices are required to follow certain security practices, including in encryption [5]. Nevertheless, vulnerabilities have been reported in HomeKit itself [72, 97], in its integration with other IoT management frameworks [45], and in numerous HomeKit-certified devices [14–17, 88]. Some problems could be avoided if end users follow reviewers' recommendations to use these devices only through the Apple Home app or disable remote processing, but it is unclear to what extent they do so. Similarly, while other local mesh networking protocols in properly configured devices should prevent many Internet-based attacks, vulnerabilities still exist [48].

Overall, more work is needed to evaluate HomeKit certification and other frameworks, such as Matter, rigorously, at scale, and as actually used in practice; this would provide important insight into when reviewers' heuristics make sense and what other security and privacy criteria to prioritize. As more information emerges, ensuring it is well publicized is crucial; anecdotally, we observed that participants were closely tuned in to news about IoT security, with many mentioning the Wyze camera hack, Apple's certification processes for HomeKit devices, and Matter's planned security features.

Better tools could help evaluate security and privacy. Given the strong time and audience-interest limits many reviewers face, only some will be willing or able to independently evaluate security and privacy; for these few, it is crucial to provide tools that are extremely usable and accessible.

Most commonly, participants asked for tools to help understand network traffic quickly and easily, despite limited expertise. Many existing tools, such as packet sniffers, can be confusing even for experts [76, 103]. Instead, reviewers could use tools that provide summaries and visualizations of network traffic [101, 102], or even better, automatically flag concerning behaviors [22, 36]. Because many reviewers live with devices for a period of time, signals of misbehavior in real time [96] could also be useful. Further work is needed to make tools like these usable (and maintainable) in practice, and to make reviewers aware of them.

Relatedly, participants wanted to understand how data is stored, shared, and used after it leaves the home. While some existing work attempts to evaluate these issues in a black-box manner [2, 3], in many cases this is infeasible without companies making significant infrastructural changes to improve data handling and transparency [11]. The best path forward here may be regulations and norms that improve transparency, responsiveness, and truthfulness from manufacturers and enable audits to validate provided information.

Some participants imagined black-box tools that would automatically determine security properties, such as "hackability." While achieving this in the general case is intractable, program analysis tools can provide relevant insights [29, 52, 56, 59, 77]. Like packet sniffers, however, many program analysis tools are difficult even for experts to use [93, 94], and most are designed for use during software development.
Reviewers would need new tools, designed for easy setup without source code, that provide interpretable snapshots of security and privacy characteristics.

While no participants requested tools to help understand privacy policies, we note that such tools could be useful to reviewers who already review policies manually. Several such tools have been developed [9, 41, 84], but they may not adequately meet the needs of reviewers, and reviewers may not be aware of them.
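As a sense of how even a shallow analysis could save reviewers time, the toy sketch below surfaces policy sentences that mention data collection or sharing. It is our illustration only; the keyword list and file name are hypothetical, and the research tools cited above use far more capable NLP.

```python
# Toy sketch: surface privacy-policy sentences about collection and sharing
# so a reviewer can skim the relevant passages first. A keyword pass like
# this only illustrates the idea; real tools use trained NLP models.
import re

KEYWORDS = ("collect", "share", "third-part", "third part", "sell", "retain")

with open("privacy_policy.txt", encoding="utf-8") as f:  # hypothetical file
    text = f.read()

# Naive sentence split: adequate for skimming, not for legal analysis.
for sentence in re.split(r"(?<=[.!?])\s+", text):
    if any(k in sentence.lower() for k in KEYWORDS):
        print("-", sentence.strip())
```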
6 Conclusion

For this paper, we conducted 17 interviews with product reviewers and analyzed 71 published reviews in order to understand the role of professional product reviewers in evaluating the security and privacy of Internet-connected devices and software. We characterize the criteria and threat models reviewers consider, as well as how they fit into their review process. We find that while reviewers have incentives to consider security and privacy, they face significant disincentives and challenges, including audience disinterest, lack of expertise, potentially unreliable assumptions, and a dearth of effective and usable tools for evaluation. We make recommendations for further research and tool development to support product reviewers, within the constraints of their practices and incentives, in shifting the burden of security and privacy evaluation away from end users and toward professionals.

Acknowledgments

We thank our participants for their time and insights, our reviewers for their constructive feedback, and all who gave us advice or helped with recruitment, including Nora McDonald, who helped contextualize our qualitative analysis. This research was supported by the SPLICE research program under NSF SaTC award #19555805.

References

[1] Omer Akgul, Richard Roberts, Moses Namara, Dave Levin, and Michelle L. Mazurek. Investigating influencer VPN ads on YouTube. In Proceedings of the 2022 IEEE Symposium on Security and Privacy, pages 876–892, May 2022. https://doi.org/10.1109/SP46214.2022.9833633.

[2] Omar Alrawi, Chaz Lever, Manos Antonakakis, and Fabian Monrose. SoK: Security evaluation of home-based IoT deployments. In 2019 IEEE Symposium on Security and Privacy, pages 1362–1380, May 2019. https://www.doi.org/10.1109/SP.2019.00013.

[3] Omar Alrawi, Chaoshun Zuo, Ruian Duan, Ranjita Pai Kasturi, Zhiqiang Lin, and Brendan Saltaformaggio. The betrayal at Cloud City: An empirical analysis of cloud-based mobile backends. In Proceedings of the 28th USENIX Security Symposium, pages 551–566, August 2019. https://www.usenix.org/conference/usenixsecurity19/presentation/alrawi.

[4] Edgar Alvarez. YouTube stars are blurring the lines between content and ads. Engadget, July 2017. https://www.engadget.com/2017-07-25-youtube-influencers-sponsored-videos.html.

[5] Mahmoud Ammar, Giovanni Russello, and Bruno Crispo. Internet of Things: A survey on the security of IoT frameworks. Journal of Information Security and Applications, 38:8–27, February 2018. https://www.sciencedirect.com/science/article/pii/S2214212617302934.

[6] Manos Antonakakis, Tim April, Michael Bailey, Matt Bernhard, Elie Bursztein, Jaime Cochran, Zakir Durumeric, J. Alex Halderman, Luca Invernizzi, Michalis Kallitsis, Deepak Kumar, Chaz Lever, Zane Ma, Joshua Mason, Damian Menscher, Chad Seaman, Nick Sullivan, Kurt Thomas, and Yi Zhou. Understanding the Mirai botnet. In Proceedings of the 26th USENIX Security Symposium, pages 1093–1110, August 2017. https://www.usenix.org/conference/usenixsecurity17/technical-sessions/presentation/antonakakis.

[7] Apple. Privacy: Labels. https://www.apple.com/privacy/labels/.

[8] Brooke Auxier, Lee Rainie, Monica Anderson, Andrew Perrin, Madhu Kumar, and Erica Turner. Americans and privacy: Concerned, confused and feeling lack of control over their personal information. Technical report, Pew Research Center, November 2019. https://www.pewresearch.org/internet/2019/11/15/americans-and-privacy-concerned-confused-and-feeling-lack-of-control-over-their-personal-information/.

[9] Vinayshekhar Bannihatti Kumar, Roger Iyengar, Namita Nisal, Yuanyuan Feng, Hana Habib, Peter Story, Sushain Cherivirala, Margaret Hagan, Lorrie Cranor, Shomir Wilson, Florian Schaub, and Norman Sadeh. Finding a choice in a haystack: Automatic extraction of opt-out statements from privacy policy text. In Proceedings of The Web Conference 2020, WWW '20, pages 1943–1954, April 2020. https://doi.org/10.1145/3366423.3380262.

[10] Tanya Basu. Anti-vaxxers are weaponizing Yelp to punish bars that require vaccine proof. MIT Technology Review, June 2021. https://www.technologyreview.com/2021/06/12/1026213/anti-vaxxers-negative-yelp-google-reviews-restaurants-bars/.

[11] Eleanor Birrell, Anders Gjerdrum, Robbert van Renesse, Håvard Johansen, Dag Johansen, and Fred B. Schneider. SGX enforcement of use-based privacy. In Proceedings of the 2018 Workshop on Privacy in the Electronic Society, WPES '18, pages 155–167, October 2018. https://doi.org/10.1145/3267323.3268954.

[12] Virginia Braun and Victoria Clarke. Using thematic analysis in psychology. Qualitative Research in Psychology, 3(2):77–101, 2006. https://doi.org/10.1191/1478088706qp063oa.

[13] Eliza Brooke. Why so many recommendation sites promise to help you find the best stuff. Vox, December 2018. https://www.vox.com/the-goods/2018/12/11/18131224/recommendations-best-strategist-wirecutter-buzzfeed-reviews.

[14] Robert Byers, Chris Turner, and Tanya Brewer. CVE-2020-6007. https://nvd.nist.gov/vuln/detail/CVE-2020-6007, August 2020.

[15] Robert Byers, Chris Turner, and Tanya Brewer. CVE-2021-27954. https://nvd.nist.gov/vuln/detail/CVE-2021-27954, August 2021.

[16] Robert Byers, Chris Turner, and Tanya Brewer. CVE-2021-35067. https://nvd.nist.gov/vuln/detail/CVE-2021-35067, October 2021.

[17] Robert Byers, Chris Turner, and Tanya Brewer. CVE-2022-27152. https://nvd.nist.gov/vuln/detail/CVE-2022-27152, April 2022.

[18] Rachel Cericola. How Wirecutter vets the security and privacy of smart home devices. https://www.nytimes.com/wirecutter/blog/smart-home-security-privacy/, September 2020.

[19] CTIA Certification. IoT cybersecurity certification. https://ctiacertification.org/program/iot-cybersecurity-certification/.
[20] Federal Trade Commission. The FTC's endorsement guides: What people are asking. https://www.ftc.gov/business-guidance/resources/ftcs-endorsement-guides-what-people-are-asking, August 2020.

[21] Cyber Security Agency of Singapore. Cybersecurity Labelling Scheme. https://www.csa.gov.sg/Programmes/certification-and-labelling-schemes/cybersecurity-labelling-scheme/about-cls.

[22] Sohaila Eltanbouly, May Bashendy, Noora AlNaimi, Zina Chkirbene, and Aiman Erbad. Machine learning techniques for network anomaly detection: A survey. In Proceedings of the 2020 IEEE International Conference on Informatics, IoT, and Enabling Technologies, pages 156–162, February 2020. https://doi.org/10.1109/ICIoT48696.2020.9089465.

[23] Pardis Emami-Naeini, Janarth Dheenadhayalan, Yuvraj Agarwal, and Lorrie Faith Cranor. Which privacy and security attributes most impact consumers' risk perception and willingness to purchase IoT devices? In Proceedings of the 2021 IEEE Symposium on Security and Privacy, pages 519–536, May 2021. https://ieeexplore.ieee.org/document/9519463/.

[24] Pardis Emami-Naeini, Janarth Dheenadhayalan, Yuvraj Agarwal, and Lorrie Faith Cranor. An informative security and privacy "nutrition" label for Internet of Things devices. IEEE Security & Privacy, 20(2):31–39, 2022. https://doi.org/10.1109/MSEC.2021.3132398.

[25] Pardis Emami-Naeini, Henry Dixon, Yuvraj Agarwal, and Lorrie Faith Cranor. Exploring how privacy and security factor into IoT device purchase behavior. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, pages 1–12, May 2019. https://dl.acm.org/doi/10.1145/3290605.3300764.

[26] Engadget. Engadget's guide to privacy. https://www.engadget.com/buyers-guide/personal-security/.

[27] Cori Faklaris, Laura A. Dabbish, and Jason I. Hong. A self-report measure of end-user security attitudes (SA-6). In Proceedings of the Fifteenth Symposium on Usable Privacy and Security, pages 61–77, August 2019. https://www.usenix.org/conference/soups2019/presentation/faklaris.

[28] Todd Feathers. Civil rights groups want tech sites to stop reviewing Amazon's Ring cameras. Vice, March 2021. https://www.vice.com/en/article/z3vpw3/civil-rights-groups-want-tech-sites-to-stop-reviewing-amazons-ring-cameras.

[29] Bo Feng, Alejandro Mera, and Long Lu. P2IM: Scalable and hardware-independent firmware testing via automatic peripheral interface modeling. In Proceedings of the 29th USENIX Security Symposium, pages 1237–1254, 2020. https://www.usenix.org/conference/usenixsecurity20/presentation/feng.

[30] Fight for the Future. Rescind Ring. https://rescindring.com, 2022.

[31] Raffaele Filieri, Charles F. Hofacker, and Salma Alguezaui. What makes information in online consumer reviews diagnostic over time? The role of review relevancy, factuality, currency, source credibility and ranking score. Computers in Human Behavior, 80:122–131, March 2018. https://doi.org/10.1016/j.chb.2017.10.039.

[32] Finnish Transport and Communications Agency. Finland becomes the first European country to certify safe smart devices: New Cybersecurity label helps consumers buy safer products. https://www.traficom.fi/en/news/finland-becomes-first-european-country-certify-safe-smart-devices-new-cybersecurity-label, November 2019.

[33] Sara Fischer. Exclusive: WSJ debuts new commerce site "Buy Side". Axios, June 2022. https://www.axios.com/2022/06/11/wsj-new-commerce-site-buy-side.

[34] Mozilla Foundation. *Privacy Not Included: A buyer's guide for connected products. https://foundation.mozilla.org/en/privacynotincluded/.
[35] Suzanne Frey. Get more information about your apps in Google Play. https://blog.google/products/google-play/data-safety/, April 2022.

[36] Chenglong Fu, Qiang Zeng, and Xiaojiang Du. HAWatcher: Semantics-aware anomaly detection for appified smart homes. In Proceedings of the 30th USENIX Security Symposium, pages 4223–4240, 2021. https://www.usenix.org/conference/usenixsecurity21/presentation/fu-chenglong.

[37] Kelsey R. Fulton, Rebecca Gelles, Alexandra McKay, Yasmin Abdi, Richard Roberts, and Michelle L. Mazurek. The effect of entertainment media on mental models of computer security. In Proceedings of the Fifteenth Symposium on Usable Privacy and Security, pages 79–95, August 2019. https://www.usenix.org/conference/soups2019/presentation/fulton.

[38] Thomas Germain. Mental health apps aren't all as private as you may think. https://www.consumerreports.org/health-privacy/mental-health-apps-and-user-privacy-a7415198244/, March 2021.

[39] GOV.UK. New smart devices cyber security laws one step closer. https://www.gov.uk/government/news/new-smart-devices-cyber-security-laws-one-step-closer, January 2022.

[40] Greg Guest, Arwen Bunce, and Laura Johnson. How many interviews are enough?: An experiment with data saturation and variability. Field Methods, 18(1):59–82, February 2006. https://doi.org/10.1177/1525822X05279903.

[41] Wentao Guo, Jay Rodolitz, and Eleanor Birrell. Poli-see: An interactive tool for visualizing privacy policies. In Proceedings of the 19th Workshop on Privacy in the Electronic Society, pages 57–71, November 2020. http://doi.org/10.1145/3411497.3420221.

[42] Taylor Hatmaker. Amazon sues Facebook group admins over fake reviews. TechCrunch, July 2022. https://social.techcrunch.com/2022/07/18/amazon-lawsuit-fake-reviews-facebook/.
[43] Nick Ho-Sam-Sooi, Wolter Pieters, and Maarten Kroesen. Investigating the effect of security and privacy on IoT device purchase behaviour. Computers & Security, 102:1–12, March 2021. https://doi.org/10.1016/j.cose.2020.102132.

[44] ioXt. ioXt Certification for IoT products. https://www.ioxtalliance.org/get-ioxt-certified.

[45] Yan Jia, Bin Yuan, Luyi Xing, Dongfang Zhao, Yifan Zhang, XiaoFeng Wang, Yijing Liu, Kaimin Zheng, Peyton Crnjak, Yuqing Zhang, Deqing Zou, and Hai Jin. Who's in control? On security risks of disjointed IoT device management channels. In Proceedings of the 2021 ACM SIGSAC Conference on Computer and Communications Security, CCS '21, pages 1289–1305, November 2021. http://doi.org/10.1145/3460120.3484592.

[46] Shane D. Johnson, John M. Blythe, Matthew Manning, and Gabriel T. W. Wong. The impact of IoT security labelling on consumer product choice and willingness to pay. PLoS ONE, 15(1):1–21, January 2020. https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0227800.

[47] Ryan Kailath. Some Amazon reviews are too good to be believed. They're paid for. NPR, July 2018. https://www.npr.org/2018/07/30/629800775/some-amazon-reviews-are-too-good-to-be-believed-theyre-paid-for.

[48] Georgios Kambourakis, Constantinos Kolias, Dimitrios Geneiatakis, Georgios Karopoulos, Georgios Michail Makrakis, and Ioannis Kounelis. A state-of-the-art review on the security of mainstream IoT wireless PAN protocol stacks. Symmetry, 12(4):1–29, April 2020. https://doi.org/10.3390/sym12040579.

[49] Hean Tat Keh and Jin Sun. The differential effects of online peer review and expert review on service evaluations: The roles of confidence and information convergence. Journal of Service Research, 21(4):474–489, 2018. https://doi.org/10.1177/1094670518779456.

[50] Patrick Gage Kelley, Joanna Bresee, Lorrie Faith Cranor, and Robert W. Reeder. A "nutrition label" for privacy. In Proceedings of the 5th Symposium on Usable Privacy and Security, pages 1–12, July 2009. https://doi.org/10.1145/1572532.1572538.

[51] Patrick Gage Kelley, Lorrie Faith Cranor, and Norman Sadeh. Privacy as part of the app decision-making process. In Proceedings of the 31st Annual CHI Conference on Human Factors in Computing Systems, pages 3393–3402, April 2013. https://doi.org/10.1145/2470654.2466466.

[52] Mingeun Kim, Dongkwan Kim, Eunsoo Kim, Suryeon Kim, Yeongjin Jang, and Yongdae Kim. FirmAE: Towards large-scale emulation of IoT firmware for dynamic analysis. In Proceedings of the Annual Computer Security Applications Conference, ACSAC '20, pages 733–745, December 2020. https://doi.org/10.1145/3427228.3427294.

[53] Konrad Kollnig, Anastasia Shuba, Max Van Kleek, Reuben Binns, and Nigel Shadbolt. Goodbye tracking? Impact of iOS app tracking transparency and privacy labels. In 2022 ACM Conference on Fairness, Accountability, and Transparency, FAccT '22, pages 508–520, June 2022. http://doi.org/10.1145/3531146.3533116.

[54] Radhesh Krishnan Konoth, Emanuele Vineti, Veelasha Moonsamy, Martina Lindorfer, Christopher Kruegel, Herbert Bos, and Giovanni Vigna. Minesweeper: An in-depth look into drive-by cryptocurrency mining and its defense. In Proceedings of the 2018 ACM SIGSAC Conference on Computer and Communications Security, CCS '18, pages 1714–1730, October 2018. https://doi.org/10.1145/3243734.3243858.

[55] Klaus Krippendorff. Reliability in content analysis: Some common misconceptions and recommendations. Human Communication Research, 30(3):411–433, 2004. https://doi.org/10.1111/j.1468-2958.2004.tb00738.x.

[56] Melina Kulenovic and Dzenana Donko. A survey of static code analysis methods for security vulnerabilities detection. In Proceedings of the 2014 37th International Convention on Information and Communication Technology, Electronics and Microelectronics, pages 1381–1386, May 2014. https://doi.org/10.1109/MIPRO.2014.6859783.
[57] Victor Le Pochat, Tom Van Goethem, Samaneh Tajalizadehkhoob, Maciej Korczynski, and Wouter Joosen. Tranco: A research-oriented top sites ranking hardened against manipulation. In Proceedings of the 2019 Network and Distributed System Security Symposium, February 2019. https://www.doi.org/10.14722/ndss.2019.23386.

[58] Young-Jin Lee and Yong Tan. Effects of different types of free trials and ratings in sampling of consumer software: An empirical study. Journal of Management Information Systems, 30(3):213–246, 2013. https://doi.org/10.2753/MIS0742-1222300308.

[59] Li Li, Tegawendé F. Bissyandé, Mike Papadakis, Siegfried Rasthofer, Alexandre Bartel, Damien Octeau, Jacques Klein, and Yves Le Traon. Static analysis of Android apps: A systematic literature review. Information and Software Technology, 88:67–95, August 2017. https://doi.org/10.1016/j.infsof.2017.04.001.

[60] Mengxiang Li, Liqiang Huang, Chuan-Hoo Tan, and Kwok-Kee Wei. Helpfulness of online product reviews as seen by consumers: Source and content features. International Journal of Electronic Commerce, 17(4):101–136, 2013. https://doi.org/10.2753/JEC1086-4415170404.

[61] Tianshi Li, Kayla Reiman, Yuvraj Agarwal, Lorrie Faith Cranor, and Jason I. Hong. Understanding challenges for developers to create accurate privacy nutrition labels. In Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems, CHI '22, pages 1–24, April 2022. https://doi.org/10.1145/3491102.3502012.

[62] Yucheng Li, Deyuan Chen, Tianshi Li, Yuvraj Agarwal, Lorrie Faith Cranor, and Jason I. Hong. Understanding iOS privacy nutrition labels: An exploratory large-scale analysis of App Store data. In Extended Abstracts of the 2022 CHI Conference on Human Factors in Computing Systems, CHI EA '22, pages 1–7, April 2022. https://doi.org/10.1145/3491101.3519739.

[63] Franco Loi, Arunan Sivanathan, Hassan Habibi Gharakheili, Adam Radford, and Vijay Sivaraman. Systematically evaluating security and privacy for consumer IoT devices. In Proceedings of the 2017 Workshop on Internet of Things Security and Privacy, pages 1–6, November 2017. https://doi.org/10.1145/3139937.3139938.

[64] Xueming Luo, Bin Gu, Jie Zhang, and Chee Wei Phang. Expert blogs and consumer perceptions of competing brands. MIS Quarterly, 41(2):371–396, June 2017. https://ssrn.com/abstract=2268209.

[65] Arunesh Mathur, Arvind Narayanan, and Marshini Chetty. Endorsements on social media: An empirical study of affiliate marketing disclosures on YouTube and Pinterest. Proceedings of the ACM on Human-Computer Interaction, 2(CSCW):1–26, November 2018. https://dl.acm.org/doi/10.1145/3274388.

[66] Aleecia M. McDonald and Lorrie Faith Cranor. The cost of reading privacy policies. I/S: A Journal of Law and Policy for the Information Society, 4(3):543–568, 2008. https://heinonline.org/HOL/P?h=hein.journals/isjlpsoc4&i=563.

[67] Nora McDonald, Sarita Schoenebeck, and Andrea Forte. Reliability and inter-rater reliability in qualitative research: Norms and guidelines for CSCW and HCI practice. Proceedings of the ACM on Human-Computer Interaction, 3(CSCW):72:1–72:23, November 2019. https://doi.org/10.1145/3359174.

[68] Microsoft 365 Defender Threat Intelligence Team. Phorpiex morphs: How a longstanding botnet persists and thrives in the current threat environment. https://www.microsoft.com/security/blog/2021/05/20/phorpiex-morphs-how-a-longstanding-botnet-persists-and-thrives-in-the-current-threat-environment/, May 2021.
[69] Phoebe Moh, Pubali Datta, Noel Warford, Adam Bates, Nathan Malkin, and Michelle L. Mazurek. Characterizing everyday misuse of smart home devices. In Proceedings of the 2023 IEEE Symposium on Security and Privacy, May 2023. https://doi.org/10.1109/SP46215.2023.00089.

[70] Christina Morales. Restaurants face an extortion threat: A bad rating on Google. The New York Times, July 2022. https://www.nytimes.com/2022/07/11/dining/google-one-star-review-scam-restaurants.html.

[71] Sara Morrison. Amazon's Ring privacy problem is back. Vox, July 2022. https://www.vox.com/recode/23207072/amazon-ring-privacy-police-footage.

[72] Nathaniel Mott. 'DoorLock' vulnerability can force iOS devices to endlessly reboot. PCMag, January 2022. https://www.pcmag.com/news/doorlock-vulnerability-can-force-ios-devices-to-endlessly-reboot.

[73] National Institute of Standards and Technology. Recommended criteria for cybersecurity labeling for consumer Internet of Things (IoT) products. NIST CSWP 24, National Institute of Standards and Technology, February 2022. https://csrc.nist.gov/publications/detail/white-paper/2022/02/04/criteria-for-cybersecurity-labeling-for-consumer-iot-products/final.

[74] Anna Naujoks and Martin Benkenstein. Who is behind the message? The power of expert reviews on eWOM platforms. Electronic Commerce Research and Applications, 44:1–10, 2020. https://doi.org/10.1016/j.elerap.2020.101015.

[75] Nataliia Neshenko, Elias Bou-Harb, Jorge Crichigno, Georges Kaddoum, and Nasir Ghani. Demystifying IoT security: An exhaustive survey on IoT vulnerabilities and a first empirical look on Internet-scale IoT exploitations. IEEE Communications Surveys & Tutorials, 21(3):2702–2733, 2019. https://doi.org/10.1109/COMST.2019.2910750.

[76] Erik E. Northrop and Heather R. Lipford. Exploring the usability of open source network forensic tools. In Proceedings of the 2014 ACM Workshop on Security Information Workers, SIW '14, pages 1–8, November 2014. https://doi.org/10.1145/2663887.2663903.
[77] Paulo Nunes, Ibéria Medeiros, José Fonseca, Nuno Neves, Miguel Correia, and Marco Vieira. An empirical study on combining diverse static analysis tools for web security vulnerabilities based on development scenarios. Computing, 101:161–185, 2019. https://doi.org/10.1007/s00607-018-0664-z.

[78] Laura Hazard Owen. Wirecutter, which makes money when you shop, is going behind The New York Times' paywall. https://www.niemanlab.org/2021/08/wirecutter-which-makes-money-when-you-shop-is-going-behind-the-new-york-times-paywall/, August 2021.

[79] Do-Hyung Park. Consumer adoption of consumer-created vs. expert-created information: Moderating role of prior product attitude. Sustainability, 13(4):2024, January 2021. https://doi.org/10.3390/su13042024.

[80] Daria Plotkina and Andreas Munzel. Delight the experts, but never dissatisfy your customers! A multi-category study on the effects of online review source on intention to buy a new product. Journal of Retailing and Consumer Services, 29:1–11, 2016. https://dx.doi.org/10.1016/j.jretconser.2015.11.002.

[81] Molly Price and David Priest. Best smart locks of 2021: August, Yale, Schlage and more. CNET, November 2021. https://www.cnet.com/home/security/best-smart-locks/.

[82] Mike Prospero. Best video doorbells in 2022: Top smart doorbell cameras rated. Tom's Guide, January 2022. https://www.tomsguide.com/us/best-video-doorbells,review-4468.html.

[83] Reethika Ramesh, Anjali Vyas, and Roya Ensafi. "All of them claim to be the best": Multi-perspective study of VPN users and VPN providers. In Proceedings of the 32nd USENIX Security Symposium, May 2023. https://www.usenix.org/conference/usenixsecurity23/presentation/ramesh.

[84] Abhilasha Ravichander, Alan W Black, Thomas Norton, Shomir Wilson, and Norman Sadeh. Breaking down walls of text: How can NLP benefit consumer privacy? In Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing, volume 1, pages 4125–4140, August 2021. https://doi.org/10.18653/v1/2021.acl-long.319.

[85] Joel Reardon, Álvaro Feal, Primal Wijesekera, Amit Elazari Bar On, Narseo Vallina-Rodriguez, and Serge Egelman. 50 ways to leak your data: An exploration of apps' circumvention of the Android permissions system. In Proceedings of the 28th USENIX Security Symposium, pages 603–620, August 2019. https://www.usenix.org/conference/usenixsecurity19/presentation/reardon.

[86] Consumer Reports. The Digital Standard. https://thedigitalstandard.org/standard/, 2020.

[87] Consumer Reports. Guide to digital security & privacy. https://www.consumerreports.org/digital-security/online-security-and-privacy-guide/, July 2022.

[88] Daniel Romero. Technical advisory: Multiple vulnerabilities in Nuki smart locks (CVE-2022-32509, CVE-2022-32504, CVE-2022-32502, CVE-2022-32507, CVE-2022-32503, CVE-2022-32510, CVE-2022-32506, CVE-2022-32508, CVE-2022-32505). https://research.nccgroup.com/2022/07/25/technical-advisory-multiple-vulnerabilities-in-nuki-smart-locks-cve-2022-32509-cve-2022-32504-cve-2022-32502-cve-2022-32507-cve-2022-32503-cve-2022-32510-cve-2022-32506-cve-2022-32508-cve-2/, July 2022.

[89] Dan Seifert. Nest Thermostat review: More simple than smart. The Verge, December 2020. https://www.theverge.com/21725036/google-nest-thermostat-2020-review.

[90] Similarweb. https://www.similarweb.com/.

[91] Ben Smith. You've never heard of the biggest digital media company in America. The New York Times, August 2021. https://www.nytimes.com/2021/08/15/business/media/red-ventures-digital-media.html.
[92] Dale Smith. Ring Video Doorbell 4 review: A competent gadget from a company with a shaky reputation. CNET, July 2021. https://www.cnet.com/home/security/ring-video-doorbell-4-review-a-competent-gadget-from-a-company-with-a-shaky-reputation/.

[93] Justin Smith, Lisa Nguyen Quang Do, and Emerson Murphy-Hill. Why can't Johnny fix vulnerabilities: A usability evaluation of static analysis tools for security. In Proceedings of the Sixteenth Symposium on Usable Privacy and Security, pages 221–238, August 2020. https://www.usenix.org/conference/soups2020/presentation/smith.

[94] Justin Smith, Brittany Johnson, Emerson Murphy-Hill, Bill Chu, and Heather Richter Lipford. How developers diagnose potential security vulnerabilities with a static analysis tool. IEEE Transactions on Software Engineering, 45(9):877–897, September 2019. https://doi.org/10.1109/TSE.2018.2810116.

[95] UL Solutions. UL verified IoT device security rating. https://www.ul.com/services/ul-verified-iot-device-security-rating.

[96] Parth Kirankumar Thakkar, Shijing He, Shiyu Xu, Danny Yuxing Huang, and Yaxing Yao. "It would probably turn into a social faux-pas": Users' and bystanders' preferences of privacy awareness mechanisms in smart homes. In Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems, CHI '22, pages 1–13, April 2022. https://doi.org/10.1145/3491102.3502137.

[97] Khaos Tian. Your home was not so secure after all. https://medium.com/hackernoon/your-home-was-not-so-secure-after-all-af52fbd6777c, December 2017.

[98] Jeffrey A. Trachtenberg. Gannett invests to boost product-review site, hoping to rival New York Times's Wirecutter. The Wall Street Journal, July 2021. http://www.wsj.com/articles/gannett-invests-to-boost-product-review-site-hoping-to-rival-new-york-timess-wirecutter-11625407202.

[99] Janice Y. Tsai, Serge Egelman, Lorrie Cranor, and Alessandro Acquisti. The effect of online privacy information on purchasing behavior: An experimental study. Information Systems Research, 22(2):254–268, June 2011. https://pubsonline.informs.org/doi/abs/10.1287/isre.1090.0260.

[100] Joseph Turow, Yphtach Lelkes, Nora A. Draper, and Ari Ezra Waldman. Americans can't consent to companies' use of their data. Technical report, Annenberg School for Communication, University of Pennsylvania, February 2023. https://www.asc.upenn.edu/sites/default/files/2023-02/Americans_Can%27t_Consent.pdf.

[101] Juraj Uhlár, Martin Holkovič, and Vít Rusňák. PCAPFunnel: A tool for rapid exploration of packet capture files. In Proceedings of the 2021 25th International Conference Information Visualisation, pages 69–76, July 2021. https://doi.org/10.1109/IV53921.2021.00021.

[102] Alex Ulmer, David Sessler, and Jörn Kohlhammer. NetCapVis: Web-based progressive visual analytics for network packet captures. In Proceedings of the 2019 IEEE Symposium on Visualization for Cyber Security, pages 1–10, October 2019. https://doi.org/10.1109/VizSec48167.2019.9161633.

[103] Fábio Luciano Verdi, Hélio Tibagí de Oliveira, Leobino N. Sampaio, and Luciana A. M. Zaina. Usability matters: A human–computer interaction study on network management tools. IEEE Transactions on Network and Service Management, 17(3):1865–1878, April 2020. https://doi.org/10.1109/TNSM.2020.2987036.

[104] Emily A. Vogels and Monica Anderson. Americans and digital knowledge. Technical report, Pew Research Center, October 2019. https://www.pewresearch.org/internet/2019/10/09/americans-and-digital-knowledge/.
[105] Bettina von Helversen, Katarzyna Abramczuk, Wiesław Kopeć, and Radoslaw Nielek. Influence of consumer reviews on online purchasing decisions in older and younger adults. Decision Support Systems, 113:1–10, September 2018. https://www.sciencedirect.com/science/article/pii/S0167923618300861.

[106] Yuxi Wu, Panya Gupta, Miranda Wei, Yasemin Acar, Sascha Fahl, and Blase Ur. Your secrets are safe: How browsers' explanations impact misconceptions about private browsing mode. In Proceedings of the 2018 World Wide Web Conference, WWW '18, pages 217–226, April 2018. http://doi.org/10.1145/3178876.3186088.

[107] Eric Zeng and Franziska Roesner. Understanding and improving security and privacy in multi-user smart homes: A design exploration and in-home user study. In Proceedings of the 28th USENIX Security Symposium, pages 159–176, August 2019. https://www.usenix.org/conference/usenixsecurity19/presentation/zeng.

[108] Shikun Zhang, Yuanyuan Feng, Yaxing Yao, Lorrie Faith Cranor, and Norman Sadeh. How usable are iOS app privacy labels? Proceedings on Privacy Enhancing Technologies, 2022(4):204–228, 2022. https://doi.org/10.56553/popets-2022-0106.

[109] Wenqi Zhou and Wenjing Duan. Do professional reviews affect online user choices through user reviews? An empirical study. Journal of Management Information Systems, 33(1):202–228, 2016. https://doi.org/10.1080/07421222.2016.1172460.

[110] Marc Ziegele and Mathias Weber. Example, please! Comparing the effects of single customer reviews and aggregate review scores on online shoppers' product evaluations. Journal of Consumer Behaviour, 14(2):103–114, 2015. http://onlinelibrary.wiley.com/doi/abs/10.1002/cb.1503.
A Tables

Table A1: Sources of the reviews we analyzed.

Text reviews (41): CNET (7), PCMag (6), Tom's Guide (4), BestReviews (3), Wirecutter (3), Consumer Reports (2), Reviews.org (2), SafeHome.org (2), TechRadar (2), The Verge (2), Android Authority (1), Digital Trends (1), Reviewed (1), Reviews.com (1), Safewise (1), The Guardian (1), The Smart Cave (1), Trusted Reviews (1).

Video reviews (30): CNET (3), Home Tech Decisions (3), Smart Home Solver (3), Life Hackster (2), The 5 Best (2), Top 5 Picks (2), Apple Insider (1), Automate Your Life Shorts (1), Detroit Tech (1), Everyday Chris (1), James (1), Locksmith Recommended (1), Modern Dad (1), One Hour Smart Home (1), Rizknows (1), Security.org (1), Serg Tech (1), Steve Does (1), Tech With Brett (1), Techs You Can't Live Without (1), Terry White (1).

Table A2: Demographics of our interview participants, reported in aggregate.

Age: 18–24: 1; 25–34: 4; 35–44: 7; 45–54: 2; 55–64: 2; 65+: 2
Gender: Female: 3; Male: 15
Race: Asian: 1; White: 16; Other: 1
Education (complete or pursuing): High school: 1; Bachelor's degree: 12; Graduate or professional degree: 4; Prefer not to state: 1
Country: United States: 16; Other: 2

B Interview protocol

Big picture

I'll start with some background questions. If we can try to do this rapid-fire, that'll let us get to the main part of our interview faster.

1. What is your current role, professionally, when it comes to reviewing products?
2. Would you please describe the organization or publication for which you review products? For example, who is the audience?
3. What is the business model with respect to reviews? For example, affiliate marketing, sponsorships, ads, and subscriptions.
4. Do you publish written reviews, video reviews, something else, or a mix of formats?
5. How long have you been doing similar work reviewing products?
6. Would you please describe briefly what kinds of products you currently review?
7. Do you have a process for determining which products to review next?
8. In what setting do you review products? For example, in a lab, in your own home, or somewhere else?
9. How long do you typically test or use a product before publishing a review?
10. Do you ever revise or update a review after it is published?

Criteria

Now I'd like to talk about criteria that you may use for evaluating a product: in other words, information about a product that you use to judge its quality.

1. When reviewing a product, what kinds of criteria do you write about or take into account?
2. When reviewing a product, do you consider criteria related to security and privacy, and how do you prioritize them compared to other criteria such as a product's features and cost?
3. What are some of the most important security- and privacy-related criteria that you consider when reviewing a product?
4. Are there other security- and privacy-related criteria that you consider important but don't evaluate?

Techniques and tools

Now, let's pivot to the techniques and tools you use to evaluate products: in other words, what you do when testing a product, and how you do it. We're thinking about this broadly; it could be anything from reading a document to using a software tool to analyze a device.

1. How do you learn about techniques and tools for evaluating products in general?
2. What techniques and tools, if any, do you use to evaluate security and privacy?
3. For a minute, let's pretend you could have any tools you wanted to help you review the security and privacy of Internet-connected devices, apps, or other software. Don't worry about how they might technically work; think of a tool as a black box where you know how to set it up and what information about a product it will tell you. Can you think of any tools that you would use? What would they tell you about a product?
How would you use them?
4. Now, let's pretend you could institute any regulations or industry-wide practices you wanted to help you review the security and privacy of Internet-connected devices, apps, or other software. Can you think of any that you would make use of? How would they change the way you review products?
5. Can you remember any other times that you ran into difficulties when trying to use a technique or tool to evaluate security and privacy?
6. Are there other techniques or tools that you're aware of for evaluating security and privacy that you would like to use but don't?

Communication, impact, and incentives

Next, I'd like to talk about your process for writing reviews and communicating your findings to consumers.

1. How frequently do you discuss security and privacy in your reviews, if at all?
   • [if not at all]:
     ◦ Do you ever consider discussing security and privacy?
     ◦ Why do you not include this?
   • [else]:
     ◦ How do you decide whether or not to discuss security and privacy in a review?
     ◦ Do you ever make security- or privacy-related recommendations about how to use or configure a product?
2. How often do you publish a negative review or recommend against a product, for any reason?
3. Could you describe how you decide between publishing a negative review and not publishing a review at all?
4. Do security and privacy findings affect whether you recommend a product or not? Has security or privacy ever been the deciding factor in your recommendation?
5. Are there other ways in which you aim to have an impact on consumers, aside from people directly reading your reviews?

Conclusion

1. Finally, is there anything else you'd like us to know about your work reviewing Internet-connected devices, apps, and other software, and about reviewing security and privacy in particular?

C Post-interview survey questions

Self-identified level of expertise

1. Please rate your agreement or disagreement with the following statement.
   I am extremely knowledgeable about all the steps needed to keep my online data and accounts safe.
   • Strongly agree | Agree | Neutral | Disagree | Strongly disagree

Security and privacy knowledge questions

1. If a website uses cookies, it means that the site...
   • Can see the content of all the files on the device you are using
   • Is not a risk to infect your device with a computer virus
   • Will automatically prompt you to update your web browser software if it is out of date
   • Can track your visits and activity on the site
   • Not sure
2. Which of the following is the largest source of revenue for most major social media platforms?
   • Exclusive licensing deals with internet service providers and cellphone manufacturers
   • Allowing companies to purchase advertisements on their platforms
   • Hosting conferences for social media influencers
   • Providing consulting services to corporate clients
   • Not sure
3. When a website has a privacy policy, it means that the site...
   • Has created a contract between itself and its users about how it will use their data
   • Will not share its users' personal information with third parties
   • Adheres to federal guidelines about deceptive advertising practices
   • Does not retain any personally identifying information about its users
   • Not sure
4. What does it mean when a website has "https://" at the beginning of its URL, as opposed to "http://" without the "s"?
   • Information entered into the site is encrypted
   • The content on the site is safe for children
   • The site is only accessible to people in certain countries
   • The site has been verified as trustworthy
   • Not sure
5. Where might someone encounter a phishing scam?
   • In an email
   • On social media
   • In a text message
   • On a website
   • All of the above
   • None of the above
   • Not sure
6. Many web browsers offer a feature known as "private browsing" or "incognito mode." If someone opens a webpage on their computer at work using incognito mode, which of the following groups will NOT be able to see their online activities?
   • The group that runs their company's internal computer network
   • Their company's internet service provider
   • A coworker who uses the same computer
   • The websites they visit while in private browsing mode
   • Not sure

Organization details

1. How many people work on reviewing products at your primary organization, to the best of your knowledge?
   • 1 | 2–3 | 4–9 | 10–19 | 20–29 | 30+ | I don't know
2. Please provide any details we should know. ________
3. How many views does a typical product review of yours receive, to the best of your knowledge?
   • 0–99 | 100–999 | 1,000–9,999 | 10,000–99,999 | 100,000–999,999 | 1,000,000+ | I don't know
4. Please provide any details we should know. ________

Participant demographics

1. What is your age?
   • 18–24 | 25–34 | 35–44 | 45–54 | 55–64 | 65+ | Prefer not to state
2. What is your gender?
   • Male | Female | Other ________ | Prefer not to state
3. What is your race?
   • White | Hispanic or Latino | Black or African American | American Indian or Alaska Native | Asian | Native Hawaiian or Pacific Islander | Other ________ | Prefer not to state
4. What is the highest level of formal education that you have completed or are currently pursuing?
   • No high school degree | High school graduate, diploma or equivalent (for example, GED) | Trade, technical, or vocational training | Associate's degree | Bachelor's degree | Graduate or professional degree | Other ________ | Prefer not to state
5. What country (or countries) do you work in?
   □ United States
   □ Other ________

Supplementary materials containing Appendices D–F are located at https://osf.io/m2pe7/?view_only=e6a8443956704fe2b380cfce1def1204.