Organizations have long used records of individuals' pasts to make risk assessments, and increasingly they do so with algorithms: sets of rules that, in their most sophisticated form, exist as computer code. At first glance, this may seem to eliminate the possibility that people with problematic pasts will be cut a break. Yet algorithms do not erase discretion; they only relocate it. In this talk, I explore the differences in exception-making that follow. I draw on the case of tenant screening, a realm in which nearly all gatekeepers assess risk by consulting credit reports, criminal records, and eviction histories, but some do so with rules-based algorithms while others employ more traditional methods of judgment. Interviews with landlords and property managers, as well as executives at real estate and tenant screening companies, reveal both surprising similarity and meaningful difference in exception-making between judgmental and algorithmic contexts. I discuss the implications for theories of algorithms and discretion, as well as for scholars interested in the increasing influence of personal records on life chances. I contextualize these findings within my broader work on the cultural understandings that shape and legitimate the corporate use of personal data in allocating socially important resources.