It's well documented that unconscious bias can easily slip into hiring (and admissions, and tenure) decisions. At SciBase we're attempting to address this problem with a blinded, crowdsourced approach to evaluating candidates.
When an applicant applies for a job through SciBase, we ask them (if they haven't already done so) to complete their profile, submit a couple of the papers they've worked on, and write a few reviews.
Our community then goes to work. First, our system identifies people we believe could review that applicant's papers and asks them to do so. Second, a team of community editors - selected for their demonstrated expertise in the area - reads and rates these reviews. Importantly, this process is blinded: the editors do not see the applicant's name or any other information that might bias them - no gender, race, school, or even the journal where the original paper was published. The data is then aggregated in a way that we believe surfaces the best scientists.
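To make the blinding step concrete, here is a minimal Python sketch of the idea: identifying fields are stripped before a review ever reaches an editor, and independent editor ratings are combined into a single score. The `Review` fields, the `blind` and `aggregate` helpers, and the plain mean are all hypothetical - they illustrate the shape of the pipeline, not our actual implementation.

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class Review:
    """A review of one of the applicant's papers, with identifiers attached."""
    text: str
    reviewer_name: str
    applicant_name: str
    applicant_school: str
    journal: str

def blind(review: Review) -> str:
    # Editors receive only the review text: no names, school,
    # or journal where the original paper was published.
    return review.text

def aggregate(editor_ratings: list[float]) -> float:
    # A plain mean is shown for simplicity; a real aggregation
    # might weight editors by expertise or inter-rater agreement.
    return mean(editor_ratings)
```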
Through this process, the applicant is evaluated solely on the quality of their science and their ability to communicate that science clearly. We use that filter to identify the most promising candidates, and then, and only then, do we start considering other factors like the resume (from which the name is removed!).
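As a sketch of that two-stage filter (again hypothetical - the `shortlist` name, the score dictionary, and the cutoff are illustrative), stage one ranks applicants purely by their blinded science score, and only the survivors reach stage two, where the anonymized resume comes into play:

```python
def shortlist(scores: dict[str, float], top_n: int) -> list[str]:
    # Stage 1: rank applicant IDs purely by blinded science score.
    # Only the IDs returned here move on to stage 2, where other
    # factors (like the name-stripped resume) are considered.
    ranked = sorted(scores, key=scores.get, reverse=True)
    return ranked[:top_n]

# Example: shortlist({"a17": 4.6, "b02": 3.9, "c44": 4.8}, top_n=2) -> ["c44", "a17"]
```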
Through this process (and more - we're just getting started), we hope to help de-bias hiring decisions and create a more diverse, inclusive scientific workforce.