Posted: Sun Jan 12, 2025 5:49 am
Page Authority 2.0: An Update on Testing and Timing

One of the most difficult decisions to make in any field is to consciously choose to miss a deadline. Over the last several months, a team of some of the brightest engineers, data scientists, project managers, editors, and marketers has worked toward a release date for the new Page Authority (PA) of September 30, 2020.
The new model is superior to the current PA in nearly every way, but our last quality control measure revealed an anomaly that we could not ignore. As a result, we've made the tough decision to delay the launch of Page Authority 2.0. So, let me take a moment to retrace our steps: how we got here, where that leaves us, and how we intend to proceed.

Seeing an old problem with fresh eyes

Historically, Moz has used the same method over and over again to build a Page Authority model (as well as Domain Authority).
This model's advantage was its simplicity, but it left much to be desired. Previous Page Authority models trained against SERPs, trying to predict whether one URL would rank over another, based on a set of link metrics calculated from the Link Explorer backlink index. A key issue with this type of model was that it couldn't meaningfully address the maximum strength of a particular set of link metrics. For example, imagine the most powerful URLs on the Internet in terms of links: the homepages of Google, YouTube, or Facebook, or the share URLs of followed social network buttons.
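To make the pairwise setup concrete, here is a minimal sketch of the kind of model described above: a classifier that, given the link metrics of two URLs, predicts which one would rank higher in a SERP. The metric names, the synthetic data, and the plain gradient-descent training loop are illustrative assumptions for this sketch only, not Moz's actual features or model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy link metrics per URL (illustrative, not Moz's real features):
# [linking_root_domains, total_links, link_quality_score]
n_pairs = 1000
a = rng.exponential(scale=[100.0, 1000.0, 5.0], size=(n_pairs, 3))
b = rng.exponential(scale=[100.0, 1000.0, 5.0], size=(n_pairs, 3))

# Synthetic ground truth: the URL with the larger (noisy) log-metric
# sum "ranks higher" -- a stand-in for observed SERP orderings.
y = (np.log1p(a).sum(axis=1) + rng.normal(0, 0.5, n_pairs)
     > np.log1p(b).sum(axis=1) + rng.normal(0, 0.5, n_pairs)).astype(float)

# Pairwise feature vector: difference of log-scaled metrics, so the
# model learns which metric gaps predict one URL outranking another.
X = np.log1p(a) - np.log1p(b)

# Logistic regression fit by plain gradient descent (no ML library).
w = np.zeros(3)
for _ in range(2000):
    z = np.clip(X @ w, -30.0, 30.0)      # clip to avoid exp overflow
    p = 1.0 / (1.0 + np.exp(-z))         # P(URL A outranks URL B)
    w -= 0.1 * X.T @ (p - y) / n_pairs   # gradient step on log-loss

accuracy = ((X @ w > 0) == (y == 1)).mean()
print(f"training accuracy: {accuracy:.2f}")
```

Note what this framing cannot do, which is the article's point: the model only scores *differences* between two URLs, so it has no principled notion of an absolute ceiling for a single URL's link strength.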