{"id":7630,"date":"2025-11-06T13:12:14","date_gmt":"2025-11-06T13:12:14","guid":{"rendered":"https:\/\/vettio.com\/blog\/?p=7630"},"modified":"2025-11-07T07:10:59","modified_gmt":"2025-11-07T07:10:59","slug":"bias-in-hiring-algorithms","status":"publish","type":"post","link":"https:\/\/vettio.com\/blog\/bias-in-hiring-algorithms\/","title":{"rendered":"How Data Transparency Can Fix Hidden Bias in Hiring Algorithms"},"content":{"rendered":"\n<figure class=\"wp-block-image size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"1000\" height=\"500\" src=\"https:\/\/snabup-prod.s3.amazonaws.com\/blog\/wp-content\/uploads\/2025\/11\/07064937\/How-Data-Transparency-Can-Fix-Hidden-Bias-in-Hiring-Algorithms.jpg\" alt=\"illustration of data transparency\" class=\"wp-image-7681\" srcset=\"https:\/\/snabup-prod.s3.amazonaws.com\/blog\/wp-content\/uploads\/2025\/11\/07064937\/How-Data-Transparency-Can-Fix-Hidden-Bias-in-Hiring-Algorithms.jpg 1000w, https:\/\/snabup-prod.s3.amazonaws.com\/blog\/wp-content\/uploads\/2025\/11\/07064937\/How-Data-Transparency-Can-Fix-Hidden-Bias-in-Hiring-Algorithms-300x150.jpg 300w, https:\/\/snabup-prod.s3.amazonaws.com\/blog\/wp-content\/uploads\/2025\/11\/07064937\/How-Data-Transparency-Can-Fix-Hidden-Bias-in-Hiring-Algorithms-768x384.jpg 768w\" sizes=\"auto, (max-width: 1000px) 100vw, 1000px\" \/><\/figure>\n\n\n\n<div class=\"wp-block-group is-nowrap is-layout-flex wp-container-core-group-is-layout-1 wp-block-group-is-layout-flex\">\n<p class=\"has-large-font-size\"><strong>TL;DR<\/strong><\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Bias in hiring algorithms can block good candidates.<\/li>\n\n\n\n<li>Sharing algorithm data builds trust and reveals unfair patterns.<\/li>\n\n\n\n<li>AI Talent Software helps reduce unconscious bias in hiring.<\/li>\n\n\n\n<li>Talent assessment tools can reveal gender bias in job descriptions.<\/li>\n\n\n\n<li>Learn what bias is, why transparency matters and how to fix 
it.<\/li>\n<\/ul>\n<\/div>\n\n\n\n<p>HR teams once received job applications directly and made all hiring decisions themselves. But today, many organisations rely on systems and algorithms, which introduces a new challenge. Even when people intend fairness, hidden bias in hiring algorithms can sneak into decisions. This can exclude good candidates before a human even reads their file. The result is frustration for applicants and missed opportunities for companies.<\/p>\n\n\n\n<p>Organisations can solve this issue with data transparency. It helps them show how decisions are made, fix hidden unfairness and use inclusive strategies to build fairer hiring results. This blog explains what hidden bias looks like, why transparency matters, and how reducing unconscious bias in hiring works.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>What Is Hidden Bias in Hiring Algorithms?<\/strong><\/h2>\n\n\n\n<figure class=\"wp-block-image size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"1000\" height=\"500\" src=\"https:\/\/snabup-prod.s3.amazonaws.com\/blog\/wp-content\/uploads\/2025\/11\/05124711\/HBHA1.jpg\" alt=\"Hidden Bias in Hiring Algorithms\" class=\"wp-image-7635\" srcset=\"https:\/\/snabup-prod.s3.amazonaws.com\/blog\/wp-content\/uploads\/2025\/11\/05124711\/HBHA1.jpg 1000w, https:\/\/snabup-prod.s3.amazonaws.com\/blog\/wp-content\/uploads\/2025\/11\/05124711\/HBHA1-300x150.jpg 300w, https:\/\/snabup-prod.s3.amazonaws.com\/blog\/wp-content\/uploads\/2025\/11\/05124711\/HBHA1-768x384.jpg 768w\" sizes=\"auto, (max-width: 1000px) 100vw, 1000px\" \/><\/figure>\n\n\n\n<p>When companies use algorithms in their recruitment processes, they believe they are speeding up work and making fairer choices. 
Yet sometimes hidden traps lie beneath. For example, suppose a system is trained on years of past hiring data, but those past hires favoured one demographic group. In that case, the algorithm may learn to favour the same group rather than recognising the best candidate.<\/p>\n\n\n\n<p>A 2024 University of Washington study found that some AI r\u00e9sum\u00e9-screening tools preferred white-sounding names about <a href=\"https:\/\/www.washington.edu\/news\/2024\/10\/31\/ai-bias-resume-screening-race-gender\/\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">85% of the time<\/a> and female-sounding names only 11% of the time, exposing clear bias in automated hiring.<\/p>\n\n\n\n<p>Hidden bias may also show up in what the algorithm pays attention to: perhaps it favours graduates of elite colleges, penalises candidates who took non-traditional career paths or overlooks applicants from underrepresented backgrounds. That means the algorithm is silently reinforcing old patterns of exclusion even when no one intended it. 
And when the screening is fully automated (for example, in blind resume screening or digital scoring), it becomes difficult to recognise and correct the bias.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>Why Transparency Matters<\/strong><\/h2>\n\n\n\n<figure class=\"wp-block-image size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"1000\" height=\"500\" src=\"https:\/\/snabup-prod.s3.amazonaws.com\/blog\/wp-content\/uploads\/2025\/11\/05124915\/HBHA2.jpg\" alt=\"fairness and transparency\" class=\"wp-image-7636\" srcset=\"https:\/\/snabup-prod.s3.amazonaws.com\/blog\/wp-content\/uploads\/2025\/11\/05124915\/HBHA2.jpg 1000w, https:\/\/snabup-prod.s3.amazonaws.com\/blog\/wp-content\/uploads\/2025\/11\/05124915\/HBHA2-300x150.jpg 300w, https:\/\/snabup-prod.s3.amazonaws.com\/blog\/wp-content\/uploads\/2025\/11\/05124915\/HBHA2-768x384.jpg 768w\" sizes=\"auto, (max-width: 1000px) 100vw, 1000px\" \/><\/figure>\n\n\n\n<p>Transparency is essential if we want to make fair hiring work. When algorithms make decisions but the process is hidden, candidates, recruiters and regulators remain in the dark. And without visibility, it is hard to spot flawed assumptions or unfair outcomes.<\/p>\n\n\n\n<p>When a company shares how it uses data and algorithms, it creates visibility into its operations. Showing how many candidates from each group were screened, shortlisted and hired helps reveal patterns. This makes it easier to spot <a href=\"https:\/\/vettio.com\/blog\/tips-to-avoid-gender-bias-in-job-descriptions\/\" target=\"_blank\" rel=\"noreferrer noopener\">gender bias in job descriptions<\/a> or the exclusion of people from non-traditional backgrounds. 
Good transparency builds accountability and helps move away from excuse-driven processes (\u201cthe algorithm just did it\u201d) to meaningful reviews.<\/p>\n\n\n\n<p>Moreover, transparency enables practical measures, such as diversity sourcing tips, diversity hiring strategies and efforts to boost gender diversity in talent assessments, to do their job. If we don\u2019t know what our algorithm is doing, we can\u2019t improve it.<\/p>\n\n\n\n<p>Research also shows that even if an algorithm is well-designed, simply relying on it without showing how it works rarely improves diversity. A <a href=\"https:\/\/techxplore.com\/news\/2025-05-ai-wont-bias-workplace.html\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">May 2025 study from the University of South Australia<\/a> found that diversity only improved when the tool could explain its decisions and was backed by organisational commitment.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>How Data Transparency Reduces Bias<\/strong><\/h2>\n\n\n\n<figure class=\"wp-block-image size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"1000\" height=\"500\" src=\"https:\/\/snabup-prod.s3.amazonaws.com\/blog\/wp-content\/uploads\/2025\/11\/05124936\/HBHA3.jpg\" alt=\"transparent data reduces bias\" class=\"wp-image-7637\" srcset=\"https:\/\/snabup-prod.s3.amazonaws.com\/blog\/wp-content\/uploads\/2025\/11\/05124936\/HBHA3.jpg 1000w, https:\/\/snabup-prod.s3.amazonaws.com\/blog\/wp-content\/uploads\/2025\/11\/05124936\/HBHA3-300x150.jpg 300w, https:\/\/snabup-prod.s3.amazonaws.com\/blog\/wp-content\/uploads\/2025\/11\/05124936\/HBHA3-768x384.jpg 768w\" sizes=\"auto, (max-width: 1000px) 100vw, 1000px\" \/><\/figure>\n\n\n\n<p>Transparency works like switching on the lights in a dark room. 
When everyone can see what data an algorithm uses and how it makes decisions, it becomes easier to notice bias before it grows.<\/p>\n\n\n\n<p>An effective way to reduce bias in hiring algorithms is through data audits. Regular audits reveal patterns, such as whether certain keywords are unfairly weighted or whether specific demographics are underrepresented on shortlists. Publicly sharing these audits, even in summary form, builds trust with candidates and regulators while keeping teams accountable.<\/p>\n\n\n\n<p>Transparency also helps recruiters craft fairer interviews. When data shows who gets filtered out, talent teams can optimise DEI interview questions for better hiring or refine their screening prompts. Similarly, transparent algorithms help design better job ads. By understanding which phrases discourage applicants, recruiters can tackle gender bias in job descriptions and learn how to write job ads that attract diverse talent.<\/p>\n\n\n\n<p>Open data makes it easier to experiment with <a href=\"https:\/\/vettio.com\/blog\/blind-resume-screening-bias-evasion-strategy\/\" target=\"_blank\" rel=\"noreferrer noopener\">blind resume screening<\/a> and inclusive pipelines. Recruiters can test how outcomes change when personal identifiers are removed or when training data is adjusted to reflect more balanced talent pools. The result is smarter, fairer and more inclusive hiring decisions.<\/p>\n\n\n\n<p>Lastly, transparent systems create room for innovation. With insights into what works and what doesn\u2019t, companies can apply practical diversity sourcing tips for talent assessment and unlock diverse talent pools with AI hiring tools. These insights also improve diversity interview prep for talent teams, helping recruiters understand how subtle language or scoring factors influence bias.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>Conclusion<\/strong><\/h2>\n\n\n\n<p>The problem isn\u2019t that technology is unfair. 
It\u2019s that we sometimes forget to check its reflection. When organisations use AI to make hiring decisions without shining a light on their data, hidden bias thrives quietly. But transparency changes that.<\/p>\n\n\n\n<p>By exposing how algorithms work, companies can reduce <a href=\"https:\/\/vettio.com\/blog\/unconscious-bias-in-recruitment\/\" target=\"_blank\" rel=\"noreferrer noopener\">unconscious bias in hiring<\/a> with Vettio while ensuring every applicant has a fair chance. Fair hiring starts with simple questions: What data are we using, and what story does it tell? The answer, once transparent, is where true diversity begins.<\/p>\n\n\n\n<!-- WordPress-ready FAQ Accordion | Colors: Black, Orange, White | Title is h3 | No white background -->\n<section class=\"vettio-faq\" aria-labelledby=\"vettio-faq-title\">\n  <h3 id=\"vettio-faq-title\">FAQs<\/h3>\n\n  <div class=\"faq-list\">\n    <details class=\"faq-item\">\n      <summary>\n        <span>Why did Amazon discontinue its AI recruiting tool due to bias concerns?<\/span>\n        <svg class=\"chev\" viewBox=\"0 0 24 24\" aria-hidden=\"true\"><path d=\"M6 9l6 6 6-6\"\/><\/svg>\n      <\/summary>\n      <div class=\"faq-answer\">\n        <p>Amazon ended its experimental AI hiring system after discovering that it systematically favoured male candidates. The algorithm had been trained on past resumes that reflected male-dominated hiring trends, teaching it to prioritise male-related patterns.<\/p>\n      <\/div>\n    <\/details>\n\n    <details class=\"faq-item\">\n      <summary>\n        <span>Are there laws in the US that regulate bias in hiring algorithms?<\/span>\n        <svg class=\"chev\" viewBox=\"0 0 24 24\" aria-hidden=\"true\"><path d=\"M6 9l6 6 6-6\"\/><\/svg>\n      <\/summary>\n      <div class=\"faq-answer\">\n        <p>Yes. 
Several jurisdictions, including New York City and Illinois, have introduced regulations requiring employers to audit or disclose the use of automated employment decision tools to ensure fairness and non-discrimination.<\/p>\n      <\/div>\n    <\/details>\n\n    <details class=\"faq-item\">\n      <summary>\n        <span>Is it possible for hiring algorithms to unintentionally favor male candidates over female candidates?<\/span>\n        <svg class=\"chev\" viewBox=\"0 0 24 24\" aria-hidden=\"true\"><path d=\"M6 9l6 6 6-6\"\/><\/svg>\n      <\/summary>\n      <div class=\"faq-answer\">\n        <p>Yes. When an algorithm learns from old hiring data that already reflects bias, it can start repeating those same patterns. In many male-heavy fields, this means men often get scored higher than women even when both have equal skills.<\/p>\n      <\/div>\n    <\/details>\n\n    <details class=\"faq-item\">\n      <summary>\n        <span>What are the risks of relying solely on AI for recruitment decisions?<\/span>\n        <svg class=\"chev\" viewBox=\"0 0 24 24\" aria-hidden=\"true\"><path d=\"M6 9l6 6 6-6\"\/><\/svg>\n      <\/summary>\n      <div class=\"faq-answer\">\n        <p>Overreliance can lead to uniform candidate profiles and amplify hidden biases, reducing diversity and innovation within the workforce.<\/p>\n      <\/div>\n    <\/details>\n\n    <details class=\"faq-item\">\n      <summary>\n        <span>How do biases in AI hiring tools affect workplace diversity?<\/span>\n        <svg class=\"chev\" viewBox=\"0 0 24 24\" aria-hidden=\"true\"><path d=\"M6 9l6 6 6-6\"\/><\/svg>\n      <\/summary>\n      <div class=\"faq-answer\">\n        <p>They limit representation by filtering out qualified candidates from underrepresented racial, gender or educational backgrounds, shrinking the potential of diverse and inclusive teams.<\/p>\n      <\/div>\n    <\/details>\n\n    <details class=\"faq-item\">\n      <summary>\n        <span>What is intersectional bias in the context of hiring 
algorithms?<\/span>\n        <svg class=\"chev\" viewBox=\"0 0 24 24\" aria-hidden=\"true\"><path d=\"M6 9l6 6 6-6\"\/><\/svg>\n      <\/summary>\n      <div class=\"faq-answer\">\n        <p>It refers to overlapping discrimination in which an algorithm disadvantages candidates based on multiple identity factors, such as gender and ethnicity, simultaneously.<\/p>\n      <\/div>\n    <\/details>\n\n    <details class=\"faq-item\">\n      <summary>\n        <span>Can hiring algorithms penalize candidates based on the college they attended?<\/span>\n        <svg class=\"chev\" viewBox=\"0 0 24 24\" aria-hidden=\"true\"><path d=\"M6 9l6 6 6-6\"\/><\/svg>\n      <\/summary>\n      <div class=\"faq-answer\">\n        <p>Yes. When an algorithm places too much weight on specific schools or degrees, it can end up ignoring capable people from smaller or lesser-known institutions. This often repeats old patterns of advantage and shuts out skilled candidates who deserve a fair look.<\/p>\n      <\/div>\n    <\/details>\n\n    <details class=\"faq-item\">\n      <summary>\n        <span>How do biased hiring algorithms affect candidates with disabilities?<\/span>\n        <svg class=\"chev\" viewBox=\"0 0 24 24\" aria-hidden=\"true\"><path d=\"M6 9l6 6 6-6\"\/><\/svg>\n      <\/summary>\n      <div class=\"faq-answer\">\n        <p>They may misinterpret gaps in employment or unconventional resume formats as negatives, overlooking candidates who bring valuable perspectives and resilience.<\/p>\n      <\/div>\n    <\/details>\n\n    <details class=\"faq-item\">\n      <summary>\n        <span>What is the role of transparency in addressing bias in hiring algorithms?<\/span>\n        <svg class=\"chev\" viewBox=\"0 0 24 24\" aria-hidden=\"true\"><path d=\"M6 9l6 6 6-6\"\/><\/svg>\n      <\/summary>\n      <div class=\"faq-answer\">\n        <p>Transparency allows recruiters, regulators and candidates alike to see how decisions are made. 
It encourages open discussion, fair audits and continuous improvement in data practices to build equitable hiring systems.<\/p>\n      <\/div>\n    <\/details>\n  <\/div>\n<\/section>\n\n<style>\n  .vettio-faq{\n    width:100%;\n    max-width:100%;\n    margin:0 auto;\n    padding:24px;\n    background:#0f1012;\n    color:#f8f9fb;\n    box-sizing:border-box;\n  }\n  .vettio-faq h3{\n    margin:0 0 16px 0;\n    font-size:1.25rem;\n    line-height:1.3;\n    color:#ffffff;\n    letter-spacing:.2px;\n  }\n  .faq-list{\n    display:block;\n    width:100%;\n  }\n  .faq-item{\n    border:1px solid rgba(255,255,255,0.08);\n    border-left:6px solid #ff7a00;\n    background:rgba(255,255,255,0.02);\n    border-radius:12px;\n    margin:10px 0;\n    overflow:hidden;\n  }\n  .faq-item[open]{\n    background:rgba(255,122,0,0.08);\n    border-color:rgba(255,255,255,0.12);\n    border-left-color:#ff7a00;\n  }\n  .faq-item summary{\n    cursor:pointer;\n    list-style:none;\n    display:flex;\n    align-items:center;\n    justify-content:space-between;\n    gap:16px;\n    padding:16px 18px;\n    font-weight:600;\n    color:#ffffff;\n    outline:none;\n  }\n  .faq-item summary::-webkit-details-marker{display:none;}\n  .faq-item summary:focus-visible{\n    box-shadow:0 0 0 3px rgba(255,122,0,0.6);\n    border-radius:10px;\n  }\n  .faq-item .chev{\n    width:20px;\n    height:20px;\n    flex:0 0 20px;\n    transform:rotate(0deg);\n    transition:transform .2s ease;\n    fill:none;\n    stroke:#ff7a00;\n    stroke-width:2;\n    stroke-linecap:round;\n    stroke-linejoin:round;\n  }\n  .faq-item[open] .chev{ transform:rotate(180deg); }\n\n  .faq-answer{\n    padding:0 18px 16px 18px;\n    color:#e8eaee;\n  }\n  .faq-answer p{\n    margin:0;\n    line-height:1.6;\n  }\n\n  \/* Orange highlight for key phrases if needed *\/\n  .vettio-faq em,\n  .vettio-faq strong{\n    color:#ffd4b0;\n    font-weight:600;\n  }\n\n  \/* Ensure links are readable on dark bg *\/\n  .vettio-faq a{\n    
color:#ff9a3c;\n    text-decoration:none;\n  }\n  .vettio-faq a:hover{\n    text-decoration:underline;\n  }\n<\/style>\n\n<script>\n  \/\/ Optional: allow only one FAQ open at a time\n  (function(){\n    var items = document.querySelectorAll('.vettio-faq .faq-item');\n    items.forEach(function(d){\n      d.addEventListener('toggle', function(){\n        if(d.open){\n          items.forEach(function(other){\n            if(other !== d) other.removeAttribute('open');\n          });\n        }\n      });\n    });\n  })();\n<\/script>\n\n\n\n<p><\/p>\n\n\n\n<div class=\"wp-block-buttons text-center is-content-justification-center is-layout-flex wp-container-core-buttons-is-layout-1 wp-block-buttons-is-layout-flex\">\n<div class=\"wp-block-button\"><a class=\"wp-block-button__link has-text-align-center wp-element-button\" href=\"https:\/\/vettio.com\" target=\"_blank\" rel=\"noreferrer noopener\"><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong>Cut the Clutter. 
Hire Better.<\/strong><\/strong><\/strong><\/strong><\/strong><\/strong><\/strong><\/strong><\/strong><\/strong><\/strong><\/strong><\/strong><\/strong><\/strong><\/strong><\/strong><\/strong><\/strong><\/strong><\/strong><\/strong><\/strong><\/a><\/div>\n<\/div>\n","protected":false},"excerpt":{"rendered":"<p>Learn how transparency reduces bias in hiring algorithms and builds fair, diverse and inclusive workplaces with data-driven hiring decisions<\/p>\n","protected":false},"author":5,"featured_media":7681,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_kad_blocks_custom_css":"","_kad_blocks_head_custom_js":"","_kad_blocks_body_custom_js":"","_kad_blocks_footer_custom_js":"","_kadence_starter_templates_imported_post":false,"footnotes":""},"categories":[26],"tags":[41],"class_list":["post-7630","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-data-driven-recruitment","tag-smarter-hiring"],"taxonomy_info":{"category":[{"value":26,"label":"Data-Driven Recruitment"}],"post_tag":[{"value":41,"label":"Smarter Hiring"}]},"featured_image_src_large":["https:\/\/snabup-prod.s3.amazonaws.com\/blog\/wp-content\/uploads\/2025\/11\/07064937\/How-Data-Transparency-Can-Fix-Hidden-Bias-in-Hiring-Algorithms.jpg",800,400,false],"author_info":{"display_name":"Bisma Naeem","author_link":"https:\/\/vettio.com\/blog\/author\/bisma-naeem\/"},"comment_info":7,"category_info":[{"term_id":26,"name":"Data-Driven Recruitment","slug":"data-driven-recruitment","term_group":0,"term_taxonomy_id":26,"taxonomy":"category","description":"","parent":83,"count":22,"filter":"raw","cat_ID":26,"category_count":22,"category_description":"","cat_name":"Data-Driven Recruitment","category_nicename":"data-driven-recruitment","category_parent":83}],"tag_info":[{"term_id":41,"name":"Smarter 
Hiring","slug":"smarter-hiring","term_group":0,"term_taxonomy_id":41,"taxonomy":"post_tag","description":"","parent":0,"count":54,"filter":"raw"}],"_links":{"self":[{"href":"https:\/\/vettio.com\/blog\/wp-json\/wp\/v2\/posts\/7630","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/vettio.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/vettio.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/vettio.com\/blog\/wp-json\/wp\/v2\/users\/5"}],"replies":[{"embeddable":true,"href":"https:\/\/vettio.com\/blog\/wp-json\/wp\/v2\/comments?post=7630"}],"version-history":[{"count":10,"href":"https:\/\/vettio.com\/blog\/wp-json\/wp\/v2\/posts\/7630\/revisions"}],"predecessor-version":[{"id":7687,"href":"https:\/\/vettio.com\/blog\/wp-json\/wp\/v2\/posts\/7630\/revisions\/7687"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/vettio.com\/blog\/wp-json\/wp\/v2\/media\/7681"}],"wp:attachment":[{"href":"https:\/\/vettio.com\/blog\/wp-json\/wp\/v2\/media?parent=7630"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/vettio.com\/blog\/wp-json\/wp\/v2\/categories?post=7630"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/vettio.com\/blog\/wp-json\/wp\/v2\/tags?post=7630"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}