Manchester has the lowest AI visibility of any city we've audited.
We ran 228 AI prompts across 4 platforms, testing 15 Manchester aesthetic clinics. The results: 1.9% average visibility. Perplexity returns 0%.
3 visible. 12 invisible.
Each dot represents one of the 15 Manchester aesthetic clinics in our audit. Only 3 appeared in any AI answer.
No dominant player. An open field.
Unlike Birmingham (where one clinic holds 81.3%), Manchester has no entrenched leader. The top clinic reaches just 20.8% — the lowest top-ranking rate of any city we have audited. The position is entirely claimable.
Visibility rate = percentage of relevant prompt runs in which the clinic was mentioned by an AI platform. Clinic names anonymised. Linksii audit, Manchester, April 2026.
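The visibility-rate metric defined above can be sketched in a few lines. This is a minimal illustration, not Linksii's actual pipeline: the record shape and field names (`clinic`, `mentioned`) are assumptions for the example.

```python
# Sketch of the visibility-rate metric: the percentage of relevant
# prompt runs in which a clinic was mentioned by an AI platform.
# Record shape is illustrative, not Linksii's actual schema.

def visibility_rate(runs, clinic):
    """Percentage of this clinic's prompt runs that mentioned it."""
    relevant = [r for r in runs if r["clinic"] == clinic]
    if not relevant:
        return 0.0
    mentioned = sum(1 for r in relevant if r["mentioned"])
    return 100.0 * mentioned / len(relevant)

runs = [
    {"clinic": "Clinic A", "mentioned": True},
    {"clinic": "Clinic A", "mentioned": False},
    {"clinic": "Clinic A", "mentioned": False},
    {"clinic": "Clinic B", "mentioned": False},
]
print(round(visibility_rate(runs, "Clinic A"), 1))  # 33.3
```

A clinic's audit score is this rate averaged across platforms and prompt types; the city-level 1.9% figure is the same calculation pooled over all 15 clinics.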
A 20.8% top rate means there is no entrenched market leader in Manchester's AI layer. Any clinic that builds visibility systematically will not be climbing past a competitor with years of accumulated signals; it will be defining the field from scratch.
Perplexity returns nothing. Claude is near-zero. Gemini leads — at 3.9%.
Manchester has the worst platform-level visibility of any city in our audit. Perplexity at 0.0% means Manchester aesthetic clinics are completely absent from one of the UK's most-used AI search tools.
Average visibility rate — Manchester clinics across 15 prompt types
Zero mentions across all Manchester clinic prompts
Perplexity at 0% is fixable, and quickly. Perplexity responds primarily to indexed web citations — directories, review platforms, structured data. A clinic that builds structured directory presence can expect Perplexity movement within four to six weeks. Claude (0.6%) responds to editorial signals and takes longer. The fastest early gains will come from Perplexity and Gemini.
Manchester clinics nearly disappear at the consideration stage.
The consideration-stage collapse to 0.7% is the sharpest we have seen in any city audited. When patients ask AI to compare options, Manchester clinics become effectively invisible.
Why consideration collapses to 0.7%
Consideration prompts ask AI to compare clinics on specific dimensions: treatment expertise, practitioner experience, specialist skills. This requires AI to have treatment-specific content to draw on — detailed service pages, clinical credentials, procedure-specific reviews. Manchester clinics largely lack this depth of indexed content, so AI defaults to not naming them when patients are comparing options. This is the highest-priority gap to close: it is where patients are making their shortlist.
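The funnel breakdown described above reduces to grouping prompt runs by journey stage and taking the mention rate within each group. A minimal sketch, with illustrative stage labels and toy data rather than the audit's raw records:

```python
# Sketch of the patient-journey breakdown: bucket prompt runs by
# journey stage, then compute the mention rate within each bucket.
# The audit reported 2.9% / 0.7% / 2.8% for Manchester's three stages;
# the data below is illustrative only.

def stage_rates(runs):
    by_stage = {}
    for r in runs:
        by_stage.setdefault(r["stage"], []).append(r["mentioned"])
    return {s: 100.0 * sum(v) / len(v) for s, v in by_stage.items()}

runs = [
    {"stage": "awareness", "mentioned": True},
    {"stage": "awareness", "mentioned": False},
    {"stage": "consideration", "mentioned": False},
]
print(stage_rates(runs))  # {'awareness': 50.0, 'consideration': 0.0}
```

A consideration-stage rate far below the other two stages, as in Manchester, points to missing comparison-grade content rather than a general awareness problem.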
89% mentioned when asked. 1.9% recommended unprompted.
The gap is 87 percentage points. Manchester clinics exist in AI's knowledge — they just have almost none of the authority signals AI needs to recommend them. The 89% brand-direct rate tells us the problem is not awareness but authority. Closing that gap is a signal-building problem, not a brand-awareness problem.
Where AI gets its information about Manchester clinics.
568 total citations recorded across 228 prompt runs. Trustpilot carries unusual weight in Manchester relative to other cities in our audit.
Clinic-specific website domains aggregated. Linksii audit, Manchester, April 2026.
WhatClinic leads directories
69 citations — the dominant third-party source for Manchester, consistent with every other city in our audit.
Trustpilot unusually strong
62 combined Trustpilot citations — higher than in most other cities. Review credibility signals carry above-average weight for Greater Manchester AI queries.
Own site is the anchor
296 clinic website citations — building treatment-specific content that AI can reference is the highest-volume single action available.
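The citation-source mapping behind these counts amounts to bucketing every cited URL by domain and tallying. A minimal sketch, assuming each prompt run logs a list of cited URLs (the field name and sample URLs are illustrative, not Linksii's actual schema):

```python
from collections import Counter
from urllib.parse import urlparse

# Sketch of citation-source mapping: bucket every cited URL by its
# domain and tally. Field names and sample URLs are illustrative.

def tally_sources(runs):
    counts = Counter()
    for run in runs:
        for url in run["citations"]:
            counts[urlparse(url).netloc] += 1
    return counts

runs = [
    {"citations": ["https://www.whatclinic.com/a", "https://uk.trustpilot.com/b"]},
    {"citations": ["https://www.whatclinic.com/c"]},
]
print(tally_sources(runs).most_common(1))  # [('www.whatclinic.com', 2)]
```

In a production pipeline, clinic-owned domains would then be aggregated into a single "own site" category, which is how the 296-citation figure above is reported.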
The signals that matter for Manchester clinics
UK regulatory directories
- JCCP (Joint Council for Cosmetic Practitioners)
- Save Face — the government-approved register
- WhatClinic — 69 citations, the leading directory in our audit
- Doctify — high-weight medical authority signal
These are the UK authority sources AI models treat as trusted signals for clinic legitimacy.
Review and trust signals
- Trustpilot — 62 combined citations, above-average weight for Manchester queries
- Google Business Profile — fully completed, regularly updated
- Treatwell — 30 citations, booking-oriented discovery signal
- Third-party patient reviews on multiple platforms
Review credibility carries particularly high weight for Greater Manchester AI queries relative to other cities.
Manchester and regional press
- Manchester Evening News — highest-authority local publication
- Manchester Mill — independent, high engagement
- Cheshire Life — premium Cheshire and South Manchester audience
- North West regional lifestyle publications
Editorial mentions drive ChatGPT and Claude signals — the two platforms most in need of improvement for Manchester clinics.
Treatment-specific content
- Individual treatment pages with clinician-level specificity
- Practitioner credentials and qualifications detailed on-site
- Before/after content with treatment context
- Consistent NAP (name, address, phone) across all sources
The 0.7% consideration-stage rate is driven primarily by absence of treatment-specific indexed content. This is the priority fix.
Manchester is the lowest-visibility market we've audited. That means the most to gain.
Every data point on this page came from running Linksii — our proprietary AI visibility platform — against the Manchester market. We built Linksii because manual queries and commodity tools don't give the depth needed to act on AI visibility with confidence.
A clinic-specific Linksii audit tells you exactly where you sit: your visibility rate across all four platforms, your patient journey funnel, which sources are citing you, and what's missing. Given Manchester's near-zero baseline, early movers will have the entire field to themselves.
Four platforms tested
ChatGPT, Claude, Gemini, and Perplexity — including the platforms currently returning 0% for Manchester clinics.
Full patient journey
Awareness, consideration, and decision-stage prompts — so you can see why consideration collapses and what fixes it.
Citation source mapping
Which sources are driving your mentions — and which specific gaps explain the low baseline.
Prioritised action plan
A clear ordered list of what to address first, tied to your specific findings and the Manchester market context.
Further reading
AI visibility data: Belfast market audit
20 clinics, 287 prompt runs. 70% invisible. The first city-level Linksii audit we published.
AI visibility data: London market audit
20 clinics, 288 prompt runs. The most competitive UK market — and how it compares.
AI visibility data: Birmingham market audit
20 clinics, 288 prompt runs. 85% invisible. One clinic at 81.3%.
AI visibility data: Dublin market audit
20 clinics, 288 prompt runs. The most distributed visibility profile of any city audited.
Get your free AI visibility audit
See exactly how ChatGPT, Claude, Gemini, and Perplexity represent your clinic today.
How Orbyt works
The ongoing retainer service — brand, AI visibility management, and monthly strategy.
Common questions from Manchester clinics
What does the Orbyt audit show about Manchester clinics?
We ran 228 prompt queries across ChatGPT, Claude, Gemini, and Perplexity — testing 15 Manchester aesthetic clinics across 15 different patient prompt types. The results are striking: 80% of Manchester clinics are completely invisible across all four platforms. Only 3 clinics have any AI presence at all. The average visibility rate is just 1.9% — the lowest of any city in our audit. Perplexity returns 0.0% for Manchester clinics, and Claude barely registers at 0.6%. Manchester has the most acute AI visibility gap of any city we have audited.
Why is Manchester's AI visibility so much lower than other cities?
At 1.9% average visibility, Manchester is significantly below Belfast (6%), Dublin (6.5%), and Birmingham (5.9%). The likely explanation is a combination of lower directory citation density and weaker editorial coverage in sources AI models weight highly. Manchester clinics appear to have fewer structured citations on the third-party platforms AI draws on — WhatClinic, Doctify, and health-specific directories — relative to the number of clinics in the market. Perplexity's 0.0% rate is particularly telling: Perplexity relies heavily on indexed web sources, and Manchester aesthetic clinics have almost no presence in the sources it trusts.
Why is the consideration stage almost zero in Manchester?
Our patient journey analysis shows Manchester clinics average 2.9% visibility at awareness, 0.7% at consideration, and 2.8% at decision. The consideration-stage collapse to 0.7% is the sharpest we have seen in any city audited. Consideration prompts ask AI to compare clinics on specific treatment dimensions — lip filler expertise, practitioner experience, skin rejuvenation specialists. Without treatment-specific content and third-party validation, AI has nothing to draw on for comparison queries. Manchester clinics essentially disappear when patients start comparing options, then partially re-emerge at the decision stage via brand-direct signals.
What sources does AI rely on for Manchester clinic recommendations?
Our audit recorded 568 total citations across 228 prompt runs. Clinic websites (combined) drove 296 citations — the largest category. Trustpilot (62 combined), WhatClinic (69), Facebook (57), Instagram (54), and Treatwell (30) are the main third-party sources. WhatClinic leads among directories at 69 citations — a familiar pattern across all cities we have audited. For Manchester specifically, Trustpilot carries unusual weight relative to other cities, suggesting that review-based credibility signals are particularly influential for Greater Manchester queries.
What does Perplexity's 0% rate mean for Manchester clinics?
Perplexity returned 0.0% visibility across all Manchester clinic prompts in our audit — meaning it did not mention a single Manchester clinic in any relevant query response. Perplexity builds its answers primarily from indexed web sources and citation networks. A 0.0% rate indicates that Manchester aesthetic clinics have almost no presence in the web sources Perplexity trusts and indexes. This is fixable: Perplexity responds quickly to new citations from high-authority sources. A clinic that builds structured directory and review presence will likely see Perplexity movement faster than ChatGPT or Claude, which rely more on training data and editorial depth.
How competitive is Manchester for AI visibility?
Currently, competition is minimal. Only 3 out of 15 audited clinics have any AI presence, and the top clinic reaches just 20.8% — lower than the top-ranked clinics in Belfast (50%), Dublin (35.4%), or Birmingham (81.3%). There is no dominant player in Manchester. That makes the first-mover opportunity significant: any clinic that builds a structured AI visibility programme will be entering an effectively empty field at the AI layer. Given that Manchester is one of the UK's largest aesthetic markets by patient volume, the gap between market size and AI presence is unusually large.
How does Orbyt's Linksii platform audit Manchester clinics?
Linksii is a proprietary AI visibility monitoring platform built by Orbyt. It runs structured prompt batches across ChatGPT, Claude, Gemini, and Perplexity — testing visibility at awareness, consideration, and decision stages of the patient journey. For Manchester, we configure prompts to reflect how Greater Manchester patients search: both Manchester-specific and broader North West queries. We map citation sources, measure visibility rates per clinic and per platform, and identify the specific gaps driving poor performance. The Manchester audit on this page was generated entirely through Linksii.
How long before a Manchester clinic sees AI visibility improvements?
Given the extremely low baseline — 1.9% average, Perplexity at 0% — even modest early actions should produce visible movement relatively quickly. Perplexity and Gemini (3.9%) respond to directory and review signals within weeks of publication. ChatGPT and Claude take longer, responding more to editorial and training-data-weighted sources. A Manchester clinic starting from zero could realistically see Perplexity movement within four to six weeks, Gemini and ChatGPT movement within two to three months, and meaningful decision-stage presence within four months of an active programme.
Be the first Manchester clinic AI recommends
We run a full AI visibility audit using our Linksii platform — testing ChatGPT, Claude, Gemini, and Perplexity against Greater Manchester patient queries. No pitch, no strings. You keep the report.
Get Your Free Manchester Audit