SEO + GEO · Technical SEO
Technical SEO · Core Web Vitals, crawl, indexation and rendering so your content actually ranks
Technical SEO is the invisible foundation that makes content rank: Core Web Vitals in Google’s 75th percentile, optimized crawl budget, SSR rendering solved, validated @graph schema, coherent hreflang and canonicals. Without that foundation, neither the best content nor the best link building moves the needle.
- Core Web Vitals in the green (LCP < 2.5s, INP < 200ms, CLS < 0.1) at Google’s real 75th percentile
- Optimized crawl budget: prioritized crawlable URLs, parameters and facets under control
- Time to indexation reduced to days, not weeks, on newly published pages
- 0 critical errors in Search Console: coverage, mobile usability and schema in the green
Technical SEO is the discipline that optimizes web infrastructure —performance (Core Web Vitals), crawlability, indexability, rendering, URL architecture, hreflang, canonicals, robots, sitemap and structured data— so search engines can discover, render, understand and index every page correctly. Unlike content SEO or link building, technical SEO is not about writing better or earning links: it’s about making sure the machine can read what you already have. It’s a precondition, not a substitute. A technically solid site is one that both ranks on Google and gets cited by ChatGPT, Perplexity, Gemini and AI Overviews.
What technical SEO covers
10 audit areas we work on every project
Complete technical SEO stack applied to B2B environments: WordPress, Next.js, Shopify, headless and complex ecommerce. Execution order: top first.
- 01
Crawlability
Bot access to every important URL: `robots.txt`, meta robots, `X-Robots-Tag` HTTP headers, blocks via authentication or firewall. A single misplaced `Disallow` wipes out entire sections.
- 02
Indexation
Which URLs should (and which URLs should not) be in Google’s index. Coverage review in Search Console, correct canonicals, duplicates, thin content and zombie URLs purged.
- 03
Rendering
How Google (and the LLMs) processes your JavaScript content: SSR, SSG, ISR or CSR. We verify that the rendered HTML matches the HTML the bot sees, with no partial hydration or critical post-JS content.
- 04
Core Web Vitals
LCP, INP and CLS measured in the field (CrUX), not in the lab. Goal: 75th percentile in the green on mobile and desktop. Explicit performance budget per key page.
- 05
Schema · JSON-LD
Structured data as an enriched @graph: Organization, WebPage, Article, Service, FAQPage, BreadcrumbList, Product, Review. Validated with Rich Results and Schema.org validator.
- 06
Hreflang · international
Multi-language and multi-country sites: bidirectional `hreflang`, `x-default`, sitemap with annotations and coherence with canonicals. A single mistake here sinks rankings in your main market.
- 07
Canonicals
One canonical per URL, absolute, coherent with `hreflang`, sitemap and internal links. No self-loops, no chains, no conflict with tracking parameters.
- 08
Robots · sitemap
Clean `robots.txt` with explicit access for Googlebot, Bingbot, GPTBot, PerplexityBot, Google-Extended and ClaudeBot. Segmented XML sitemap with real `lastmod` and no 3xx/4xx URLs.
- 09
URL architecture
Folder structure, slugs in the right language, faceted parameters, pagination, 301 redirects with no chains and no self-loops. Every URL with a clear indexation purpose.
- 10
SSR · CSR · hydration
Technical decision per template: what is SSR, what is SSG, what is CSR. Pre-JS vs post-JS HTML measurement so the bot sees the same content as the user.
Core Web Vitals
INP, LCP and CLS: the 3 metrics Google ranks on
Google has used Core Web Vitals as a real ranking signal since 2021; INP replaced FID in March 2024. The “good” thresholds are measured at the 75th percentile of real users (CrUX field data), not in a Lighthouse lab run.
| Metric | What it measures | Good (p75) | Needs improvement | Poor |
|---|---|---|---|---|
| LCP · Largest Contentful Paint | Time to paint the main visible element (hero, featured image, H1) | < 2.5 s | 2.5 – 4.0 s | > 4.0 s |
| INP · Interaction to Next Paint | Latency of the user’s worst interaction with the page (click, tap, keyboard) | < 200 ms | 200 – 500 ms | > 500 ms |
| CLS · Cumulative Layout Shift | Visual stability: how much elements move during load | < 0.1 | 0.1 – 0.25 | > 0.25 |
| TTFB · Time to First Byte | Server time to the first byte (diagnostic, not a direct ranking signal) | < 800 ms | 800 – 1800 ms | > 1800 ms |
Levers that move CWV in production: preload the LCP image with `fetchpriority="high"`, inline critical CSS, always declare `width`/`height` on images, defer non-critical JS, avoid heavy hydration in the hero, load fonts with `font-display: swap` plus preload, and watch the third-party scripts (analytics, chat, ads) that drive up INP.
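As a sketch, those levers translate into head and hero markup roughly like this (file paths, the font and the hero image are illustrative, not taken from any real template):

```html
<head>
  <!-- Preload the LCP image so the browser fetches it at highest priority -->
  <link rel="preload" as="image" href="/img/hero.webp" fetchpriority="high">
  <!-- Preload the main webfont; font-display: swap avoids invisible text while it loads -->
  <link rel="preload" as="font" type="font/woff2" href="/fonts/main.woff2" crossorigin>
  <style>
    /* Critical CSS inlined: above-the-fold rules only */
    @font-face {
      font-family: "Main";
      src: url(/fonts/main.woff2) format("woff2");
      font-display: swap;
    }
  </style>
  <!-- Non-critical JS deferred so it doesn't block rendering or compete with the LCP request -->
  <script src="/js/app.js" defer></script>
</head>
<body>
  <!-- Explicit width/height reserve layout space and prevent CLS -->
  <img src="/img/hero.webp" width="1200" height="630" fetchpriority="high" alt="Hero">
</body>
```

The key pairing is `fetchpriority="high"` on both the preload and the `<img>` itself, so the LCP resource wins the bandwidth race against fonts and scripts.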
Crawl + indexation
Crawl budget, JS rendering, robots.txt and sitemap
On large sites (>10k URLs, ecommerce, media) crawl budget is a finite resource. On small sites what matters is correct rendering and canonical-sitemap-hreflang coherence.
- 01
Crawl budget and logs
Server log analysis to see which URLs Googlebot crawls, how often and what response it gets. Detects traps: infinite facets, calendars, tracking parameters, sessions in URLs.
- 02
JavaScript rendering
Googlebot processes JS but with delay (two-wave indexing). SSR or SSG solves the problem on the first pass. Pure CSR is only acceptable in private or non-indexable areas.
- 03
Robots.txt and AI bots
Explicit access for Googlebot, Bingbot and relevant AI crawlers: GPTBot, PerplexityBot, Google-Extended, ClaudeBot, OAI-SearchBot. Blocking by omission = disappearing from generative results.
- 04
Segmented XML sitemap
One sitemap per content type (posts, pages, products, categories), master index, real `lastmod`, no 3xx/4xx/canonicalized URLs, registered in Search Console and Bing Webmaster Tools.
- 05
Coverage in Search Console
Weekly review of the pages report: indexed, excluded, discovered not indexed, crawled not indexed, 5xx errors, redirect loops. Each anomaly with a technical ticket.
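The robots and sitemap points above can be sketched in a minimal `robots.txt` (domain, paths and the blocked parameter are illustrative; note that once a bot has its own group, it follows only that group):

```txt
# Sketch only — domain and paths are illustrative
User-agent: *
# Faceted/sorting parameters are classic crawl traps
Disallow: /*?orderby=
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

# Explicit access for AI crawlers (blocking them = disappearing from generative results)
User-agent: GPTBot
Allow: /

User-agent: PerplexityBot
Allow: /

User-agent: Google-Extended
Allow: /

Sitemap: https://www.example.com/sitemap_index.xml
```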
Web architecture
URLs, hreflang and canonicals coherent with each other
The URL + canonical + hreflang triangle must be consistent. An error in one of the three vertices breaks the other two and drags down entire rankings.
- 01
URL structure
Folder hierarchy that reflects the real business taxonomy. Slugs in the market language, no stop-words (the/a/an), lowercase, hyphen-separated, no numeric IDs or session parameters.
- 02
Bidirectional hreflang
Each language version declares all the others and itself. `x-default` for the fallback. Coherent with `<link rel="canonical">` and the sitemap. Validated with dedicated tools (Merkle, Sistrix).
- 03
Canonicals with no loops
A single absolute canonical per URL. No chains, no self-loops, no conflict with `hreflang` or with UTM/GCLID tracking parameters. Consistent with internal linking.
- 04
Governed 301 redirects
3-tier policy: `.htaccess` only for infrastructure (HTTPS, domain), a redirects plugin for slug migrations, `functions.php` only for programmatic rules with documented reasoning.
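A minimal sketch of a coherent canonical + hreflang head for a two-language site (all URLs are invented for the example; the same set must appear, bidirectionally, on the Spanish version too):

```html
<!-- On https://www.example.com/en/services/ -->
<link rel="canonical" href="https://www.example.com/en/services/">
<!-- Bidirectional hreflang: every version lists all the others AND itself -->
<link rel="alternate" hreflang="en" href="https://www.example.com/en/services/">
<link rel="alternate" hreflang="es" href="https://www.example.com/es/servicios/">
<link rel="alternate" hreflang="x-default" href="https://www.example.com/en/services/">
```

The canonical of each version points to itself, never to the other language; a canonical pointing cross-language is exactly the hreflang/canonical conflict described above.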
Schema · JSON-LD
Structured data Google and LLMs use as ground truth
JSON-LD schema is not an extra: it’s the syntactic map engines use to understand without ambiguity. We deploy it as a @graph with all entities linked by @id.
- Organization
Organization
Company entity with `sameAs` to LinkedIn, Crunchbase, GMB, Wikidata. With `aggregateRating` if there are verifiable reviews.
- WebPage
WebPage + speakable
Correct type per template (WebPage, AboutPage, ContactPage, CollectionPage, ItemPage). `SpeakableSpecification` pointing to the H1, TL;DR and FAQ.
- Article
Article + Author Person
For blog and resources. With `author` as Person, `datePublished`, `dateModified`, `wordCount`, `about` linked to Wikidata entities.
- Service
Service / Product
For service pages or product sheets. With structured `provider`, `areaServed`, `serviceType` and `offers`.
- FAQPage
FAQPage
Frequently asked questions marked up as `Question` + `acceptedAnswer`. Answers of 40-80 words, citable by LLMs.
- Breadcrumb
BreadcrumbList
Navigational path with `itemListElement` positioned 1-N. Reinforces the architecture and appears in the SERP as a visual breadcrumb.
cronuts rule: schema is deployed via ACF (`schema_jsonld` field) or the CMS equivalent, never hardcoded in the theme. That way each page can adjust its graph without touching code.
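As an illustration of the @graph pattern, here is a trimmed sketch linking Organization, WebPage (with speakable) and Service by `@id` — the company name, URLs and CSS selectors are invented for the example:

```json
{
  "@context": "https://schema.org",
  "@graph": [
    {
      "@type": "Organization",
      "@id": "https://www.example.com/#org",
      "name": "Example Co",
      "sameAs": ["https://www.linkedin.com/company/example"]
    },
    {
      "@type": "WebPage",
      "@id": "https://www.example.com/services/seo/#webpage",
      "url": "https://www.example.com/services/seo/",
      "publisher": { "@id": "https://www.example.com/#org" },
      "speakable": {
        "@type": "SpeakableSpecification",
        "cssSelector": ["h1", ".tldr"]
      }
    },
    {
      "@type": "Service",
      "@id": "https://www.example.com/services/seo/#service",
      "serviceType": "Technical SEO",
      "provider": { "@id": "https://www.example.com/#org" },
      "mainEntityOfPage": { "@id": "https://www.example.com/services/seo/#webpage" }
    }
  ]
}
```

Because every entity is addressed by `@id`, each node is declared once and referenced everywhere else, instead of duplicating the Organization block on every page.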
cronuts methodology
5 phases to audit and resolve technical SEO debt
Proven sequence on B2B accounts in professional services, SaaS and ecommerce. Each phase has an auditable deliverable.
- 01
Full technical audit
Crawl with Screaming Frog, log analysis, Lighthouse CI, Search Console, CrUX and rendering test. Deliverable: prioritized report with impact and effort per ticket.
- 02
Core Web Vitals and performance
Intervention on LCP, INP and CLS with explicit budget per template. Preload, critical CSS, defer third-party JS, image and font optimization. Real measurement in CrUX, not just Lighthouse.
- 03
Crawl, indexation and rendering
Cleanup of `robots.txt`, segmented sitemap, coherent canonicals, validated hreflang, governed 3-tier redirects and SSR verification on critical templates.
- 04
Schema @graph and GEO
Enriched JSON-LD deployment on key templates: Organization, WebPage, Article, Service, FAQPage, BreadcrumbList and `speakable`. Validated with Rich Results and Schema.org validator.
- 05
Continuous monitoring
Alerts on Search Console coverage, weekly CrUX, monthly crawl diff and schema regressions. Dashboard with technical KPIs the business actually understands.
Technology stack
Tools we use to audit and ship technical SEO
- Crawl
Screaming Frog SEO Spider
Full site crawl: headers, status codes, canonicals, hreflang, schema, images, redirects, JS rendering with headless Chromium.
- Performance
Lighthouse CI + PageSpeed Insights
Performance metrics in lab and field (CrUX). Integrated into the deploy pipeline so we don’t regress on Core Web Vitals.
- Indexation
Google Search Console
Coverage, SERP performance, field Core Web Vitals, schema errors, sitemap, security, manual actions. Primary source of truth.
- Schema
Rich Results Test + Schema.org validator
JSON-LD validation before deploy. Catches errors and warnings that break eligibility for rich snippets and AI Overviews.
- Logs
Log file analyzer
Server log analysis to understand which URLs Googlebot crawls, how often and with what response. Critical on large sites.
- Monitoring
Notion · Looker dashboard
Monthly deliverable with KPIs: CWV p75, coverage, indexed/excluded pages, valid schema, crawl errors, broken redirects.
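As a sketch of the “integrated into the deploy pipeline” idea, a minimal `lighthouserc.json` could assert budgets on every build (URLs and thresholds are illustrative; Lighthouse lab runs don’t measure INP, so Total Blocking Time is the usual lab proxy):

```json
{
  "ci": {
    "collect": {
      "url": ["https://www.example.com/", "https://www.example.com/services/seo/"],
      "numberOfRuns": 3
    },
    "assert": {
      "assertions": {
        "largest-contentful-paint": ["error", { "maxNumericValue": 2500 }],
        "cumulative-layout-shift": ["error", { "maxNumericValue": 0.1 }],
        "total-blocking-time": ["warn", { "maxNumericValue": 200 }]
      }
    }
  }
}
```

A failed assertion fails the build, which is what actually prevents CWV regressions from reaching production; field (CrUX) data then confirms the p75 over the following weeks.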
Does your site meet Core Web Vitals on Google?
If the answer is “I don’t know,” it doesn’t. We measure it for you in an initial technical audit and hand you the 90-day plan to go green on LCP, INP and CLS, with coherent crawl, indexation and schema.