Agency guide

Education Marketing Agency Data: A Practical Guide for EdTech Agencies

Reviewed by John Thomas, Founder, SchoolIntel. Last reviewed May 2026.

Education marketing agencies serving EdTech clients need more than a school list. They need defensible sources, region-specific account maps, role coverage that matches the client's buying committee, campaign wedges grounded in recent change, and citations a client can read. This guide explains what good agency data looks like, how agencies typically source it today, the limits of each option, and how SchoolIntel becomes the agency's source-of-truth for the international school vertical.

International schools worldwide

~14,000

Source: ISC Research market sizing (2024)

International school students

~6.9 million

Source: ISC Research market sizing (2024)

IB World Schools globally

~5,800

Source: IBO Find an IB School directory

BSO-accredited British schools overseas

~110

Source: GOV.UK — BSO inspection list

COBIS member schools

~470

Source: COBIS school search

Sources SchoolIntel reads per school

8+

Source: SchoolIntel source model


Step 1 · Define the client buying committee

Curriculum + role taxonomy + check size

Establish whether the client sells site-by-site (head of digital learning, IB coordinator) or group-level (group CIO, group head of education). This decides everything downstream.

Agency discovery worksheet

Step 2 · Pick the region scope

Country / city / curriculum stripe

Agencies that try to cover the entire 14,000-school world fail. Pick 1–3 geographies and use SchoolIntel's market pages to build the universe.

Region scoping decision

Step 3 · Source-rank the universe

KHDA, IBO, BSO, COBIS, school sites

Tag each candidate school with which sources confirm its existence, curriculum, and accreditation. This is the data the client actually questions in QBRs.

Public regulator + accreditor sites

Step 4 · Layer the role map

Per-school buyer roles + verification status

A list without role coverage is a school list, not an account list. Match each school to the actual decision-makers; verify emails (SMTP + 90-day re-check).

Agency role taxonomy

Step 5 · Apply the signal layer

Hiring posts, leadership change, accreditation cycle

TES + TIE + KHDA + group press releases. Re-rank weekly so the campaign always works the right schools first.

Public hiring boards + regulator notifications

Step 6 · Build the campaign wedge

One signal + one role + one source citation

Every account in the campaign must have a why-now sentence the client can read. This is the difference between agency data and a spreadsheet.

Agency campaign brief

Deliverable · Market brief

10–20 page country / region brief

Curriculum splits, top groups, regulator cycles, key events. Pulled from KHDA, IBO, BSO, COBIS, and association calendars — every claim cited.

Agency strategy deliverable

Deliverable · Account map

30–80 named schools with cited reasons

Each row: school name, curriculum, group, KHDA / accreditation tier, current signal, and the source URL behind it. Reads like a research note, not a CSV.

Agency account-planning deliverable

Deliverable · Pre-event pack

GESS, BSME, COBIS, EARCOS attendees

Who's likely to attend, what their KHDA / accreditation status is, what to say in the booth conversation. Produced ~30 days before the event.

Event pages + association programmes

Deliverable · Post-event pack

Booth + session attendee follow-up

Re-rank attendees by signal density (new role, recent inspection drop, group expansion). Drives the 30-day follow-up campaign.

Event follow-up workflow

Deliverable · Role-mapped target list

Buying-role-first, school-second view

When a client only sells to IB coordinators or heads of digital learning, the deliverable is a list of those people across schools — not a list of schools.

Role-page taxonomy

Source · KHDA + DSIB (Dubai)

Annual inspection cycle + ratings

Public, regulator-grade. Agencies cite KHDA reports verbatim in client briefs because it pre-empts client objections about list quality.

KHDA portal

Source · IBO Find an IB School

PYP / MYP / DP authorization

Authoritative IB curriculum truth. The cleanest source for IB-specific agency campaigns and the only one IB coordinators actually trust.

IBO directory

Source · BSO + COBIS

British Schools Overseas + COBIS members

Agencies serving British-curriculum vendors anchor account lists here. BSO inspection PDFs are gold — they name strengths and required improvements at strand level.

GOV.UK + COBIS

Source · Hiring boards (TES, TIE, Search Associates)

Live role openings

A new head of school posting in March is a 6–9 month buying signal. The most underused agency input.

TES + TIE + Search Associates

What 'agency data' actually means for EdTech clients

Education marketing agencies that serve EdTech clients are not selling a CSV. They are selling a defensible reason for a school to be in their client's pipeline this quarter. That distinction shows up in every artifact the agency hands over — the market brief, the account map, the pre-event pack, the role-mapped target list. Each one only works if the underlying data answers four questions at once: which schools exist, what curriculum and accreditation they hold, who the actual buying committee is, and what changed recently that justifies the outreach.

The international school market makes that hard. Per ISC Research's global sizing, there are roughly 14,000 international schools serving 6.9 million students worldwide. About 5,800 are IB World Schools, around 470 are COBIS members, and roughly 110 hold British Schools Overseas (BSO) accreditation. Each of those bodies publishes its own canonical list. None of them are merged for you.

The agency's job is to merge them — and then to layer recent change on top so the deliverable reads like a research note, not a directory dump. That is the bar this guide sets, and the bar SchoolIntel is built to clear. For a worked example of the integrated workflow, see the school intelligence for EdTech agencies hub and the marketing to international schools guide.

Schools globally

~14,000

Source: ISC Research

IB World Schools

~5,800

Source: IBO directory

COBIS member schools

~470

Source: COBIS school search

The four questions every agency deliverable must answer

When a client reads an agency target list, these are the questions they ask within thirty seconds. If any one of them lacks a defensible answer, the deliverable feels thin:

  • Does this school exist as described? Cited from a regulator or accreditor — KHDA, ADEK, IBO, BSO, COBIS, or NEASC. Not from a directory aggregator.
  • Is the curriculum stripe correct? Cited from the curriculum body (IBO, Cambridge International) or accreditor.
  • Who is the actual buyer? Mapped to a role taxonomy — IB coordinator, head of digital learning, EAL/ELL lead, group CIO. See the head of digital learning role page for an example.
  • Why now? A signal — KHDA rating change, new principal, IB authorization window, group expansion announcement. The why-now is what justifies the campaign spend.
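Bound together, the four questions define the shape of a per-account record. A minimal Python sketch of that shape, where the class, field names, and example school are illustrative rather than SchoolIntel's actual schema:

```python
from dataclasses import dataclass

@dataclass
class AccountRow:
    """One row in an agency account map, one field per client question."""
    school_name: str        # Q1: does this school exist as described?
    identity_source: str    # regulator/accreditor URL backing the identity claim
    curriculum: str         # Q2: curriculum stripe, e.g. "IB DP" or "British"
    curriculum_source: str  # curriculum-body or accreditor URL
    buyer_role: str         # Q3: who is the actual buyer?
    why_now: str            # Q4: the recent change justifying the outreach

    def is_defensible(self) -> bool:
        """A row is defensible only if all four questions have answers."""
        return all([self.school_name, self.identity_source, self.curriculum,
                    self.curriculum_source, self.buyer_role, self.why_now])

row = AccountRow(
    school_name="Fairview International School",   # hypothetical school
    identity_source="https://web.khda.gov.ae/...",
    curriculum="British",
    curriculum_source="https://www.gov.uk/...",
    buyer_role="Head of Digital Learning",
    why_now="",  # no signal yet: the row fails the thirty-second test
)
print(row.is_defensible())  # False until a why-now is filled in
```

The `is_defensible` check encodes the thirty-second test: a row with any unanswered question reads as thin, however accurate the other fields are.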

How agencies source school data today — and what each tool gets wrong

Most education marketing agencies stitch their school data together from four or five tools that were not built for international K–12. The result is a tech stack that produces fast first drafts of a target list — and slow, expensive corrections every time a client questions a row. Understanding the limits of each tool is the first step to building something better.

The five most common inputs are LinkedIn Sales Navigator, ZoomInfo or Apollo, ISC Research subscriptions, scraped or purchased school lists from vendors like EducationDataLists, and a layer of manual research — analysts on the agency side reading school websites, KHDA inspection reports, BSO inspection PDFs, and TES job postings. Each input has a specific failure mode.

Typical agency data stack — coverage of international K–12 needs

Approximate coverage of the four agency-deliverable needs (school identity, curriculum/accreditation truth, role coverage, signal timing) by tool. Higher is better. SchoolIntel is designed to fill the gaps every other tool leaves.

The honest limits of the five most common inputs

These are the failure modes SchoolIntel sees agencies hit most often when prepping for client QBRs:

  • LinkedIn Sales Navigator: great for live role and tenure data, weak for school-segment context. A search for 'IB coordinator UAE' returns hundreds of profiles, but cannot tell you which school is BSO-accredited, which sits inside GEMS, or which had a KHDA rating change last cycle.
  • ZoomInfo / Apollo: B2B contact databases that index corporates well and international K–12 poorly. Coverage of UAE/Qatar/SE Asia school staff is patchy, and 'industry' fields almost always say 'Education' with no curriculum or accreditation detail.
  • ISC Research: strong for global sizing and curriculum splits; agencies use it for the market-brief introduction. Weak for live signals and contact-level freshness — see the ISC Research alternative comparison for the workflow detail.
  • Scraped or purchased lists: vendors like EducationDataLists sell large flat files. Schools-on-the-list answer is fine; everything else (curriculum stripe, role accuracy, recency) is uncited and frequently wrong. Hard to defend in a client meeting.
  • Manual research: an analyst reading school websites, KHDA inspection PDFs, and TES postings will produce the best individual rows. The cost is linear in headcount and stops working the day that analyst leaves. Most agencies cap manual research at the 30–80 schools that anchor the campaign and accept that the rest of the universe is shallow.

Why agencies still ship — and where the work breaks

Agencies ship by combining the inputs above with senior-led judgement. The resulting target list is usually defensible at the top — the 30–80 anchor accounts that an experienced analyst hand-curated. It breaks at the long tail, where the agency relied on Sales Nav filters, scraped CSVs, or assumed a directory entry was current. The break shows up in three places:

  • Wrong curriculum: schools tagged 'British' that are actually MOE-with-British-add-ons, or 'IB' schools that only run DP at sixth form. An IB-curriculum client ends up paying for outreach to the wrong rooms.
  • Stale roles: a head of digital learning who left 18 months ago is still the named contact. Reach rate looks fine; reply rate is awful; the client wonders why.
  • Missing why-now: the deliverable is correct on identity but silent on timing. Every campaign reads the same to recipients because there is no school-specific reason for the message to land in May rather than December.

Client-grade agency deliverables and the data they require

Agencies that retain EdTech clients past the first quarter ship a small, repeatable set of deliverables. The names vary — playbook, account map, pre-event pack, market brief — but the underlying shape is consistent. Each one is a story about a slice of the international school market, told with cited evidence, and ending in a list of schools the client should work next.

The four deliverables below are the ones SchoolIntel sees in roughly 80% of agency engagements. Each one has a specific data dependency that determines whether the deliverable is defensible or thin. Use the sourcing-dependency notes to audit the agency's current source mix before promising the deliverable to the client.

Market brief — the engagement opener

A 10–20 page country or region brief that sets up the rest of the engagement. Curriculum splits, top school groups, regulator cycles, the events that matter, the seasonality of buying. Read by the client's CEO and head of marketing — both unforgiving readers when claims are uncited.

  • Sourcing dependency: regulator + accreditor + association data — KHDA, ADEK, IBO, BSO, COBIS, Council of International Schools. Every paragraph traceable to a public URL.
  • Failure mode: building the brief from a single source without cross-checking the regulator. A figure read once tends to circulate unchecked — and clients with direct school experience will spot it the moment it has drifted since the last refresh.
  • SchoolIntel input: the UAE, Dubai, and Qatar market pages are written exactly as agency briefs would be — every figure cited, every group named, every regulator window mapped.

Account map — the campaign's spine

30–80 named schools with cited reasons. Each row carries: school name, city, curriculum, group ownership, current accreditation / KHDA tier, recent signal, and the source URL behind the signal. This is the artifact reps actually work from, so quality compounds — every row matters.

  • Sourcing dependency: merged regulator + accreditor + group + hiring data per school. This is the hardest deliverable to ship from a generic B2B database alone.
  • Failure mode: spreadsheet-shaped accounts with no why-now column. Reps default to canned templates because there is no school-specific hook.
  • SchoolIntel input: every account in SchoolIntel carries a paragraph explaining why it's in the queue this week, with the source URL and date stamped on the signal. Agencies export this directly into the client deliverable.

Pre-event and post-event packs

Events anchor the international school year. GESS Dubai in autumn, BSME Annual Conference, COBIS Annual Conference, and EARCOS leadership conference each anchor a multi-week agency campaign. Pre-event packs predict who will attend; post-event packs re-rank attendees by signal density.

  • Sourcing dependency: association membership lists + sponsor/exhibitor pages + speaker rosters + association calendars + recent role changes for those organisations.
  • Failure mode: agencies that rebuild the attendee list from scratch every year. The reusable layer (school identity, curriculum, group) is exactly the layer SchoolIntel maintains — agencies should rebuild only the signal layer between events.
  • SchoolIntel input: event pages carry attendee-pattern data, sponsor history, and a freshness-stamped signal column. Pair with the role pages — see the IB coordinator role page — to route booth conversations to the right job titles.

Role-mapped target list

When a client only sells to a specific role — say, IB coordinators or heads of digital learning — the deliverable is a list of those people across the relevant schools, not a list of schools. This is where 80% of agency engagements actually end up, because most EdTech products have a single dominant champion.

  • Sourcing dependency: school staff pages + LinkedIn + hiring boards + email verification (SMTP, 90-day re-check). The accuracy bar is high — wrong roles tank reply rates.
  • Failure mode: scraped lists where 'IB coordinator' is inferred from a job title field that hasn't been updated since 2022. Agencies that sell role-mapped lists without freshness controls regress to the mean within two quarters.
  • SchoolIntel input: staff data is pre-mapped to a role taxonomy across EAL, ELL, IB coordinator, and head of digital learning — with SMTP-verified contact data inside the product.

Region-specific account lists agencies are actually asked for

EdTech clients almost never ask for 'a global list'. They ask for a list of schools in the region they can ship to without burning their CSMs. The five regions below cover the vast majority of agency engagements SchoolIntel sees, and each one has its own canonical sources, signal cycle, and event anchor.

Treat this section as the agency's region-coverage worksheet: for each region the client wants, confirm you have a defensible source mix, a known signal cadence, and at least one event anchor before you commit to a deliverable timeline.

Top agency-asked region

UAE (Dubai + Abu Dhabi)

Source: SchoolIntel agency engagements

Highest signal density

Dubai (KHDA cycle)

Source: KHDA inspection cadence

Largest country market

China (~800 international schools)

Source: ISC Research

UAE — the densest market and the agency entry point

The UAE international schools page covers ~700 schools across Dubai (~220), Abu Dhabi (~210), Sharjah, Ajman, Fujairah, Ras Al Khaimah, and Umm Al Quwain. KHDA regulates Dubai; ADEK regulates Abu Dhabi; SPEA covers Sharjah; the federal MOE covers the northern emirates. GESS Dubai is the must-cover regional event.

UK + British schools overseas

Anchor sources are BSO, COBIS, and Cambridge International. Event anchor is the COBIS Annual Conference. Agencies serving British-curriculum vendors should always anchor account maps in BSO inspection PDFs — they read like consultancy reports and give the campaign immediate credibility.

Asia-Pacific — China, Singapore, Hong Kong, EARCOS region

China alone holds ~800 international schools per ISC sizing; Singapore and Hong Kong add another ~150 between them. EARCOS is the regional association anchor for ~165 member schools across East and Southeast Asia. The leadership conference attracts heads of school, IB coordinators, and heads of digital learning in a single room — the highest-leverage event week in the calendar for English-medium agency campaigns into APAC.

Middle East — KSA, Qatar, Kuwait, Bahrain, Oman

Beyond the UAE, the next-most-asked country is Qatar — see the Qatar international schools page. KSA is rising fast on Vision 2030 spend; Kuwait, Bahrain, and Oman are smaller but high-LTV when the agency's client lands a single multi-school group. BSME (British Schools in the Middle East) is the regional event anchor for British-curriculum coverage.

Americas + Europe — long tail, group-led

Europe and the Americas combine for several thousand international schools, but most agency campaigns into these regions are group-led: Nord Anglia, Cognita, International Schools Partnership (ISP), Inspired Education. The agency's deliverable is rarely 'all 4,000 European schools'; it's '60 schools across these four groups, with the group-HQ buying signal mapped'.

Why source citations are the agency's only durable moat

The most-asked agency QBR question is some version of 'how do you know?'. Clients ask it about every account in the campaign: how do we know this school is IB? How do we know the head of digital learning is new? How do we know KHDA rated them Very Good last cycle, not Good? Agencies that answer with 'our analyst checked it' lose ground every quarter to agencies that answer with a URL and a date.

Citation discipline is also the only thing that lets an agency scale beyond senior-led judgement. With a citation, a junior analyst can produce a row that holds up under client scrutiny. Without one, every row needs a senior to vouch for it, and the engagement is capped by analyst headcount. Agencies that institutionalise citations — every account row carries a source URL, a date, and a signal type — grow margins as they grow accounts.

Citations also matter outside the QBR. Agencies running outbound for EdTech clients are accountable to FTC endorsement guides in the US, the ASA / CAP Code in the UK, and ICO direct-marketing guidance under UK GDPR. When a campaign claim is challenged — by a school, a parent body, or a regulator — the citation trail is the agency's only defence.

What a defensible citation looks like at row level

SchoolIntel attaches three things to every signal it surfaces, and we recommend agencies adopt the same shape for hand-built rows:

  • Source URL: the public page where the claim is verifiable. Regulator > accreditor > association > school site > directory aggregator. Avoid scraped-list citations entirely.
  • Detected-at date: when the claim was last verified. Stamps freshness directly on the row so reps can self-filter stale data.
  • Signal type: what kind of change this is — leadership change, KHDA rating change, IB authorization, group expansion, new curriculum. Lets the client filter the deliverable by campaign theme.
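The three-part shape is small enough to carry as a structure on every row. A sketch, assuming the field names and the 90-day freshness default — both are illustrative choices, and the example URL is deliberately truncated:

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class Citation:
    """The three-part citation attached to each surfaced signal."""
    source_url: str     # public page where the claim is verifiable
    detected_at: date   # when the claim was last verified
    signal_type: str    # e.g. "leadership_change", "khda_rating_change"

    def is_fresh(self, today: date, max_age_days: int = 90) -> bool:
        """Lets reps self-filter stale rows instead of shipping them."""
        return (today - self.detected_at) <= timedelta(days=max_age_days)

c = Citation("https://web.khda.gov.ae/en/...",
             detected_at=date(2026, 2, 1),
             signal_type="khda_rating_change")
print(c.is_fresh(today=date(2026, 4, 1)))   # True: within 90 days
print(c.is_fresh(today=date(2026, 8, 1)))   # False: stale
```

Filtering a deliverable by `signal_type` is what lets the client slice the same account map by campaign theme.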

Build your agency data layer yourself, or use SchoolIntel

Everything described in this guide is technically buildable in-house. Regulators, accreditors, school sites, hiring boards, and association calendars are public. The honest question is whether the agency should spend the time. Most agencies that try to build their own international school data layer end up in one of two failure modes — they ship it, then watch it decay, or they build it well, then lose the engineer who maintains it.

Two paths:

Build it yourself

Realistic effort to assemble an international-schools data layer that holds up to client questioning across one major region:

  • Source inventory: 1–2 weeks to map ~12 sources (regulators, accreditors, associations, hiring boards, group sites), decide scrape vs API, document refresh cadence.
  • Normalisation: 2–4 weeks to dedupe a few hundred to a few thousand schools across spelling variants, multiple campuses, dual curriculum tags, and group naming. This is the largest hidden cost.
  • Role coverage: 2 weeks to scrape staff lists, infer titles to a buyer-role taxonomy, and verify emails (SMTP + 90-day re-check). Re-verification is a permanent operational cost.
  • Signal layer: ongoing. Weekly cron jobs against KHDA, ADEK, BSO, IBO, TES, TIE, group press releases, and association calendars. Engineering owns this in perpetuity.
  • Honest timeline: 1 FTE for ~10–14 weeks to build a single-region layer, then 0.5 FTE forever to maintain. Stops working the day that engineer leaves. Doubles for each additional region.
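The normalisation step, the largest hidden cost above, is worth a concrete sketch. A crude merge key of the kind a build-it-yourself pipeline starts from, assuming school names arrive from multiple sources with spelling variants; the school name and the stopword list are hypothetical, and real pipelines also need campus and group disambiguation:

```python
import re
import unicodedata

# Tokens that vary across sources but do not distinguish schools.
STOPWORDS = {"the", "school", "international", "academy", "of", "llc", "fz"}

def dedupe_key(name: str, city: str) -> str:
    """Collapse a school name to a city-scoped merge key.
    Strips accents, punctuation, case, and non-distinguishing tokens."""
    text = unicodedata.normalize("NFKD", name).encode("ascii", "ignore").decode()
    tokens = re.findall(r"[a-z0-9]+", text.lower())
    core = [t for t in tokens if t not in STOPWORDS]
    return f"{city.lower()}:{'-'.join(core)}"

# Two source spellings of the same (hypothetical) school collapse to one key:
a = dedupe_key("The Fairview International School (LLC)", "Dubai")
b = dedupe_key("Fairview Academy", "Dubai")
print(a)        # dubai:fairview
print(a == b)   # True
```

Even this toy version shows why the step eats weeks: every stopword choice is a judgement call, and a key that is too aggressive merges genuinely different campuses.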

Use SchoolIntel

What an agency gets without building any of the above:

  • Same-day region coverage: filter by country, city, curriculum, group, and signal across UAE, Dubai, Qatar, and additional regions on the roadmap. Pull a sourced 30–80 account map in one session.
  • Live source consensus: every school carries a confidence score across the 8+ sources we read. Agencies see which schools we trust and why — and can cite our citations directly in client deliverables.
  • Role coverage built in: staff lists pre-mapped to a buying-role taxonomy. SMTP-verified contact data inside the product, with 90-day re-verification.
  • Weekly re-scored queue: we re-read sources weekly. The agency's account list reorders itself; the agency doesn't rebuild it between client check-ins.
  • Cited reasons per account: every recommended target carries a paragraph explaining why now — backed by a source URL, date, and signal type that the agency can paste straight into the client deliverable.
  • Event packs ready to ship: pre-built attendee-pattern intelligence for GESS Dubai, BSME, COBIS, and EARCOS. Pair with role pages for booth-conversation routing.

Frequently asked questions

Questions this page answers

What does an EdTech client actually want from an education marketing agency's data layer?

A defensible reason for each school to be in the campaign this quarter. That requires four things bound together: school identity (cited from a regulator or accreditor), curriculum and accreditation truth (cited from IBO, BSO, COBIS, NEASC, or Cambridge), the actual buying committee mapped to the client's product, and a recent change that justifies the timing. Most agency engagements stall when the data answers the first two but is silent on the last two. SchoolIntel's international school market intelligence hub walks through the full model.

How do agencies typically source international school data today?

Five inputs, in roughly this order: LinkedIn Sales Navigator for live role data; ZoomInfo or Apollo for bulk contact records; ISC Research for sizing and curriculum splits; scraped or purchased lists from vendors like EducationDataLists; and analyst-led manual research against school sites, regulator portals, and hiring boards. Each input has a different failure mode, covered in the agency-tech-stack section above.

Why does ZoomInfo or Apollo coverage of international schools feel thin?

Both platforms are excellent at indexing corporate B2B and patchy on international K–12. The 'industry' field collapses to 'Education', the 'company size' field is often wrong for school groups (a single record vs ~70 schools at GEMS), and curriculum and accreditation are not modelled at all. Agencies that lean on them inherit those gaps in every deliverable. Pair them with regulator-grade sources like KHDA or BSO — or use SchoolIntel as the merge layer.

When should an agency buy ISC Research vs use SchoolIntel?

Buy ISC Research when the agency needs global market sizing, curriculum-share charts, and country-level student counts for slide decks and proposals — that's where ISC's data is strong and where the brand carries weight. Use SchoolIntel for the rest of the engagement: live account maps, role coverage, signal timing, event packs, and the weekly re-ranked queue. The two are complementary inputs, not substitutes; the side-by-side workflow is laid out on the ISC Research alternative comparison page.

How does SchoolIntel actually verify school staff emails?

Three layers. First, domain-level discovery against the school's verified site. Second, SMTP-handshake verification — actively probing the mail server to confirm the address resolves before we mark it deliverable. Third, a 90-day re-verification cycle, so the agency's deliverable doesn't decay between client check-ins. Agencies that hand-build role-mapped lists usually skip layer three and regress to the mean within two quarters.
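The second layer can be sketched with standard-library Python. This is a generic illustration of an SMTP RCPT probe, not SchoolIntel's implementation; the probe host is hypothetical, and the live handshake requires network access to the school's mail server:

```python
import smtplib

def classify_rcpt_code(code: int) -> str:
    """Map an SMTP RCPT TO reply code to a verification status.
    2xx: server accepts the mailbox; 5xx: permanent reject; else ambiguous."""
    if 200 <= code < 300:
        return "deliverable"
    if 500 <= code < 600:
        return "undeliverable"
    return "unknown"   # greylisting, catch-all servers, temporary failures

def smtp_probe(mx_host: str, address: str) -> str:
    """Live handshake (needs network): HELO, MAIL FROM, then RCPT TO.
    The session never reaches DATA, so no mail is actually sent."""
    with smtplib.SMTP(mx_host, 25, timeout=10) as server:
        server.helo("verifier.example.com")          # hypothetical probe host
        server.mail("probe@verifier.example.com")
        code, _ = server.rcpt(address)
    return classify_rcpt_code(code)

print(classify_rcpt_code(250))  # deliverable
print(classify_rcpt_code(550))  # undeliverable
```

The "unknown" bucket is why the 90-day re-check matters: catch-all servers accept every RCPT, so a single probe can never fully settle deliverability.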

What sources should an agency cite for a UAE-focused EdTech campaign?

In priority order: KHDA portal and DSIB inspection reports for Dubai; ADEK for Abu Dhabi; IBO Find an IB School for IB curriculum truth; BSO inspection reports and COBIS member search for British curriculum; GEMS group and Taaleem for group ownership; TES Dubai jobs for hiring signals. Anchoring an account map in those URLs is what makes the deliverable feel like research, not a list.

How should an agency build a pre-event pack for GESS Dubai or BSME?

Three layers, ~30 days before the event. First, build the universe from the event's published exhibitor and speaker pages — see GESS Dubai and the BSME annual conference. Second, enrich each likely-attending school with curriculum, accreditation, and group ownership from the regulator/accreditor sources. Third, layer the most recent signal — KHDA rating, hiring post, IB authorization window — so booth conversations are specific. SchoolIntel's GESS event page and BSME event page follow exactly this shape.

What's the biggest mistake agencies make when handing off school data to an EdTech client?

Shipping a deliverable with no why-now column. Identity-and-curriculum data is the cheap part — agencies that stop there hand the client a directory, and the client's reps default to canned templates. The expensive part is the recent change that justifies the timing of the message. SchoolIntel's account view always carries a why-now paragraph per account, dated and source-cited, so reps can quote the signal verbatim in the first line of outreach. Agencies that adopt the same shape on hand-built rows close the gap. See the marketing to international schools guide for the complete pattern.

Next step

Want this as a live ranked list?

SchoolIntel can turn this page into a sourced target market with account reasons, role coverage, and outreach angles your team can use this week.