ARC User Manual

Getting Started

Logging In

  1. Open ARC in your browser.
  2. Click Sign in with Google.
  3. Authenticate with your dejan.com.au Google account.

You will be redirected to the Properties list after login.

Navigation

  • Properties list — your home screen, showing all properties you have access to.
  • Property detail — click any property to open it. The detail view contains six tabs:
      • Associations — discover which brands AI models associate with your entities and queries.
      • Relevance — measure how relevant AI models consider a brand to a query or entity.
      • Citations — extract and analyze citations from grounded AI responses.
      • Optimizer — iteratively optimize snippets for better ranking.
      • Export — download data as CSV.
      • Settings — manage entities, brands, queries, categories, tags, locations, models, and members.

To log out, click Logout in the header.


Adding a New Property

A property represents a domain you want to analyze.

  1. From the Properties list, enter a Domain (e.g., example.com).
  2. Optionally enter a Display Name for easier identification.
  3. Click Add Property.

You are automatically added as the property owner. The property appears in your list with columns for Entities, Runs, Mentions, Citations, and Last Run.

To delete a property, click the Delete button next to it.


Setting Up Your Property

Before running probes, you need to populate your property with data. Go to the Settings tab.

Brands

Brands are the companies, products, or names you want to track in AI responses.

Add a single brand:

  1. Enter a Brand name.
  2. Select an ownership status: Owned, Competitor, or Unclassified.
  3. Click Add Brand.

Bulk add: Expand the Bulk Add section, paste brand names one per line, and submit.

Managing brands:

  • Use the Search bar to filter brands by name.
  • Filter by Category using the dropdown.
  • Sort by clicking column headers (E2B count, Q2B count, Avg Rank, etc.).
  • Use checkboxes to select multiple brands, then apply bulk actions: Set Owned, Set Competitor, Set Unclassified, or tag them.

Marking brands as Owned is important — relevance probes only run against owned brands, and results throughout the app highlight owned brands.

Entities

Entities are the topics, products, or concepts associated with your property (e.g., “running shoes”, “trail running”).

Add a single entity:

  1. Enter an Entity name.
  2. Optionally select a Category.
  3. Click Add Entity.

Bulk add: Paste entity names one per line.

Generate via AI: Click Generate via AI and ARC will use Gemini to visit your property’s URL and automatically extract relevant entities.

Categories

Categories group entities and queries into hierarchical themes.

  1. Enter a Category name.
  2. Click Add Category.

Categories can be renamed inline and show entity and query counts. You can also bulk add categories one per line.

Queries

Queries are the search terms used to test brand visibility in AI responses.

Queries are managed through five sub-tabs:

  1. Queries List — view and manage existing queries with filters for Entity, Category, Tag, and Source.
  2. Manual Entry — enter a single query with optional Entity, Category, and Tags.
  3. Enter a List — paste queries one per line with optional Entity/Category/Tags applied to all.
  4. Import via CSV — upload a CSV file. ARC detects headers and lets you map columns to Query text, Entity, Category, or Tag. A preview shows the first 5 rows before import.
  5. Generate via AI — provide a URL (defaults to your property URL), set a count (1-500), and optionally add special instructions. ARC uses Gemini to generate relevant queries.
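For the CSV import path, a file like the one below works well. The header names here are illustrative, not required — ARC detects whatever headers your file has and lets you map them to Query text, Entity, Category, or Tag in the preview step. A minimal sketch of the expected shape:

```python
import csv
import io

# Hypothetical bulk-import CSV. Column names are assumptions for
# illustration; ARC maps your actual headers during import.
sample = """query,entity,category,tag
best trail running shoes,running shoes,Footwear,priority
lightweight hiking boots,hiking boots,Footwear,
"""

rows = list(csv.DictReader(io.StringIO(sample)))
for row in rows:
    print(row["query"], "->", row["entity"])
```

Leaving a cell empty (as in the second row's tag) simply imports the query without that attribute.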

Tags

Tags are free-form labels you can apply to entities, brands, queries, and categories.

  1. Enter a Tag name.
  2. Click Add Tag.

Tags can be assigned inline from any item’s row across the Settings sub-tabs.

Locations

Locations add geographic context to probes (e.g., “Australia”, “United States”).

  1. Enter a Location name.
  2. Click Add Location.

When a location is selected during a probe run, it is appended to the system prompt to provide geographic context.
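Conceptually, the location works like the sketch below. The template wording is an assumption — the manual only states that the location is appended to the system prompt for geographic context; ARC's actual phrasing may differ.

```python
from typing import Optional

def build_system_prompt(base_prompt: str, location: Optional[str]) -> str:
    # Append geographic context only when a location was selected
    # (hypothetical template, not ARC's real prompt text).
    if not location:
        return base_prompt
    return f"{base_prompt}\nAnswer from the perspective of a user in {location}."

prompt = build_system_prompt("List brands relevant to this query.", "Australia")
print(prompt)
```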

Citation Prompts

Citation prompts are custom prompts used specifically for citation mining.

  1. Enter a Prompt text.
  2. Click Add Citation Prompt.

You can also bulk add prompts one per line or auto-generate them from your entities.

Models

ARC supports multiple LLM providers and models (Gemini, GPT, Nova, Claude).

  • Use the checkboxes to activate or deactivate models for your property.
  • Only active models are used during probe runs.
  • Pricing information is shown for each model and feeds into the cost calculator on the Relevance tab.

Running Association Probes

The Associations tab discovers which brands AI models associate with your entities and queries.

Probe Types

  • E2B (Entity-to-Brand) — asks “What brands associate with this entity?” Requires entities.
  • B2E (Brand-to-Entity) — asks “What entities associate with this brand?” Requires brands.
  • Q2B (Query-to-Brand) — asks “What brands are relevant for this query?” Requires queries.
  • All — runs E2B and B2E combined. Requires entities and brands.

Running a Probe

  1. Select a Probe type from the dropdown.
  2. Optionally select a Location.
  3. Click Run.
  4. Monitor progress in real time with per-model progress bars.

Reading Results

QAS (Query Association Score) — shown for Q2B probes:

  • Rank, Brand name, Score bar, Mention count, QAS percentage.
  • Owned brands are highlighted in bold.
  • Per-model breakdown shows how each model ranks brands.

E2B Aggregate — brand mentions across all entities:

  • Brand name, Score bar, Mention count, Share percentage.
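The Share column is presumably each brand's mention count divided by the total mentions across all brands in the run; QAS percentages read the same way for a single query. A minimal sketch with made-up counts:

```python
# Illustrative mention counts; brand names are placeholders.
mentions = {"Acme Shoes": 30, "Rival Co": 15, "Other": 5}

total = sum(mentions.values())
share = {brand: round(100 * count / total, 1) for brand, count in mentions.items()}
print(share)  # {'Acme Shoes': 60.0, 'Rival Co': 30.0, 'Other': 10.0}
```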

B2E Aggregate — entity/item mentions per brand:

  • Brand name, Unique items count, Total mentions.

Top Queries for Owned Brands — the queries that most frequently surface your brands.

Use the Run filter dropdown to compare results across different runs.


Measuring Relevance

The Relevance tab measures how relevant AI models consider a brand-query or brand-entity pair, using repeated independent samples for statistical confidence.

Running a Relevance Probe

  1. Select a Source type: Queries or Entities.
  2. Select an Owned brand to test (only owned brands are available).
  3. Optionally select a Location.
  4. Set N per pair (1-100) — the number of independent samples per brand-source pair. Higher values give more statistical confidence.
  5. Review the cost estimate shown in real time (API calls and estimated cost).
  6. Click Run.
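The real-time estimate presumably multiplies the number of brand-source pairs by N and by the number of active models, then applies each model's per-call price from Settings > Models. A back-of-envelope sketch, with made-up prices:

```python
def estimate_cost(num_sources, n_per_pair, price_per_call):
    # One API call per (source, sample, model) -- assumed formula,
    # not ARC's documented pricing logic.
    calls_per_model = num_sources * n_per_pair
    total_calls = calls_per_model * len(price_per_call)
    total_cost = sum(calls_per_model * p for p in price_per_call.values())
    return total_calls, total_cost

calls, cost = estimate_cost(
    num_sources=50,    # queries or entities paired with the owned brand
    n_per_pair=10,     # independent samples per pair
    price_per_call={"gemini": 0.0005, "gpt": 0.0010},  # hypothetical $/call
)
print(calls, round(cost, 2))  # 1000 0.75
```

Raising N per pair scales the call count (and cost) linearly, which is why the cost estimate updates as you adjust it.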

Reading Results

QRS (Query Relevance Score) — percentage of “yes” responses:

  • Brand, Relevance bar, Yes count, Total probes, QRS percentage.
  • Per-model breakdown.
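Since QRS is the percentage of “yes” responses, the aggregation can be sketched as below; the per-model breakdown works the same way with (brand, model) keys. Brand and model names are placeholders.

```python
from collections import defaultdict

# Hypothetical probe results: (brand, model, answered_yes).
probes = [
    ("Acme Shoes", "gemini", True),
    ("Acme Shoes", "gemini", True),
    ("Acme Shoes", "gpt", False),
    ("Acme Shoes", "gpt", True),
]

yes = defaultdict(int)
total = defaultdict(int)
for brand, model, answered_yes in probes:
    total[brand] += 1
    yes[brand] += answered_yes  # True counts as 1

qrs = {brand: 100 * yes[brand] / total[brand] for brand in total}
print(qrs)  # {'Acme Shoes': 75.0}
```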

Items Not Relevant — sources that scored 0%, indicating potential visibility gaps.


Citation Mining

The Citations tab extracts and analyzes citations from grounded AI responses to see which domains and URLs are being cited.

Setting Up

Before mining, add citation prompts via Settings > Citation Prompts or use your existing queries as source material.

Running Citation Mining

  1. Select a Source: Citation Prompts, Queries, or Both.
  2. Optionally select a Location.
  3. Click Run Citation Mining.

Reading Results

Summary cards at the top show:

  • Responses — total AI responses collected.
  • Citations — total citations extracted.
  • Unique Domains — distinct domains cited.
  • Search Queries — grounding queries used.

Detailed breakdowns:

  • Domain Breakdown — domain name, total count, provider-specific counts, owned status.
  • Brand Mentions in Responses — brand name, total mentions, owned status.
  • Top Cited Sources — URLs ranked by citation count with title, domain, and ownership.
  • Search Queries (collapsible) — the grounding queries used, with frequency.

Snippet Optimization

The Optimizer tab iteratively improves snippets to achieve better ranking in AI responses.

Starting a New Run

  1. Discover — enter a query and target URL or item.
  2. Resolve URLs — map items to their URLs.
  3. Start — select a model and set the target item.
  4. Monitor — watch live progress with a ranking chart updating every few seconds.

How It Works

ARC runs an iterative cycle for each attempt:

  1. Hypothesis — AI generates ideas for improving the snippet.
  2. Edit — AI produces a revised snippet based on the hypothesis.
  3. Rank — AI ranks all snippets including the new version.

After multiple attempts, ARC generates a storyteller analysis summarizing What Worked, What Didn’t Work, and Key Insights.
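The cycle above can be sketched as a loop, with stub functions standing in for the real model calls — all three stubs are placeholders for illustration, not ARC's API:

```python
def propose_hypothesis(snippet):
    return "mention the target query terms earlier"   # stub for the AI hypothesis step

def edit_snippet(snippet, hypothesis):
    return snippet + " (revised)"                     # stub for the AI edit step

def rank(snippets, target):
    return sorted(snippets).index(target) + 1         # stub: 1-based rank of the target

def optimize(snippet, competitors, attempts=3):
    best_rank, best_snippet = None, snippet
    for _ in range(attempts):
        hypothesis = propose_hypothesis(snippet)       # 1. Hypothesis
        snippet = edit_snippet(snippet, hypothesis)    # 2. Edit
        r = rank(competitors + [snippet], snippet)     # 3. Rank
        if best_rank is None or r < best_rank:
            best_rank, best_snippet = r, snippet
    return best_rank, best_snippet
```

Tracking the best rank across attempts is what feeds the Baseline rank / Best rank columns in the run list; the storyteller analysis is generated after the loop finishes.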

Viewing Results

The list view shows all previous runs with:

  • Query, Target URL, Baseline rank, Best rank, Status.
  • Click View Details for per-attempt analysis with a visual rank chart.

You can provide human feedback as constraints to guide the next round of optimization.


Exporting Data

The Export tab lets you download data as CSV files.

Available Exports

Associations:

  • QAS (Query Association Scores)
  • E2B (Entity-to-Brand associations)
  • Q2B (Query-to-Brand associations)
  • B2E (Brand-to-Entity associations)

Citations:

  • Citation domains
  • Citation URLs
  • Citation full data

Configuration:

  • Queries (with associated entities and categories)
  • Entities (with categories)
  • Brands (with counts and ownership status)

Click Export CSV for the dataset you need.


Sharing with Other Users

ARC uses a membership system to share properties with other users.

Adding Members

  1. Go to Settings > Members.
  2. Enter the user’s Email address (must be a dejan.com.au account).
  3. Select a Role:
       • Owner — full access. Can add/remove members, delete the property, and manage all data.
       • Editor — can manage data, run probes, and configure settings.
       • Viewer — read-only access to results and exports.
  4. Click Add Member.

Managing Members

  • The Members sub-tab shows all members with their Name, Email, and Role.
  • Owners can Remove members or change roles.
  • The property creator is automatically the first owner.

What Members See

All members with access to a property share the same data:

  • Entities, brands, queries, categories, tags, and locations.
  • All probe runs and their results.
  • Citation mining data.
  • Optimizer runs and analyses.
  • Exported datasets.

Sharing Workflow

  1. Create a property and set it up with brands, entities, and queries.
  2. Add team members via Settings > Members with appropriate roles.
  3. Run probes and analyses — results are immediately visible to all members.
  4. Export reports and share CSV files with stakeholders who don’t have ARC access.

Best Practices

  • Use descriptive Display Names for properties so members can identify them quickly.
  • Mark brand ownership accurately — it determines what appears in relevance probes and result highlighting.
  • Coordinate on entity and query naming to avoid duplicates.
  • Use tags to organize and filter large datasets across the team.
  • Use locations consistently so probe results are comparable across runs.