Dijji — Intelligence · Engagement · Defense
Help center

Everything you can do with Dijji.

A practical walkthrough of every feature — what it does, how to set it up, and what good looks like. No fluff. Pick a section from the sidebar or scroll start to finish.

01 · Getting started

Tracker installed → first data → ready in 5 minutes

Step 1 · Add a site or app

From your dashboard, click Add site / app. Give it a name + the domain (e.g. example.com). When you save, you get a unique site_key like ws_a647b7f83d…. That key identifies the site or app to the tracker / SDK.

Step 2 · Pick a vertical (recommended)

Dijji asks "what kind of site or app is this?" The picker is grouped into five categories:

  • Sell something — SaaS · E-commerce / D2C · Marketplace · Agency / Services
  • Engage an audience — Creator / Personal brand · Blog · News · Community
  • Deliver a service — Edtech / Courses · Fintech · Internal tool / B2B
  • Convert a campaign — Landing page · App marketing · Nonprofit / NGO
  • Other — Portfolio · Something else (skips the starter pack)

Picking one auto-seeds opinionated goals + funnels + triggers tuned to that vertical. Everything ships paused — the seeds appear in your dashboard but don't affect visitors until you flip the toggle. The vertical you pick also tells the AI brief the metrics that matter for your kind of business (a SaaS owner reads about trial-to-paid; a creator reads about YT-attributed sessions).

Step 3 · Install the tracker

Visit /install for platform-specific snippets. The web tracker is one line:

Web
<script async src="https://dijji.com/d.js" data-site="ws_…"></script>

Drop it in <head>. Or use the Auto-Inject tools on /install to install via WordPress / Shopify / GTM without touching code.

Apps: /install/android for Kotlin, /install/ios for Swift, /install/flutter for Flutter — all three are live. See section 10 below for the snippets.

Step 4 · See your first data

Once the tracker fires for the first time, your Stats page starts populating. The AI Brief needs ~5 pageviews before it generates its first summary. The 3D globe shows live pulses as visitors arrive.

Tip

No data showing up? Append ?__dijji_replay=1 to any URL on your own site to force a tracker test, or check Site Settings → "Test replay right now."

02 · The stats dashboard

Stats → overview · Every visible card explained

The 3D globe

Cesium-rendered, real geography. Dots are visitors plotted by IP-resolved lat/lon. Modes:

  • Auto rotate — slow spin, ambient mode for a wallboard
  • Interactive — drag to rotate, pinch to zoom, hover to see city + page
  • Heat — swaps dots for blurry blobs, useful when you have hundreds of points
  • Play button — flies through the last 15 unique visits with a typewriter narration

Click any dot to open the live-users modal. Click the LIVE pill to see who's on the site right now.

AI brief

A 4–6-sentence, GPT-4o-written summary of what happened yesterday. It refreshes once a day (cached); the manual Regenerate button on the dashboard forces a new one. When enough data exists, it compares day-of-week (this Monday vs last Monday) — weekends naturally run lower than weekdays, so a plain day-over-day comparison is misleading. It flags anomalies, source concentration, and meaningful changes. The brief speaks the vocabulary of your vertical (set during site creation) — SaaS gets trial-to-paid framing, creators get YT/IG-attribution framing, e-com gets cart-abandonment framing.

Anomaly banner

Eight metrics are watched, day-of-week-aware: yesterday's value is compared against the same weekday from the prior 4 weeks (falling back to a 7-day rolling baseline when day-of-week samples are sparse). Four severity tiers drive the banner colour:

  • CRITICAL — |z| ≥ 4σ. Red. Page someone.
  • HIGH — |z| 3–4σ. Orange. Red card.
  • WARNING — |z| 2–3σ. Yellow. Worth investigating.
  • INFO — |z| 1.5–2σ. Sky blue. FYI.

Metrics watched: pageviews, unique visitors, avg engagement seconds, JS error rate, bounce rate, top-source share (acquisition mix shift), top-country share (geo mix shift), and LCP p75 (performance regression). Each anomaly also carries a REPEAT chip if the same direction fired the day before, and a LOW CONF chip when the underlying series is too noisy (high σ/μ). Confident anomalies feed the AI brief's lede.
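The tiers above can be sketched as a small z-score classifier. This is illustrative — the function and field names are ours, not Dijji's internals — but the thresholds and the LOW CONF idea (noisy series, high σ/μ) come straight from this doc:

```javascript
// Classify yesterday's value against same-weekday baseline samples.
function classifyAnomaly(yesterday, baseline) {
  const n = baseline.length;
  if (n < 2) return null;                                   // too little history
  const mean = baseline.reduce((a, b) => a + b, 0) / n;
  const sigma = Math.sqrt(
    baseline.reduce((a, b) => a + (b - mean) ** 2, 0) / n
  );
  if (sigma === 0) return null;                             // flat series, no z
  const z = (yesterday - mean) / sigma;
  const az = Math.abs(z);
  const tier =
    az >= 4 ? 'CRITICAL' :
    az >= 3 ? 'HIGH' :
    az >= 2 ? 'WARNING' :
    az >= 1.5 ? 'INFO' : null;
  // LOW CONF when the baseline itself is noisy (high sigma/mean);
  // the 0.5 cutoff here is an assumed value for illustration.
  return { tier, z, lowConf: mean !== 0 && sigma / mean > 0.5 };
}
```

So a metric that sat around 100 on the prior four Mondays and jumps to 130 lands in CRITICAL, while a drift to 101 produces no banner at all.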

Ask Anything

Plain-English Q&A over your analytics. Type any question and Dijji compiles it into a safe parameterised query, runs it, and writes a 2-4 sentence answer. Examples:

  • "How many visitors from Mumbai last week?"
  • "Top sources last 7 days?"
  • "Bounce rate on /pricing this month?"
  • "Did pageviews go up vs the prior week?"

Dimensions available: page · country · city · source · device · time period (today / yesterday / last 24h / 7d / 30d / prior 7d / prior 30d). Not yet: cohort segments, funnels, custom event filters — coming soon.

KPI cards

4-up: Total views, Unique IPs, Last 24h, Last 7 days. The range toggle (30D / 90D / ALL) at the top of the Daily Views chart switches the window for all aggregate cards.

Top Pages

Ranked by pageviews in the selected window. Each row shows views, unique visitors, a 14-day sparkline, and last-seen timestamp. Click any row to drill into that page's heatmap, scroll-depth, top sources for that page, and recent visitors.

Performance · Web Vitals

Real-user p75 numbers from your actual visitors (not synthetic Lighthouse). LCP / INP / CLS / FCP / TTFB color-coded green / amber / red against Google's CrUX thresholds. Sample count shown — we mark a metric "good" only after at least 25 samples.
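A p75 with that 25-sample minimum can be sketched in a few lines (illustrative, not Dijji's code — only the 25-sample rule comes from this doc):

```javascript
// Real-user p75: return null below the sample minimum so a metric
// is never graded "good" on thin data.
function p75(samples) {
  if (samples.length < 25) return null;
  const sorted = [...samples].sort((a, b) => a - b);
  return sorted[Math.floor(0.75 * (sorted.length - 1))];
}
```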

Engagement strip

Avg time, avg scroll depth, bounce rate, engaged %, rage clicks. A high rage-click number paired with a flat scroll suggests a broken interactive element — drill into the rage targets list (sidebar) to find the specific selector.

Top countries / cities / sources / search keywords

Country and city resolved from IP (ip-api). Source is referrer host, bucketed (Google, Twitter, HN, etc.). Search keywords come from referrer URL parameters where the search engine still leaks them (Bing / DuckDuckGo / Yandex — Google strips them).

Live feed

The most recent visitors, grouped by IP within a 30-min window. Click a row to expand the chronological page sequence. The SESSION pill on each row opens the AI-narrated session timeline for that visitor.

Errors

JS errors captured by the tracker, with severity classification: noise (cross-origin redacted, ResizeObserver, aborted fetches), minor, major (TypeError, ReferenceError). Only major errors enter the AI summaries — noise stays in the table for diagnostics.

AI Explain (per error)

Click any JS error row to expand a per-error AI explainer. "Explain this error" sends the message, source, line number, and stack to gpt-4o-mini grounded in your site description. Returns: a plain-English explanation, an impact rating (likely / possible / unlikely / unknown), where to look, and a copy-paste-ready message you can forward to a developer. Cached per error signature so repeat clicks are free.

02b · Content performance

Chartbeat-style ranking · What readers actually finish, not just open

The Content tab in the per-site subnav shows which articles, posts, reports, or pages your visitors are actually reading — ranked by engagement quality, not pageviews. Pageviews tell you what got opened; this surface tells you what got read.

Auto-detection — no configuration needed

We figure out which paths on your site are "content" via two signals:

  • Path prefix whitelist — /blog, /articles, /posts, /news, /insights, /resources, /stories, /case-studies, /whitepaper, /report(s), /p/, /category/, /tag/, plus year-prefixed paths like /2026/. If your CMS uses any of these, your content is detected automatically.
  • Engagement fallback — pages outside the whitelist still surface if avg active time ≥ 25 s AND avg scroll ≥ 35% AND ≥ 3 distinct visitors. So a SaaS feature page that genuinely engages still appears even without a /blog prefix.
  • Always-excluded utility paths — /cart, /checkout, /login, /register, /api/*, /admin, /auth, /privacy, /terms, /contact, /about, /pricing never qualify as content even with strong engagement.

Smart fallback: if we detect fewer than 3 content pieces (i.e. your site doesn't have a content section), the page auto-flips to "All pages" mode with a small banner explaining why. You can force "Content only" via the toggle if you want to see the empty state explicitly.
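The detection logic above can be sketched as a predicate. The prefix and exclusion lists are abbreviated here; the three fallback thresholds are the ones stated in this doc:

```javascript
// Abbreviated lists — see the full whitelist/exclusion lists above.
const CONTENT_PREFIXES = ['/blog', '/articles', '/posts', '/news', '/p/'];
const UTILITY_PATHS = ['/cart', '/checkout', '/login', '/pricing', '/about'];

function isContent(path, stats) {
  if (UTILITY_PATHS.some(p => path.startsWith(p))) return false;    // always excluded
  if (CONTENT_PREFIXES.some(p => path.startsWith(p))) return true;  // whitelist hit
  // Engagement fallback: all three thresholds must hold.
  return stats.avgActiveSeconds >= 25 &&
         stats.avgScrollPct >= 35 &&
         stats.visitors >= 3;
}
```

This is why a /features page with real engagement surfaces, while /pricing never does, no matter how engaged its visitors are.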

Reading the page

  • LIVE strip (only when readers are active right now) — pulsing green dot, "X readers across N pieces", top 3 currently-being-read links.
  • Rising (only when something's taking off ≥1.5×) — cards showing piece + velocity (e.g. 3.2× vs typical hourly rate). The Chartbeat killer view: "this is taking off RIGHT NOW."
  • Top content — ranked by engagement quality, scored as log(views) × (engaged_pct/100 + 0.5), so the homepage doesn't dominate just because it's high-volume. Each row shows: an approximate title (the last URL slug, prettified), a live-readers pill, "via Source" (Google / IG / Direct), "→ Recirculation" (where they go next), and a quality flag.
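The quality blend quoted above is small enough to write out — log(views) scaled by how engaged the audience was, so raw volume alone can't win:

```javascript
// The engagement-quality score from the ranking above.
function contentQualityScore(views, engagedPct) {
  return Math.log(views) * (engagedPct / 100 + 0.5);
}
```

Worked example: a 2,000-view piece with 70% engaged readers outranks a 10,000-view piece with 5% engaged — exactly the "read, not just opened" ordering this page is built around.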

Quality flags — the real Chartbeat insight

  • Read-through — ≥60% scroll AND ≥30s. Genuine read.
  • Skim — decent scroll but short time. Glance, didn't engage.
  • Hit & bounce — views without scroll. Clickbait or wrong audience.

Pageviews alone tell you what got opened; these flags tell you what got read. A blog post with 10,000 pageviews and "Hit & bounce" is a worse outcome than 2,000 pageviews and "Read-through."

Range toggle + Mode toggle

Range: 7D / 30D / 90D / ALL. Mode: Content only (the smart filter) ↔ All pages (bypass the filter). Mode persists across range clicks via the URL — ?content=1 or ?all=1.

02c · Audience comparison

Pie + click-to-drill · Geographic · Device · Source · Browser · OS

The Audience page (link in the bottom of Top Cities on /stats, or directly at /stats/audience) splits your traffic by a chosen dimension and overlays engagement quality on each slice. Answers questions like "Are visitors from Mumbai engaging differently from those in Bengaluru?" or "Do iPhone users bounce more than Mac users?"

Six dimensions to slice by

  • Country — geo-IP resolved per pageview.
  • City — same source.
  • Device — granular: iPhone, iPad, iPod Touch, Android Phone, Android Tablet, Mac, Windows PC, Linux PC, ChromeOS. Plus a distinct Mobile App bucket pulled from dijji_app_users (your installed iOS/Android SDK users — separate from mobile-web).
  • Source — referrer host bucketed (Organic / Paid / Social / Direct / Referral).
  • Browser — Chrome / Safari / Firefox / Edge / Opera / Other.
  • OS — Windows / macOS / iOS / Android / Linux / Other (rollup of the Device dimension).

Caveat for Device / Browser / OS: derived from web user-agents only. Mobile-SDK installs are tracked separately and surface only in the Device dimension's Mobile App bucket.

Reading the page

  • KPI strip — visitors / pageviews / avg time / avg scroll / bounce % / rage % across all slices in the range.
  • Pie chart — top 10 slices + an "Other" bucket so the chart stays readable. Center hole shows total visitors.
  • Slice detail panel — click any slice (or legend row) to drill in. Shows mini KPIs for that slice + sub-pies for devices and sources within the slice + top 5 pages within the slice.
  • Compare mode — shift+click a second slice to compare side-by-side with delta arrows (green when better, red when worse — bounce/rage are inverse-coded so lower = better).
  • Sortable table at the bottom — every slice, every metric, click headers to sort. Engagement cells are color-coded green/amber/red against quality thresholds.

Range toggle

24H / 7D / 30D / 90D. The caption under the toggle shows "Showing X visitors from last Y days" so you can see the filter is being applied even when ranges return identical numbers (which happens on young sites where there's no older traffic to differ).

How metrics are calculated (so the numbers can be trusted)

  • Visitors — distinct COALESCE(visitor_id, viewer_ip). visitor_id is the localStorage-backed identity, IP is the fallback for older rows.
  • Avg time — true mean: SUM of active seconds across all pages / count of tracked visits. NOT AVG-of-AVG (which under-weights multi-page sessions).
  • Bounce % — visits with engagement under 5 seconds, divided by visits we have engagement data for. Visits with no engagement row are excluded entirely (not counted as bounces) — we don't have data, we don't pretend.
  • Rage % — visits with ≥ 3 rage clicks within 1.5 s of each other / tracked visits. Same denominator as bounce.
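The denominator rules above — true mean for avg time, bounces computed only over visits that have an engagement row — can be sketched like this (field names are ours):

```javascript
// visits: [{ activeSeconds: number | null }]  (null = no engagement row)
function engagementStats(visits) {
  const tracked = visits.filter(v => v.activeSeconds != null);
  if (tracked.length === 0) return { avgTime: null, bouncePct: null };
  const total = tracked.reduce((s, v) => s + v.activeSeconds, 0);
  return {
    avgTime: total / tracked.length,   // true mean, NOT an avg-of-avgs
    // Bounce: engagement under 5 s; untracked visits excluded entirely.
    bouncePct:
      (100 * tracked.filter(v => v.activeSeconds < 5).length) / tracked.length,
  };
}
```

Note the untracked visit never appears in either numerator or denominator — "we don't have data, we don't pretend."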

Common questions you can answer here

  • "Which city has the deepest engagement?" — sort by avg scroll desc.
  • "Are paid clicks engaging less than organic?" — Source dimension, compare Paid vs Organic.
  • "Is iPhone bouncing more than Mac?" — Device dimension, compare slices.
  • "Where do US visitors land vs UK visitors?" — pick US slice → look at top pages → click compare on UK.

03 · Triggers

When X happens → do Y · Popups · Banners · Pushes · Webhooks

First-time intro modal

The first time you open the Triggers page (per browser, until dismissed), a modal explains what triggers do, shows four visual examples (popup / banner / toast / push), and lists every starter-pack trigger we pre-built for your vertical, with a one-click Activate button. Everything we seed ships paused — you decide what fires.

A "Reopen the intro modal" link clears the dismissal cookie; the modal will show the next time you visit a site's Triggers page.

The AI Composer (recommended start)

At the top of Triggers: a textarea labeled "Describe what you want". Type plain English. Dijji compiles it into a structured trigger and renders the config for review:

  • "When a user spends 30 seconds on /pricing, show a popup offering a demo call"
  • "When the user has scrolled 80% down, show a corner toast asking if they want a free trial"
  • "Inactive 7+ days on the Pro plan — re-engage with what's new"

The composer picks the right trigger type, action type, copy, theme, and platform. You review, optionally edit the name, then save. ~3-5 seconds per compile.

Templates (alternative)

Below the composer: 50+ pre-built templates grouped into Behavior, Universal, Manual, and Mobile categories. Click one to fill in the blanks (title, body, CTA) and publish.

Trigger types (when does it fire?)

  • on_load — N seconds after page load
  • dwell — visitor stays on a path for N seconds
  • exit_intent — cursor leaves the viewport (desktop)
  • scroll — scrolls past N% of page
  • rage — 3+ rage clicks detected
  • path_match — on visiting a specific path
  • event — on a specific custom event firing
  • session_count — on Nth session
  • inactivity — visitor inactive for N days (mobile)
  • crash_recent — previous session crashed (mobile)
  • source_match — referrer matches a list (e.g. Twitter, HN)
  • form_abandon — typed N fields then went idle T seconds without submit
  • manual — admin pushes from the Live page

Action types (what happens?)

Web actions:

  • popup — modal overlay (5 variants: classic, dark-glow, split-image, hero, corner)
  • banner — top/bottom strip across the page
  • toast — small bottom-right card
  • countdown — live-ticking deadline timer
  • nps — 0-10 NPS prompt
  • reaction_bar — emoji reactions
  • video_popup — embedded YouTube/Vimeo
  • web_push — browser push (VAPID + aes128gcm, requires opt-in via SW)
  • webhook — POST to a URL of your choice

Mobile actions (Android, iOS, Flutter SDKs):

  • push — system push notification (FCM on Android, APNs on iOS)
  • in_app_banner — top/bottom strip inside the app, with optional 40×40 thumbnail
  • in_app_bottom_sheet — slide-up modal with optional 16:9 hero image
  • in_app_modal — centered card, full-bleed image option
  • in_app_hero — full-bleed takeover with 4:3 image + gradient overlay + primary/secondary CTA
  • in_app_nps — 0–10 colour-graded survey, fires __dijji_nps_submitted event
  • in_app_reactions — emoji feedback bar (configurable emoji set), fires __dijji_reaction_submitted
  • in_app_countdown — modal with live deadline timer (ISO-8601, relative +24 hours, or unix-seconds)
  • silent — silent push (data-only, no banner; used for state sync)
  • live_activity_update (iOS) — updates Live Activity / Dynamic Island via APNs

Targeting

Every trigger can be scoped: device (desktop/mobile/tablet/all) · country include-list · city include-list · time of day (visitor's local time) · day of week · first visit only / returning only.

Per-user rate limiting

The fire_once field caps a trigger to once per visitor ever. The newer rate_limit_days field caps it to once per visitor every N days — better for win-back and re-engagement campaigns. The server filters rate-limited rules out of /t/rules per visitor, so the tracker never even sees them.
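The interplay of the two fields can be sketched as a predicate (the helper name and timestamp shape are ours; the field names fire_once and rate_limit_days are from this doc):

```javascript
// Decide whether a rule may fire for a visitor, given when it last fired.
function mayFire(rule, lastFiredAt /* Date | null */, now = new Date()) {
  if (!lastFiredAt) return true;                          // never fired for this visitor
  if (rule.fire_once) return false;                       // once per visitor, ever
  if (rule.rate_limit_days) {
    const elapsedDays = (now - lastFiredAt) / 86400000;   // ms per day
    return elapsedDays >= rule.rate_limit_days;
  }
  return true;
}
```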

Live page (manual pushes)

From Live, you see active visitors right now. Two manual actions per row:

  • SEND — push an existing trigger or compose ad-hoc copy. Reaches the visitor's tracker on the next inbox poll (within ~15s, or instant on tab focus).
  • RECORD SESSION — tells the visitor's tracker to upload a session replay right now. Mode-aware: lite or deep depending on site setting.

Preview

Every trigger row has two preview buttons: PREVIEW (staged page on Dijji, no customer site needed) and ON SITE (opens your own domain with a query param that fires once for inspection).

04 · Session replay

Lite (default) · Deep (rrweb) · Off · Capture-on-demand

Lite vs Deep — which to pick

In Site Settings → Recording mode you choose:

  • Lite (default) — mouse positions + clicks (with selector + visible text) + scroll + viewport + path changes. 2-8KB per session. Records every session by default. No third-party JS, no adblocker fragility.
  • Deep — full DOM via rrweb. 50-200KB per session (sometimes 1MB+ on first snapshot). Lets you replay the actual page exactly as the visitor saw it. Requires sample rate > 0% to record at all (default is 0% — triggers only).
  • Off — no recording.

Most sites should stay on Lite. Switch to Deep when you specifically need to see what was on the screen, not just where the cursor went.

When does a replay fire?

  • rage — 3+ rage clicks detected (Deep only — Lite records continuously)
  • error — JS error captured (Deep only)
  • cta_miss — visitor clicked near but missed a CTA (Deep)
  • sample — random sample (Deep, by sample rate %)
  • manual — window.dijji.record() called from the console
  • admin — captured from /live (works on both modes)

Reading the player

Lite player shows an animated cursor over an abstract canvas, click ripples with selector + text labels, scroll position on the right, page-change banners. Play/pause, 1×/2×/4× speed, scrub timeline, click list sidebar.

Deep (rrweb) player replays the actual DOM. Heavier load but visual.

Tip

Use Lite for "where did they click?" Use Deep for "what did the page look like when they got confused?" Most teams ship Lite by default and flip to Deep on a specific page (via the test URL) when investigating something.

05 · Track & Engage

Goals · Funnels · Journeys · Segments · UTM · Surveys · Pulse · Experiments

Goals

One-step conversions. Define a goal as either a page visit (e.g. path matches /thanks) or a custom event (e.g. signup_completed fired via window.dijji.track('signup_completed')). Match modes: exact / starts_with / contains / regex.

Per goal you see: total completions (in the selected window), prior-period delta, conversion rate (% of unique visitors).
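The four match modes are simple enough to write out as a sketch (the regex pattern is a string, as it would be in a saved goal definition; the function name is ours):

```javascript
// Evaluate a goal's match rule against a path or event name.
function goalMatches(mode, pattern, value) {
  switch (mode) {
    case 'exact':       return value === pattern;
    case 'starts_with': return value.startsWith(pattern);
    case 'contains':    return value.includes(pattern);
    case 'regex':       return new RegExp(pattern).test(value);
    default:            return false;
  }
}
```

So a starts_with goal on /thanks also counts /thanks/order-123, while an exact goal would not.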

Funnels

Ordered multi-step sequences. Each step is a goal-style match. Visitors enter when step 1 fires; they progress when subsequent steps fire in order within the funnel's window (default 30d). The funnel detail shows entry → step 2 → step 3 with drop-off % at each step. Source toggle lets a funnel run against web events (dijji_pageviews + dijji_custom_events) or app events (dijji_app_events). Recipe-card hero panels above the builder pre-fill common funnels in one click (Signup conversion, Cart abandon, Mobile onboarding).

Journeys

Multi-step automated campaigns. A journey defines an entry event (e.g. signup_completed) + an optional exit event (e.g. upgraded) + an ordered list of steps. Three step types: wait (N hours), send_push, send_in_app. The journey_runner cron sweeps every minute, advances each visitor through their state machine, queues sends with personalisation tokens auto-merged.

Each journey card shows the flow as colour-coded nodes (entry / wait / send / exit) with live counts. Pre-built recipes: Day-1 onboarding nudge, Cart abandon recovery, 7-day win-back. Click a recipe → form prefilled → save.

Segments

Saved targeting definitions. A segment is a JSON filter against dijji_app_users: country / city / platform / app_version / traits / sessions count gte+lte / last_seen_within or _inactive_days / installed_within_days / push_opted_in / not_uninstalled. Build via the Segments page; saved segments appear as targets in Push compose and (eventually) Trigger audiences.

Recipes: Indian free users · 7d active, Pro power users, Lapsed 14d, Fresh installs 7d. Refresh-size button shows current matching count without committing.

UTM builder

Build short campaign URLs with full UTM stamping. Each link gets a unique ?dl=<id> that overrides the default UTM grouping — fixes the classic problem of multiple campaigns colliding on ?utm_source=email.

Campaigns

From a UTM link's row, click into the campaign drill-in: KPIs for the campaign, 30-day timeline, top landing pages, sources within the campaign, UTM content variants (for A/B-style copy testing), top countries, recent hits, CSV export.

Surveys

Multi-step in-app surveys that fire on the visitor's mobile app or web tracker. Question types: rating (1–5 / 1–10), yes/no, single-choice (radio), multi-choice (checkbox), free text. Targeting: device / country / app_version / sessions count / inactive_days. Each survey has its own end screen + completion event. Per-question aggregates surface in the dashboard (histograms for rating/yesno/radio, choice counts for checkbox via JSON_TABLE, recent-100 list for text). Survey responses fire as __dijji_survey_* custom events so they show up in funnels / journeys.

Pulse

The audience-feedback feed. NPS scores, emoji reactions, and Ghost Bot chat sessions all surface here as a live stream alongside other site activity. Filter by Chats / Reactions / NPS. Inline expansion shows the full transcript or chosen emoji.

Experiments

A/B test pages, copy, or flows. An experiment defines a path-match (e.g. /pricing), assigns visitors to variants client-side via the tracker, and records exposure as a __dijji_experiment_exposed event with experiment + variant columns. Conversion attribution is automatic when the goal also has those columns set. Per-experiment view shows variants with sample size, conversion rate, and a confidence-interval overlap chart.
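A common way to do deterministic client-side assignment is to hash the visitor+experiment pair into a variant bucket. The FNV-style hash below is illustrative — this doc doesn't specify Dijji's actual scheme — but it shows the key property: the same visitor always lands in the same variant.

```javascript
// Deterministic variant assignment (illustrative hash, not Dijji's).
function assignVariant(visitorId, experimentId, variants) {
  const key = `${visitorId}:${experimentId}`;
  let h = 2166136261;
  for (let i = 0; i < key.length; i++) {
    h = Math.imul(h ^ key.charCodeAt(i), 16777619) >>> 0;
  }
  return variants[h % variants.length];
}
```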

06 · Cyber defense

IP blocklist · Auto-flag heuristic · Dijji Fingerprint Engine

IP blocklist

Add any IP to the per-site blocklist. Dijji enforces it at every ingestion endpoint: blocked IPs silently get a 204 response and are dropped. They never enter your stats, never trigger rules, never appear in replays.

Auto-flag heuristic

Dijji watches for IPs that look like scrapers or bots:

  • >200 hits in 24h
  • ≤2 distinct pages with ≥80 hits (scraper signature)
  • ≥30 hits with bot user-agent strings

Auto-flagged IPs surface in a table on the Defense page with a one-click BLOCK button. They're not auto-blocked — you confirm.
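The three heuristics above condense into a single predicate (thresholds from this doc; field names are ours):

```javascript
// Should an IP surface in the auto-flag table?
function shouldAutoFlag({ hits24h, distinctPages, botUaHits }) {
  if (hits24h > 200) return true;                         // raw volume
  if (distinctPages <= 2 && hits24h >= 80) return true;   // scraper signature
  if (botUaHits >= 30) return true;                       // bot user-agent strings
  return false;
}
```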

Dijji Fingerprint Engine (DFE)

Client-side fingerprint scanner inspired by CreepJS. Captures ~25 browser signals (canvas / WebGL renderer / audio context / fonts / nav / timezone / screen / UA-vs-brands consistency / hardware) and produces:

  • A 32-char fp_hash for grouping (kin count: how many other visits share the same fingerprint)
  • A bot_score 0-100 based on automation indicators (headless UA, no plugins, webdriver flag, etc.)
  • A grade A-F, with breakdown of which specific signals tripped
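To make the bot_score idea concrete, here is a sketch of weighted automation signals. Loud assumption: the weights below are invented purely for illustration — this doc only says the score is 0-100 and driven by indicators like these.

```javascript
// Hypothetical signal weights — NOT Dijji's actual scoring.
function botScore(signals) {
  let score = 0;
  if (signals.webdriver) score += 40;         // navigator.webdriver set
  if (signals.headlessUa) score += 30;        // e.g. "HeadlessChrome" in the UA
  if (signals.noPlugins) score += 15;         // empty plugin list
  if (signals.uaBrandMismatch) score += 15;   // UA vs client-hint brands disagree
  return Math.min(score, 100);
}
```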

The Defense page surfaces a hero card with totals, a list of flagged visits (score ≥30), and the most common signal-combo patterns. The session-detail view shows the full fingerprint with all 8 signal cards expanded.

Hand-off to your dev / security team

Two Create dev report buttons live on the Defense surfaces:

  • /defense — live incident summary. Aggregates open security signals + DFE high-risk visits (score ≥60, last 7d) + scraper auto-flag candidates + high-volume bot UAs into one printable doc, each finding with a per-type fix recipe.
  • /defense/audit — Hack Audit findings. Every non-pass check from the latest scan, grouped by severity, with the observation + fix recipe per item.

The report opens in a new tab. The top toolbar has Copy markdown (paste into Slack / Linear / email) and Print / Save PDF (browser dialog). @media print styles strip the toolbar so it prints clean. Pass-status checks are excluded — the report contains only actionable items.

07 · Uptime monitoring

Synthetic HTTP checks · Public status page · 90-day history

Setting up a check

From Uptime → New check: enter a URL, method (GET/HEAD/POST), expected status code, an optional body-contains string, a timeout (default 10 s), and an interval (default every minute). Dijji's cron runs the check and records up/down plus response time; the checker's user-agent is Dijji-Uptime/1.0.
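Grading a single check result from those options is straightforward to sketch (field names are ours; the expected-status and body-contains rules are from this doc):

```javascript
// Grade one synthetic check result as 'up' or 'down'.
function gradeCheck({ status, body }, { expectedStatus = 200, bodyContains = null } = {}) {
  if (status !== expectedStatus) return 'down';
  if (bodyContains && !body.includes(bodyContains)) return 'down';
  return 'up';
}
```

The body-contains string catches the "200 but serving an error page" failure mode that status codes alone miss.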

Per-check view

Each check has a 30-day incident timeline (every transition between up and down) and a 90-day daily status bar. Response-time chart underneath shows latency trend.

Public status page

Every site automatically gets a public status page at /status/<site_key> — overall up/amber/down + per-check 90-day bars. Shareable, no auth required.

08 · Visibility audit

SEO · AI discoverability · Performance · Accessibility · Security · Mobile

Running an audit

From Defend → Visibility: click Run audit now. Dijji fetches your home page + 3 sampled internal pages (from your top-page list, not random sitemap entries), parses the HTML server-side, and scores against ~50 checks across six categories.

Audits also auto-run weekly via cron (~04:00 IST). Score history shows in the dashboard once you have multiple runs.

The 6 categories

  • SEO — title length, meta description, canonical, robots.txt, sitemap, OG tags, h1 uniqueness, alt text coverage, internal linking
  • AI Discoverability — llms.txt presence, schema.org JSON-LD (Article / Organization / FAQPage), AI crawler permissions in robots.txt, semantic HTML
  • Performance — compression, cache headers, render-blocking scripts, image format, lazy loading, AND real-user p75 Web Vitals from your tracker (not synthetic Lighthouse — actual visitors)
  • Accessibility — alt text, form labels, lang attribute, viewport meta, skip-link, button names, main landmark
  • Security — HTTPS, HSTS, CSP, clickjacking protection, mixed content, exposed .env / .git, JS error rate
  • Mobile — viewport configured, favicon, web app manifest, theme-color

Free vs Pro

Free shows the composite score (0-100, A-F grade), per-category breakdown, count of failing/warning checks, and the check NAMES. Pro unlocks: the specific URLs/elements failing, copy-pasteable fix snippets, doc links, score history sparkline, and (coming soon) competitor audit + weekly delta email.

Hand-off to your dev team

At the bottom of the Audit page, click Create dev report. Opens a printable doc with every non-pass finding grouped by severity, observation + fix recipe per item, plus Copy markdown and Print / Save PDF toolbar buttons. Same report shape as the Defense dev report — your dev team gets a consistent format whichever pillar surfaced the issue.

09 · Ghost Bot

AI chat widget · Per-site KB · Custom persona & theme

Enabling the bot

From Engage → Ghost Bot: toggle Enable. The bubble auto-mounts in the bottom-right corner of every page where the tracker fires.

Knowledge base

Each KB entry is a title + content pair. Three ways to add them:

  • Manual — paste text directly. Best for FAQs + custom answers you want phrased exactly.
  • File upload — drag a .txt / .md / .pdf. PDF extraction uses pdftotext when available, falls back gracefully.
  • URL crawler — paste your homepage; we BFS up to 30 same-host pages (depth 1 or 2), strip <script> / <style> / <nav> / <footer>, prefer <main> + <article> text, dedup by URL, respect robots.txt. Re-crawls replace prior crawler entries; manual + uploaded entries are kept.

All content is concatenated into the bot's system prompt at chat time (configurable char budget, default 8KB). Entries are NOT RAG'd individually — they're injected verbatim, so quality > quantity.
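Verbatim injection under a char budget can be sketched like this. Assumptions flagged: whole entries are kept or dropped in order (no mid-entry truncation), and the Markdown-ish chunk format is ours — only the 8 KB default budget comes from this doc:

```javascript
// Pack KB entries into a prompt context until the budget is hit.
function buildKbContext(entries, budget = 8192) {
  let out = '';
  for (const { title, content } of entries) {
    const chunk = `## ${title}\n${content}\n\n`;
    if (out.length + chunk.length > budget) break;  // whole-entry cutoff
    out += chunk;
  }
  return out;
}
```

This is also why "quality > quantity": an oversized early entry can crowd out everything after it.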

Persona

Set a name (e.g. "Vita", "Ada"), avatar (upload custom or pick from the platform library), greeting line, voice ("friendly", "concise", "playful"), and model (gpt-4o-mini default · gpt-4o · gpt-4.1-mini). The persona name + voice get woven into the system prompt.

Themes

7 widget themes shipped: Classic, Midnight, Emerald, Rose, Ocean, Slate, Sunset. Each is a CSS-variable bundle — pick one, the widget recolors instantly.

Transcripts

Every chat is saved to dijji_bot_transcripts. Reachable from Stats → Pulse (filter chip "Chats") or Stats → Live feed (chat sessions surface inline). Configurable email-on-end if a visitor finishes a chat — useful for sales handoff.

Public endpoints

The widget calls /bot/config and /bot/chat — both are public (auth comes from the opaque site_key + per-site enabled flag). Visitors never authenticate.

10 · Mobile SDK · all platforms

Web · Android · iOS · Flutter all live · React Native on the roadmap

Web — the one-line install

The first install. Drop into <head> on any page:

<script async src="https://dijji.com/d.js" data-site="ws_…"></script>

Captures pageviews, engagement, scroll depth, web vitals, errors, custom events, heatmap, session replay, and the fingerprint engine — all from one tag. SPA-aware (pushState / replaceState / hashchange). Full guide: /install.

Android — the 2-line install

From your app's Application.onCreate():

Dijji.init(this, siteKey = "ws_…")

Auto-captures app_open / app_background / screen_view / session_start/end / app_crash / app_install. Rich device context on every batch (W×H, density, dark mode, font scale, battery, memory, disk, timezone, network type, carrier, days_since_install, install source). Native crash via chained Thread.setDefaultUncaughtExceptionHandler. Three modules: dijji-core required, dijji-push + dijji-messages optional. Full guide: /install/android.

iOS — the 2-line install

From application(_:didFinishLaunchingWithOptions:):

Dijji.initialize(siteKey: "ws_…")

Same auto-capture surface as Android, plus pure-Swift crash capture via POSIX signal handler (catches fatalError, forced unwrap of nil, array OOB — the crashes NSException misses). Three modules: DijjiCore required, DijjiPush + DijjiMessages optional. Full guide: /install/ios.

Flutter — the 2-line install

One Dart entry-point, both platforms covered. From main.dart:

await Dijji.instance.initialize(siteKey: "ws_…");

Plugin SDK with Android (Kotlin) + iOS (Swift) native code for crash capture (JVM setDefaultUncaughtExceptionHandler, NSSetUncaughtExceptionHandler, POSIX signals) and Play Install Referrer. Pure-Dart for everything else — analytics, sessions, screen views (NavigatorObserver), all 7 in-app message formats (banner / bottom_sheet / modal / hero / nps / reactions / countdown) with image support. Persistent SharedPreferences-backed event queue survives force-quits.

Wire the navigator observer + key into MaterialApp:

MaterialApp(navigatorKey: Dijji.instance.navigatorKey, navigatorObservers: [Dijji.instance.navigatorObserver], …)

Critical contract: the SDK sends platform=android or platform=ios based on Platform.isAndroid/isIOS. SDK identification rides in X-Dijji-Sdk-Version: flutter-1.1.4-alpha. Full guide: /install/flutter.

Flutter push — the dijji_firebase companion

Push token registration is optional. The cleanest path: add the companion package and one line:

await DijjiFirebase.instance.attach();

Behind the scenes: requests notification permission, fetches the FCM/APNs token via firebase_messaging.getToken(), forwards to Dijji.instance.registerPushToken, subscribes to onTokenRefresh, fires push_received on every foreground message, fires push_opened on tap (including the cold-start getInitialMessage path), surfaces deep links from data.deep_link / data.url. dijji_firebase ships separately on pub.dev; the core dijji package never pulls Firebase as a dependency.

The mobile dashboard

From any site → Mobile. Tabs: Overview · Users · Events · Crashes · Installs · Push compose.

  • Overview — KPI strip + "By SDK" breakdown card showing Native iOS / Native Android / Flutter (iOS) / Flutter (Android) split, Day-N retention heatmap, push performance
  • Users — rich profiles with trait filters (set via setUserProperty()); each row carries an SDK-family pill so you can spot Flutter migrations against the native baseline
  • Events — live event stream with auto-refresh; SDK pill on every row
  • Crashes — grouped by type+reason with per-version rollup AND a "By category" rollup splitting crashes into Dart / Kotlin / Swift / Other, plus a category badge on every group; breadcrumbs + device state in detail view
  • Installs — Play Install Referrer attribution, UTM chips (Android + Flutter)
  • Push compose — write a notification, target by trait, live preview in a real notification card, "Test on my device"
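The Day-N retention heatmap on the Overview tab reduces to cohorting users by install day and checking for activity N days later. A minimal sketch under that reading (the data shapes are illustrative, not the Dijji API):

```python
from datetime import date

def day_n_retention(installs, activity, max_n=7):
    """Fraction of each install-day cohort seen again exactly N days later.

    installs: {user_id: install_date}
    activity: set of (user_id, date) pairs with any app_open that day
    Returns {install_date: [retention_day1, ..., retention_dayN]}.
    """
    cohorts = {}
    for uid, d0 in installs.items():
        cohorts.setdefault(d0, []).append(uid)
    heatmap = {}
    for d0, users in sorted(cohorts.items()):
        row = []
        for n in range(1, max_n + 1):
            dn = date.fromordinal(d0.toordinal() + n)  # install day + N
            retained = sum(1 for u in users if (u, dn) in activity)
            row.append(retained / len(users))
        heatmap[d0] = row
    return heatmap
```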

Push setup

Android (FCM): upload your Firebase service-account JSON in Mobile → Settings. Encrypted at rest (AES-256-CBC). The push dispatcher cron polls every minute, sends via FCM HTTP v1 (RS256 JWT → OAuth2 token cached 55 min), revokes dead tokens automatically.

iOS (APNs): upload your Apple Auth Key (.p8) plus Key ID, Team ID, Bundle ID, and environment (production / sandbox) in Mobile → Settings. Same encryption scheme. Token-auth APNs over HTTP/2, JWT cached 50 min, dead-token detection on 410 / BadDeviceToken / Unregistered.
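Both dispatchers cache credentials just under their real lifetime (55 min for the OAuth2 token, 50 for the APNs JWT). A sketch of that caching pattern, with the actual mint function left as a stand-in:

```python
import time

class TokenCache:
    """Cache a minted token for slightly less than its real lifetime."""

    def __init__(self, mint, ttl_seconds):
        self._mint = mint          # callable that returns a fresh token
        self._ttl = ttl_seconds
        self._token = None
        self._expires_at = 0.0

    def get(self, now=None):
        now = time.time() if now is None else now
        if self._token is None or now >= self._expires_at:
            self._token = self._mint()          # re-mint only on expiry
            self._expires_at = now + self._ttl
        return self._token
```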

iOS Live Activities (v1.4-alpha)

The optional DijjiLiveActivity module bridges Apple's ActivityKit to our APNs dispatcher. Your app declares its own ActivityAttributes, calls Activity.request(), and one line — DijjiLiveActivity.observe(activity, activityId: "order_42") — auto-registers every push token Apple rotates to. From the dashboard, you push state updates with action_type=live_activity_update. The OS updates the lock screen + Dynamic Island layout in real time.

Server side: dedicated dijji_live_activities registry, ApnsDispatcher.sendLiveActivity() with apns-push-type: liveactivity + the .push-type.liveactivity topic suffix, dispatched by the same cron sweep as regular push. Full integration walkthrough at /install/ios#live-activities.

10b Web push notifications

VAPID + aes128gcm · One trigger engine · Three channels (FCM, APNs, Web)

Same trigger engine, three delivery channels — your visitor hits an exit-intent rule, we send via FCM if they're an Android user, APNs if iOS, or browser push if they opted in via the web. The web side uses the open RFC 8291 stack: VAPID identity + aes128gcm payload encryption + HKDF-derived AES-128-GCM keys.
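For the curious, the RFC 8291 key schedule is short enough to sketch with the standard library alone. The ECDH secret, auth secret, both public keys, and salt are assumed to come from the subscription and a fresh per-message encryption; the actual AES-128-GCM payload encryption is omitted:

```python
import hmac
import hashlib

def hkdf_step(key, data):
    """One HKDF-Expand step with SHA-256 (RFC 5869, single block)."""
    return hmac.new(key, data, hashlib.sha256).digest()

def webpush_keys(ecdh_secret, auth_secret, ua_public, as_public, salt):
    """Derive the AES-128-GCM content key + nonce per RFC 8291 / RFC 8188."""
    prk_key = hmac.new(auth_secret, ecdh_secret, hashlib.sha256).digest()
    key_info = b"WebPush: info\x00" + ua_public + as_public
    ikm = hkdf_step(prk_key, key_info + b"\x01")
    prk = hmac.new(salt, ikm, hashlib.sha256).digest()
    cek = hkdf_step(prk, b"Content-Encoding: aes128gcm\x00\x01")[:16]   # 16-byte key
    nonce = hkdf_step(prk, b"Content-Encoding: nonce\x00\x01")[:12]     # 12-byte nonce
    return cek, nonce
```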

Setup (one-time per site)

  1. Generate VAPID keys — Site settings → Web push → Generate VAPID keypair. Private key encrypted at rest, public key copied into the tracker config.
  2. Drop the service worker — Download dijji-sw.js from the same panel and host it at the root of your site (/dijji-sw.js). Browser security mandates a same-origin SW; hosting the file is a one-off FTP upload / git commit.
  3. Wire the opt-in — Call window.dijji.requestWebPush() from a button click on your site. We register the SW, call pushManager.subscribe, ship the subscription to /t/web-push/subscribe.

Sending

Manual from the live page: pick a visitor, compose ad-hoc, action type web_push. From a trigger or journey: web_push_generic template under the mobile category. Same title / body / cta_url / image_url fields the in-app formats use.

Delivery + click are tracked via service-worker beacons (received_at, clicked_at on the push row), so the dashboard shows real device-side outcomes — not just push-service ack.

10c User flows

Mixpanel-style Sankey · Top transitions per node · 5 hops deep

Open Stats → Flows. Type a starting page (or leave blank for top entry pages) and pick a hop depth. We expand the (prev_path, next_path) join layer by layer, keep the top transitions per source, collapse the long tail into (other) so the diagram stays readable.

Tactical use: "30 sec on /pricing — where do they go next?" The flow tells you the four most common branches and how many users took each. Compare flows on a feature page across releases to see if a redesign actually moved behaviour.
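The expansion described above can be sketched in a few lines, assuming the (prev_path, next_path) pairs have already been counted:

```python
from collections import Counter

def expand_flow(transitions, start, depth=5, top_n=4):
    """Expand a Sankey layer by layer from `start`.

    transitions: Counter of (prev_path, next_path) -> user count
    Returns a list of layers; each layer maps source -> [(target, count)],
    with everything past the top_n branches collapsed into "(other)".
    """
    layers, frontier = [], {start}
    for _ in range(depth):
        layer, next_frontier = {}, set()
        for src in frontier:
            outgoing = Counter()
            for (prev, nxt), count in transitions.items():
                if prev == src:
                    outgoing[nxt] += count
            if not outgoing:
                continue
            top = outgoing.most_common(top_n)
            rest = sum(outgoing.values()) - sum(c for _, c in top)
            layer[src] = top + ([("(other)", rest)] if rest else [])
            next_frontier.update(t for t, _ in top)
        if not layer:
            break                    # nothing flows any deeper
        layers.append(layer)
        frontier = next_frontier
    return layers
```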

Self-hosted d3 v7 + d3-sankey from dijji.com — no third-party CDN dependency.

10d JS error grouping

Sentry-style fingerprinting · Release tagging · Users-affected count

Every JS error captured by the tracker is hashed at write time into a stable fingerprint — derived from the error type, normalised message, and the first three stack frames with line/col offsets stripped and cache-busted bundle hashes collapsed. Same bug, same fingerprint, even after a minified rebuild.
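A sketch of that recipe (the exact normalisation rules Dijji applies may differ; the regexes here are illustrative):

```python
import hashlib
import re

def fingerprint(error_type, message, frames):
    """Stable hash from type + normalised message + first three stack frames."""
    # Collapse numbers and hex-ish ids so "id 1234" and "id 5678" match
    msg = re.sub(r"\b0x[0-9a-f]+\b|\b\d+\b", "<n>", message.lower())
    norm_frames = []
    for f in frames[:3]:
        f = re.sub(r":\d+:\d+$", "", f)               # strip line:col offsets
        f = re.sub(r"\.[0-9a-f]{8,}\.js", ".js", f)   # collapse bundle hashes
        norm_frames.append(f)
    basis = "|".join([error_type, msg] + norm_frames)
    return hashlib.sha256(basis.encode()).hexdigest()[:16]
```

Same bug, different build: the line/col offsets and the cache-busting hash both vanish before hashing, so the fingerprint survives a minified rebuild.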

The Stats Errors banner now groups by fingerprint and shows: events (total count), users_affected (distinct visitor_id count), first_seen / last_seen, plus a release pill if your tracker config sets one. Clicking through filters the live feed to that fingerprint so you can watch a representative session replay.

Surface a release tag by adding data-release="v1.4.2" to the tracker script tag, or set window.__dijji_release before the tracker loads. Future builds get tagged automatically; regressions show up immediately.

11 YouTube intelligence

Owner analytics · Competitor watchlist · Cross-platform attribution · AI niche reports

Connect a channel and Dijji becomes the only tool that ties YouTube views back to your website conversions. Tap YouTube in the per-site nav, "Connect with Google", grant scopes — that's it. No Cloud Console, no Data API key. Dijji backfills the most recent 500 uploads (or up to 5 000 with "Load full archive") and refreshes hourly.

The wedge — cross-platform attribution

Every Dijji-tracked site visit with utm_source=youtube and utm_content=<video_id> rolls up into yt_attribution. Per-video drill-in shows "Site visits from this video" as a first-class KPI. Pair with dijji_goals for funnel attribution end-to-end (YouTube view → site visit → goal completion). No competitor in the YT-analytics space stitches across the funnel boundary.
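Mechanically, the rollup is a filter plus a group-by over UTM-tagged visits. A sketch with illustrative field names:

```python
from collections import Counter

def yt_attribution(visits):
    """Count site visits per YouTube video from UTM-tagged traffic.

    visits: iterable of dicts carrying utm_source / utm_content keys.
    Returns a Counter mapping video_id -> attributed visit count.
    """
    rollup = Counter()
    for v in visits:
        if v.get("utm_source") == "youtube" and v.get("utm_content"):
            rollup[v["utm_content"]] += 1   # utm_content carries the video id
    return rollup
```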

Owner-grade per-channel + per-video data

  • Channel — country breakdown (28d), device + OS, subscriber-vs-non-subscriber split, day-of-week heatmap from daily series
  • Video — search keywords (the owner-only goldmine), sharing-service breakdown, retention curve, sponsorship logging + printable one-pager report
  • Comments — gpt-4o-mini sentiment + intents + brand-mention extraction (~$0.0001/comment); 30-day sentiment trend, brand-mention cloud, top questions, brand-safety A–F grade
  • Media kit — public shareable page at /mediakit/yt/{token} with a print stylesheet for PDF export

Competitor watchlist

Public Data API only — no OAuth needed. Add up to ~50 channels. Per competitor: daily snapshot, last-20 video metadata, view-count history. Velocity score = first-24h views ÷ the channel's 30-day median first-24h views (≥1.5 stamps "BREAKOUT"). Sponsor signature from URL extraction in descriptions plus an AI text scan for verbal mentions ("Samsung Galaxy presents…"). Posting cadence heatmap (7×24 day-hour grid). Niche heatmap on tag + topic frequency × performance.
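The velocity score itself is one division. A sketch, assuming first-24h view counts are already collected per video:

```python
from statistics import median

def velocity_score(first_24h_views, history_first_24h):
    """First-24h views over the channel's 30-day median first-24h views.

    Returns (score, is_breakout); >= 1.5 earns the BREAKOUT stamp.
    """
    baseline = median(history_first_24h)
    score = first_24h_views / baseline if baseline else 0.0
    return score, score >= 1.5
```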

AI niche report (weekly)

Every Monday at 09:30 IST, gpt-4o generates a 90–150 word niche-trend report for your watchlist with 2–4 verb-first action items. Surfaces as a card on /youtube.

AI brief integration

The daily brief email weaves a YouTube thread when relevant: yesterday + 7d aggregates, top performer, active breakouts, YT-attributed site sessions. Same NUMBER FIDELITY rule applies — every figure in the email appears verbatim in the data section.

11b Instagram intelligence

Owner analytics · Audience demographics · Competitor watchlist · IG → site attribution · DM automation

Connect an Instagram Business / Creator account and Dijji ties IG performance to your website conversions. Tap Instagram in the per-site nav, "Connect with Meta", grant scopes — done. Backfills account history and pulls hourly.

Owner analytics (per account)

  • 28-day insights strip — profile views, website clicks, email/phone clicks
  • Audience demographics — top countries, top cities, gender × age stacked bars, 24-hour follower-activity heatmap (gated behind Meta's 100-follower privacy floor)
  • 30-post grid — likes, comments, saves, reach per post

The wedge — IG → site attribution

Same pattern as YouTube. Stamp utm_source=instagram + utm_content=<post_id> on your link-in-bio + post URLs. The ig_attribution cron rebuilds a daily rollup of pageviews matching those UTMs, and the per-account dashboard surfaces the IG → site sessions table. No other IG analytics tool stitches across the funnel boundary.

Competitor watchlist

Public Business Discovery API — no OAuth needed for competitors. Add up to 50 IG handles. Daily snapshots (followers, post count, engagement). Per-competitor: latest 20 posts, posting cadence heatmap, sponsor signature (extracted from descriptions via gpt-4o-mini brand mention pass).

DM automation (v1)

When someone DMs your IG account, Dijji auto-replies within seconds with your configured ack text and starts a 1-hour SLA timer. If no human replies natively on Instagram in 60 minutes, owners + admins get an email with the message + post-context + a "Reply on Instagram" link. Reply on IG and the timer auto-clears (we catch the message_echoes webhook).

Setup: Site → Instagram → [Account] → Direct Messages card. Reconnect once to grant the new instagram_manage_messages scope, paste your auto-reply (supports {{name}} / {{first_name}} / {{username}} tokens), flip the toggle. Live mode requires Meta App Review (1–4 weeks). Until approved, only the App's developer + test users can trigger the flow — enough to verify end-to-end.

11c Cross-channel rollup

YouTube + Instagram side-by-side · Network-level view across every site you own

Open /app/cross-channel. Network-wide summary card showing total YT subs / IG followers / 30-day reach / 7-day attributed sessions. Below: a per-site card for every site in your active workspace, with YT and IG split panels showing channels, posts, attributed sessions, and the YT% vs IG% revenue-attribution split. The wedge made visible — you get to compare which platform is actually driving traffic to which site.

12 Daily brief email

Morning summary · Per-user opt-in · One-click unsubscribe

Every day at 09:00 IST, members of any active workspace receive a per-site briefing email. Same content as the AI brief on the dashboard, sent via Resend (with Brevo fallback).

Toggle from Settings → Email Preferences. Each email has a one-click unsubscribe link that token-authenticates. Per-user × per-site × per-date dedup so members never get duplicate emails.
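The dedup guard amounts to a uniqueness check on (user, site, date). A sketch, with persistence left out:

```python
def send_briefs(recipients, today, already_sent):
    """Send each (user, site) brief at most once per day.

    recipients: iterable of (user_id, site_id) pairs due a brief
    already_sent: set of (user_id, site_id, date) keys, persisted between runs
    Returns the list of (user_id, site_id) actually sent this run.
    """
    sent = []
    for user_id, site_id in recipients:
        key = (user_id, site_id, today)
        if key in already_sent:
            continue                 # skip: member already briefed today
        already_sent.add(key)
        sent.append((user_id, site_id))
    return sent
```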

12b Enterprise · Data Connect

For teams of 50+ paid users · Plug in your customer DB · See who they are, not just what they clicked

The Enterprise tier closes the loop between Dijji's anonymous traffic and your actual customer data. Connect your production database (MySQL / Postgres / MSSQL) or a REST API (Stripe, your own events endpoint) and Dijji stitches signed-in users to the visitor cookies it already captures — one timeline from first visit to lifetime value.

What unlocks

  • LTV by acquisition channel — the wedge most analytics tools never close
  • Retention + churn risk — cohort heatmaps from your own DB, not GA estimates
  • Anonymous → known journey — pre-signup behaviour stitched to user identity (the IG post they saw, the YT video they watched, the docs page they bounced from)
  • AI schema mapping — an assistant chats through your tables, identifies which columns are users / traits / events / revenue. You confirm; the mapping is versioned and editable forever

Security posture

Read-only DB user enforced (we literally cannot write). Credentials encrypted at rest with per-tenant AES-256 keys. SSL required on every connection. IP allowlist — only Dijji's static IPs reach your DB. Full query log visible to you, revoke or pause anytime. SSH-tunnel option for non-public databases.

How to engage

See /enterprise for the value prop, dashboard previews, and the talk-to-sales form. Filling the form drops the lead into our sales + business intelligence inbox; we reply within 24h IST.

13 Privacy & data

No cookies · Hashed IPs · Opaque keys · GDPR-friendly by design

No cookies. Consent-banner rules that specifically target cookies (the ePrivacy cookie requirements, plus the cookie provisions of GDPR and DPDP) don't apply, because we never set one.

Two visitor identifiers, both anonymous:

  • visit_id — per-tab, stored in sessionStorage. Gone when the tab closes. Used to stitch a single browsing session.
  • visitor_id — persistent, stored in localStorage. Survives tab close and return visits. Used to recognise the same browser across days. Cleared by clearing site data in the visitor's browser, or by the visitor disabling localStorage.

Neither is a cookie. Neither leaves the visitor's device unless they engage with the site. Both are random opaque strings — we do not encode personal data into them.

IPs are hashed for fingerprinting and matching but stored in plaintext for geo lookup + abuse prevention. Configurable to fully hash if your jurisdiction requires.
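A sketch of the hashing half, assuming a per-site salt (the real salt scheme is Dijji-internal):

```python
import hashlib

def fingerprint_ip(ip, site_salt):
    """One-way salted hash of an IP for fingerprinting and matching.

    site_salt: per-site secret bytes, so hashes can't be joined across sites.
    """
    return hashlib.sha256(site_salt + ip.encode()).hexdigest()
```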

Opaque site keys (ws_*) — never numeric IDs. Hard to enumerate even if an ownership check is missed.

The tracker masks all <input>, <textarea>, password fields, and any element with data-dijji-mask in session replays. Add the attribute to anything visually sensitive (PII forms, credit cards, etc.).

Privacy + Terms

See /privacy and /terms.

14 Workspace & team

Multiple workspaces · Email invites · Per-account roles

Workspaces (accounts)

One Dijji login can belong to multiple workspaces. The header has an account switcher (purple dot) that shows all workspaces you're a member of. Active workspace determines which sites you see in the dashboard.

Invites

From Team → Invite member: enter email + role (owner / admin / member). The invitee gets an email; clicking the link signs them in (or up) and adds them to the workspace.

If they were invited under their existing email, signup auto-joins them to all pending invites — they don't have to click the link.

Roles

  • Owner — full control, can delete the workspace
  • Admin — can add/remove members, edit all sites
  • Member — can view + edit sites they have access to

Per-site access (scoped membership)

By default, members see every site in the workspace. For agencies and bigger teams that's wrong — a freelancer should only see the client they're working on. Switch a member to Specific sites in the Team page and pick the sites they get. Owners + admins always see everything regardless. Invitations carry the scope from the start (set during invite).

Bulk-assign mode lets you tick multiple members + multiple sites and apply with one click (Add or Replace mode). Useful when onboarding a whole team to a new client.

Team logins (admin-minted shared credentials)

Sometimes you need a shared login (a contractor, a kiosk, a sales-floor laptop) without sending an invitation. From Team → Create team login, set a username + display name + password + scope. No invitation email, no email-verification step required. Shows up in the member list with a TEAM pill. Admins can remove team logins they personally minted; removing anyone else's requires the workspace owner.

Cross-workspace dashboard

One Dijji account, multiple workspaces. The dashboard partitions sites: your active workspace's sites at the top, then "Also member of" cards for every shared workspace below. One-click switch via the button on each card. Restricted-access members see a · scoped access indicator inside their shared-workspace cards.

Demo mode

Visit /demo to enter a fully-seeded read-only workspace with sample data across all features. All writes are blocked while in demo. Useful for prospects clicking around or for showing the tool internally without polluting your real data.

15 Dashboard SDK indicators

FLUTTER · KOTLIN · SWIFT pills · Crash-by-category · Sites adoption pills

The mobile dashboard reads sdk_version on every event + user row and surfaces the SDK family inline. Useful for spotting Flutter migrations against your native baseline, or quickly seeing which SDK shipped a particular crash.

SDK pills

On /mobile/events, /mobile/users, and elsewhere, each row shows a small pill:

  • FLUTTER  cyan — sdk_version starts with flutter-
  • KOTLIN  green — sdk_version starts with dijji-android- (or platform=android, no flutter prefix)
  • SWIFT  purple — sdk_version starts with dijji-ios- (or platform=ios, no flutter prefix)

Mobile dashboard "By SDK" card

The mobile overview page shows a 4-cell breakdown card splitting your user base into Native iOS / Native Android / Flutter (iOS) / Flutter (Android). Single SQL aggregate over dijji_app_users — refreshes on every page load. Hidden when no users exist (avoids an empty card on a freshly-created site).

Crashes by category

The mobile crashes view (/mobile/crashes) classifies every crash into one of four buckets:

  • Dart — TypeError, StateError, RangeError, FormatException, NoSuchMethodError, FlutterError, etc.
  • Kotlin — anything starting java. / kotlin. or containing OutOfMemoryError
  • Swift — NSException, SIG* signal-derived names, Swift.Runtime
  • Other — falls back by platform when the type string isn't classifiable
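Those bucket rules can be sketched directly (a hypothetical Python port; the production logic lives in the PHP helpers covered below):

```python
DART_TYPES = {"TypeError", "StateError", "RangeError", "FormatException",
              "NoSuchMethodError", "FlutterError"}

def crash_category(crash_type, platform):
    """Classify a crash type string into Dart / Kotlin / Swift / Other."""
    t = crash_type or ""
    if t in DART_TYPES:
        return "Dart"
    if t.startswith(("java.", "kotlin.")) or "OutOfMemoryError" in t:
        return "Kotlin"
    if t == "NSException" or t.startswith("SIG") or t.startswith("Swift."):
        return "Swift"
    return "Other"   # platform could break ties here; omitted for brevity
```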

Top of the page now shows a "By category" bar chart with proportional bars + counts. Each crash group also carries a category badge inline so you can scan the whole list and see what's hitting you.

Sites list adoption pills

On /app/sites, each site row shows compact 1-letter pills next to its name when adoption is non-zero — F for Flutter, K for Kotlin, S for Swift. Hover for the user count per family. Lets a workspace admin see SDK adoption across their entire site portfolio at a glance.

Helper functions

Three PHP helpers in dijji_helper.php drive everything above:

  • dijji_sdk_family($sdkVersion, $platform) — returns 'flutter' | 'kotlin' | 'swift' | 'unknown'
  • dijji_sdk_pill($sdkVersion, $platform) — returns the inline-styled HTML pill
  • dijji_crash_category($crashType, $platform) — returns 'Dart' | 'Kotlin' | 'Swift' | 'Other'

Drop-in for any new view that needs SDK awareness.

Need something we don't cover here? Email us.