
See the Full Picture: What MineOS Overviews Show

MineOS dashboards give privacy, security, and risk teams a single place to track progress, spot issues, and report outcomes across DSR, Compliance/RoPA, Vendor (TPRM) assessments, AI Governance, and Risk Management.

What every dashboard includes (shared features)

You'll find these controls in the top bar: filters on the left, and a three-dot options dropdown for further actions.

  • Filters – refine by time window and module-specific fields (e.g., request type, country, assessment type).

  • Download – export the entire dashboard (PDF or CSV). Options include “expand tables to all rows” and “arrange tiles in one column” for easier printing.

  • Schedule delivery – email the dashboard on a cadence (daily/weekly/monthly at a set time) in PDF/CSV to internal/external recipients.

  • Per-widget actions – (where available) export a single chart/table if you only need one visualization. 

  • Widget alerts – (where available) define a condition that triggers an email alert. For example: alert when the count of tickets due in the next 7 days is greater than 0, OR when a new high risk is created.
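As a sketch, an alert condition like the example above boils down to a boolean check over widget metrics. The field names here (`due_in_7_days`, `new_high_risks`) are illustrative, not actual MineOS identifiers:

```python
# Hypothetical evaluation of a widget alert condition.
# Field names are illustrative, not real MineOS identifiers.

def should_alert(metrics: dict) -> bool:
    """Fire when tickets are due within 7 days OR a new high risk appeared."""
    return metrics.get("due_in_7_days", 0) > 0 or metrics.get("new_high_risks", 0) > 0

print(should_alert({"due_in_7_days": 1, "new_high_risks": 0}))  # True
print(should_alert({"due_in_7_days": 0, "new_high_risks": 0}))  # False
```

When the condition evaluates to true for the current dashboard data, the configured recipients receive the alert email.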

 

Five dashboards are available in the Portal, depending on your enabled modules:

1. Data Subject Requests (DSR)

2. Compliance & RoPA

3. Vendor Assessments (TPRM)

4. AI Governance

5. Risk Management

1) Data Subject Requests (DSR) Dashboard

Filters shown: Created At, Request Type, Country, State, System Request ID

Why it matters
You instantly see workload, SLA risks, automation efficiency, and regional demand.

 

Widget | What it shows | How it’s calculated
Open Requests | Active requests not yet closed | Count of requests with state ≠ Closed/Rejected
Overdue Requests | Requests past their due date | Count where due_date < today and state ≠ Closed/Rejected
Due in the next 7 days | Upcoming deadlines | Count where due_date ∈ [today, +7d] and state ≠ Closed/Rejected
Completed Requests | Finished requests | Count with state = Closed/Completed
Rejected Requests | Requests declined | Count with state = Rejected
Total Incoming Requests | Volume of requests created in the filter window | Count grouped by created_at in range
Incoming Requests Over Time | Trend of new requests | Time series of counts by week/month
Average Closing Time per Right | Efficiency by right type | For completed items: average(closed_at – created_at) grouped by right
Saved Time (β) | Estimated time saved by automation | Vendor-configured estimate based on automated steps executed
Requests Closed with Autopilot | Autopilot adoption | Count of closes where Autopilot handled the closing flow
Total/Open/Closed Requests by Type | Distribution by right | Donuts of counts grouped by request_type
Total/Open/Closed Requests by Country | Geographic distribution | Donuts of counts grouped by country
Requests details (table) | Row-level data (latest slice) | Tabular results respecting filters
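The tile calculations above can be illustrated with a short sketch. The `state` and `due_date` fields mirror the names used in the table, but this is not a real MineOS schema:

```python
from datetime import date, timedelta

# Illustrative recreation of the DSR tile logic; field names mirror the
# table above, not an actual MineOS schema.
requests = [
    {"state": "Open", "due_date": date.today() - timedelta(days=2)},   # overdue
    {"state": "Open", "due_date": date.today() + timedelta(days=3)},   # due soon
    {"state": "Closed", "due_date": date.today() - timedelta(days=1)},
    {"state": "Rejected", "due_date": date.today() + timedelta(days=10)},
]

def is_active(r):
    """Active = not yet closed or rejected, matching 'state ≠ Closed/Rejected'."""
    return r["state"] not in ("Closed", "Rejected")

today = date.today()
open_requests = sum(1 for r in requests if is_active(r))
overdue = sum(1 for r in requests if is_active(r) and r["due_date"] < today)
due_next_7 = sum(
    1 for r in requests
    if is_active(r) and today <= r["due_date"] <= today + timedelta(days=7)
)

print(open_requests, overdue, due_next_7)  # 2 1 1
```

Each tile is simply a filtered count over the requests in the current filter window.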

 

2) Compliance & RoPA (Processing Activities + Assessments)

Filters shown: Type & System

Why it matters
Shows where compliance work is stuck, whether assessments are surfacing risk, and how quickly they’re being closed.

Widget | What it shows | How it’s calculated
Draft Assessments | Total in-progress assessments | Count with status = Draft
Drafts by assessment type | Which templates are stuck in draft | Bar chart of Draft counts grouped by assessment type (e.g., DPIA, TIA, PIA, LIA)
Draft vs. completed PA (RoPA) | Completion of processing activities | Donut of Processing Activities by status (Draft vs Completed)
Recently updated PA assessments (table) | Freshly touched PA items | Table of PAs ordered by updated_at
Assessments with risks (donut) | Risk presence across assessments | Share of assessments with at least one linked risk vs. with none (and, where tracked, with mitigations)
Assessment types with high inherent risks | Where the riskiest findings originate | Count of risks rated High (inherent) grouped by assessment type
Assessments with high risk and no mitigations (table) | Gaps needing action | Assessments containing ≥1 High risk with no linked mitigation
Average days to complete assessments | Cycle time | Average(completed_at – created_at) for completed assessments
Collaborated assessments status | Participation of invitees | Donut (e.g., Invited, Joined) based on collaboration state
Assessments activity (created vs completed) | Volume over time | Weekly/monthly counts (two series)
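The cycle-time tile (average days to complete) is a mean taken over completed assessments only, ignoring items still in draft. A minimal sketch, with illustrative field names:

```python
from datetime import date

# Sketch of "Average days to complete assessments":
# mean(completed_at - created_at) over completed assessments only.
# Field names are illustrative, not a real MineOS schema.
assessments = [
    {"created_at": date(2024, 1, 1), "completed_at": date(2024, 1, 11)},  # 10 days
    {"created_at": date(2024, 1, 5), "completed_at": date(2024, 1, 25)},  # 20 days
    {"created_at": date(2024, 2, 1), "completed_at": None},               # still draft
]

completed = [a for a in assessments if a["completed_at"] is not None]
avg_days = sum((a["completed_at"] - a["created_at"]).days for a in completed) / len(completed)
print(avg_days)  # 15.0
```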

 

3) Vendor Assessments (TPRM)

Filters shown: Type & System

Why it matters
Provides a live picture of vendor due diligence, highlighting bottlenecks and unmitigated risks.

The vendor dashboard contains the same assessment-centric widgets as Compliance (above), but scoped to vendor assessments only. Calculations are identical, applied to vendor templates (e.g., SIG-style, security/privacy questionnaires). Key tiles you’ll see:

  • Draft Assessments

  • Drafts by assessment type

  • Assessments with risks

  • Assessment types with high inherent risks

  • Assessments with high risk and no mitigations (table)

  • Average days to complete assessments

  • Collaborated assessments status

  • Assessments by week (created vs. completed trend)

4) AI Governance

Filters shown: Type & System

Why it matters
Keeps AI projects accountable—tracking risks, collaboration, and governance progress.

Widget | What it shows | How it’s calculated
Draft Assessments | AI-related assessments in progress | Count with status = Draft
Drafts by assessment type | Templates driving AI workload | Bar chart grouped by AI assessment types
Assessments with risks | Share of AI assessments that surfaced risks | Same logic as Compliance: presence of ≥1 risk (and, where tracked, with/without mitigations)
Assessment types with high inherent risks | Where the highest AI risks appear | Count of High (inherent) risks grouped by AI assessment type
Assessments with high risk and no mitigations (table) | Unmitigated AI risks | Same as Compliance
Average days to complete assessments | Delivery speed | Same calculation as Compliance
Collaborated assessments status | Participation of invitees | Donut (e.g., Invited, Joined) based on collaboration state
Assessments activity (created vs completed) | Volume over time | Weekly/monthly counts (two series)

5) Risk Management (Aggregate Risk View)

Filters shown: Name, Type

Why it matters
Provides a central risk register to track severity, ownership, and themes across the entire MineOS ecosystem.

Widget | What it shows | How it’s calculated
Total risks | All risks currently tracked | Count of risk records in scope
Risks distribution (severity) | Severity mix | Donut of counts by severity (e.g., Very High, High, Medium, Low, Very Low)
Incomplete risks | Risks missing key steps | Count of risks not yet “completed” (e.g., missing owner/mitigation or open state)
Risks by type (bar) | Which risk categories dominate | Counts grouped by risk taxonomy (e.g., External breach, Ethical risk, Legal risk, Security risk)
Risks by assessment type (donut) | Where risks originate | Counts grouped by source module/type (e.g., RoPA/PA, AI, Vendor)
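The severity donut (and the by-type bars) are simple group-by counts over risk records. A minimal sketch using the severity labels from the table; the sample data is invented:

```python
from collections import Counter

# Sketch of "Risks distribution (severity)": count risk records grouped
# by severity label. The sample data below is invented for illustration.
risk_severities = ["High", "Medium", "High", "Low", "Very High", "Medium", "High"]
distribution = Counter(risk_severities)

print(distribution["High"])    # 3
print(distribution["Medium"])  # 2
```

The same grouping pattern, keyed on risk type or source assessment type instead of severity, produces the other two distribution widgets.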

Wrap-up

MineOS dashboards turn data into action. With consistent filters, exports, and scheduling, every module gives you instant visibility and evidence for compliance reporting.

Whether it’s DSRs, RoPA, vendors, AI systems, or risks, dashboards show you what’s open, what’s overdue, and where the risks lie—calculated automatically and always up to date.