Microsoft Project Online retires September 30, 2026. Migrate to a modern platform before it's too late.
Deadline-bounded guide, ~21 weeks remaining

How to export Project Online data
before September 2026

The complete, honest export playbook for PMO admins. Every project, every custom field, every timesheet, before Microsoft turns off the lights on September 30, 2026.

No signup required for the checklist. 35 items across 6 categories. Export-as-PDF.

Microsoft will not extend this deadline. The retirement was first announced in 2024 and reaffirmed in subsequent service update notifications. Tenants that miss the date lose read access to their data, not just write access. See Microsoft’s official notice on the Project Online service description.

What you’ll lose if you don’t export

On October 1, 2026, the tenant URL stops resolving. Whatever is still inside Project Online on that morning is effectively gone, not because the bytes are deleted on day one, but because you no longer have a supported way to read them. That means your project plans, your enterprise resource pool, your custom fields, your timesheet archive, your status report history, your project sites, and any Power BI reports built on the OData feed all disappear from your active toolchain.

For most PMOs this matters far more than they expect. Project plans are the most obvious loss, but they are also the easiest to migrate, because Microsoft’s native .mpp and MSPDI XML formats are widely supported by other tools. The painful losses are the ones that look like configuration: enterprise custom fields with their lookup tables and formulas; resource cost rates that have been refined over years; timesheet history that finance has been quietly relying on for capitalisation reporting; workflow definitions that encode actual approval logic. None of these survive on their own. They have to be exported, on purpose, by you, before the deadline.

The other category of loss is regulatory. PMOs in financial services, healthcare, energy, and government typically have records-retention obligations measured in years, not months. Project records, including completed timesheets and approved change requests, often fall inside those obligations. Losing access to that history is not just inconvenient; in some industries it is a finding. Treat the export as a records preservation exercise, not just a tool migration.

What can vs cannot be exported

Before you start exporting anything, build an inventory. Project Online has at least eleven distinct data classes, and each one has its own export path, format, and set of edge cases. The matrix below is the honest version, including the parts that do not export cleanly.

Data class | Status | Format | Notes
Project plans (tasks, dependencies, dates) | Exportable | .mpp, MSPDI XML | Full export via PWA UI, PowerShell/CSOM, or third-party tools.
Enterprise resource pool | Exportable | XML, OData, CSV | Resource attributes, calendars, and cost rates all preserved.
Enterprise Custom Fields (ECFs) + lookup tables | Exportable | XML, OData | Definitions exported separately from per-project values.
Timesheet history | Partial | OData, Reporting DB | OData feed is the canonical path. Reporting DB has more history but needs SSMS access.
Status reports (PWA) | Partial | PDF, manual export | No bulk API. Save each report to PDF or recreate from Reporting DB tables.
Project sites (SharePoint) | Partial | SharePoint backup | Use Site Content + Library Export. Some web parts will not migrate; document libraries do.
Reporting database (RDB) snapshot | Partial | SQL .bak, CSV | Available on-prem or via the dedicated Project Online Reporting feature; ask your tenant admin.
Power BI dashboards built on RDB/OData | Partial | .pbix files | Export the .pbix from Power BI Service. Dataset connection strings will need rewiring post-migration.
Project workflows (PWA-native) | Lost | n/a | No supported export. Document logic and rebuild in destination tool.
Demand management views, queue jobs, OLAP cubes | Lost | n/a | These are PWA-runtime artefacts. Recreate as native dashboards post-migration.
Email alerts & subscriptions | Lost | n/a | Recreate in destination notification system.

Eight of the eleven categories are at least partially exportable, but the “partial” rows are where most of the rework lives. Plan for them now, not in week eight.

The pattern that catches most teams off guard is the gap between “the data is there” and “the data is exportable in a useful shape.” Project plans, resources, and custom field definitions are all first-class export targets: Microsoft and the wider ecosystem ship documented APIs and file formats for each. Timesheets and reporting feeds are exportable too, but they require pulling from the OData service in the right order, with the right filters, before the tenant rate limit pushes back. PWA-native artefacts (workflows, demand-management views, OLAP cubes) have no supported export at all; they are runtime constructs that exist only inside the Project Server engine.

Three implications fall out of this. First, treat workflows as documentation work, not export work: screenshot every step, capture the conditions and approvers, and budget time to rebuild the logic in the destination tool. Second, treat OLAP cubes and PWA dashboards as an analytics-team conversation, not a migration-team conversation, because the right replacement is usually a refreshed Power BI or destination-native dashboard, not a one-to-one port. Third, accept that some email subscriptions and project-site web parts will be reproduced rather than migrated. None of this is failure; it is just the honest shape of the surface.

When you build the project plan for the export work itself, treat the “partial” rows as the long pole. They are the work items that need dedicated owners and explicit acceptance criteria, not bullet points buried inside a one-line task. Most teams spend two weeks on the “full” rows combined and four to six weeks on the “partial” rows, particularly SharePoint sites and the Reporting Database snapshot if that path applies to you.

21 weeks until deadline. If you have more than ten active projects, start with the inventory tool below and budget two full days for it.

Run inventory checklist

Export formats explained

Five formats cover the hosted Project Online surface, with a sixth (the Reporting Database snapshot) mostly relevant to on-prem Project Server. Pick the right one per data class; do not try to use a single format for everything. The format choice drives both the export script you write and the destination tool you can later import into, so it pays to understand the trade-offs before you start moving bytes.

Two of the five formats are for plans, two are for tabular data, and one is for project sites. Almost every PMO ends up using at least three of them in parallel. The temptation to standardise on one (“everything as XML”) is real, and wrong: you trade a small amount of operational simplicity for a large amount of fidelity loss in the categories that do not naturally fit the chosen container. Two parallel exports per category (binary + open) is a common pattern instead; the archive cost is negligible, and the second copy doubles as your insurance against future tooling changes.

.mpp (Microsoft Project binary)

Per-project plan with full task, dependency, and assignment fidelity. Default when opened in Project Professional.

Strengths: Highest fidelity for plans. Native to Microsoft Project. Read by most third-party tools including Onplana.

Watch for: Binary, not human-readable. Cannot bulk-export from PWA without scripting. One file per project.

MSPDI XML (Microsoft Project Data Interchange)

Open-format equivalent of .mpp. Project → Save As → XML Format.

Strengths: Documented schema, easy to inspect or transform. Supported by every credible importer. Diff-able in source control.

Watch for: Slightly larger than .mpp. Some PWA-specific extended attributes need an explicit toggle to round-trip.

OData API (live REST feed)

Resources, custom fields, assignments, timesheets, project metadata. Tenant-wide pulls.

Strengths: No file conversion. Live, current data. Can be filtered, paged, and resumed. Streamable into any destination.

Watch for: Rate-limited per tenant. Requires admin credentials. Authentication via M365 OAuth (not basic auth).

CSV (manual or scripted)

Long-term archive for tabular data: timesheets, custom field lookup tables, status reports.

Strengths: Universally readable. Tiny on disk. Trivially diff-able. Survives any tool change.

Watch for: Loses relational structure. Per-row encoding gotchas. Use only when archive durability matters more than re-import fidelity.

SharePoint site export (.cmp)

Project workspace sites: documents, lists, custom pages, web parts.

Strengths: Captures everything in a single archive. Officially supported via PowerShell.

Watch for: Some web parts will not migrate. The .cmp format is SharePoint-specific, not portable to other DMSes without conversion.

Reporting Database snapshot

Long-tail reporting history. Mostly relevant for on-prem Project Server, less so for pure Project Online.

Strengths: Richest read model, includes derived fields not in OData. Familiar SQL access for analysts.

Watch for: On-prem only. Project Online tenants substitute the OData feed and accept the data shape difference.

Step-by-step: exporting project plans

Project plans are the largest single export and the one most teams start with. Three methods cover the realistic paths: the PWA UI for one-off projects, PowerShell with the Project CSOM module for bulk export, and a third-party extractor when the tenant has hundreds of plans and you want a single CLI invocation. Pick based on volume.

Method 1, PWA UI (for <20 projects)

  1. Sign in to your Project Online tenant as a user with Open Project permission for the projects you want to export.
  2. Open Project Center. Confirm the project list is the full set, filtering by department or owner if needed.
  3. Click into each project to open it in the browser editor, then choose File → Save As → Save As File. Pick Microsoft Project File (*.mpp).
  4. If your browser prompts to open in Project Professional rather than save, that is fine, save from Project Pro’s File → Save As menu instead.
  5. Repeat per project. Name files <project-id>-<short-name>.mpp so the source GUID is preserved in the filename.
  6. For each saved file, also export a parallel MSPDI XML via File → Save As → XML Format (*.xml). The XML version is your insurance against any future binary-format compatibility issue.

Gotcha: the browser editor cannot save .mpp directly in some tenant configurations. If the menu item is greyed out, you must open the project in Project Professional (the desktop app) first, which then routes the save through the desktop client.
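The naming rule in step 5 is worth pinning down once if you script the renaming at all. A minimal Python sketch; the helper name, the 40-character cap, and the sanitisation pattern are our choices for illustration, not anything PWA mandates:

```python
import re

def export_filename(project_guid: str, project_name: str, ext: str = "mpp") -> str:
    """Build a '<project-id>-<short-name>.<ext>' filename that survives
    Windows/SharePoint path rules. Keeping the GUID in the filename makes
    every file traceable to its source project after the tenant is gone."""
    # Collapse anything that is not a letter or digit into single hyphens.
    short = re.sub(r"[^A-Za-z0-9]+", "-", project_name).strip("-").lower()
    short = short[:40]  # cap the name so full paths stay under path-length limits
    return f"{project_guid}-{short}.{ext}"
```

Run the same helper for the parallel XML copy in step 6 (with `ext="xml"`) so the two exports of a project always sort next to each other.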

Method 2, PowerShell + CSOM (for 20–500 projects)

Microsoft ships a Client-Side Object Model (CSOM) DLL for Project Online. Combined with PowerShell and a service account, this is the standard bulk export path.

  1. Install the latest Project Online CSOM NuGet package on your jump box. Reference the Microsoft.ProjectServer.Client assembly from PowerShell.
  2. Create a service account with Manage Users and Groups + Open Project permissions. Use modern auth (OAuth) only; basic auth is fully retired.
  3. Author a script that connects to https://<tenant>.sharepoint.com/sites/pwa, enumerates ProjectContext.Projects, and for each project calls Project.Draft.SaveProjectAs(localPath). Iterate with a throttle of one request every 1.5 s to stay under tenant rate limits.
  4. For each project, also call the OData export endpoint /_api/ProjectData/Projects('<guid>')?$expand=Tasks,Assignments,Resources to capture the relational view alongside the .mpp.
  5. Capture a per-project log line: { projectId, guid, lastModified, taskCount, assignmentCount, fileBytes, sha256 }. This is your auditable manifest later.
  6. Run the script in a maintenance window. A 200-project tenant typically takes 4–8 hours end-to-end at safe throttle.
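The per-project log line in step 5 is easiest to keep consistent if one helper produces it. A language-neutral sketch in Python (the actual export script is PowerShell/CSOM; the field names here just mirror the log line above, and nothing in this block is a Microsoft API):

```python
import hashlib
import json
from datetime import datetime, timezone

def manifest_entry(project_id: str, guid: str, last_modified: str,
                   task_count: int, assignment_count: int,
                   file_bytes: bytes) -> dict:
    """One manifest record per exported project. Hashing the file at export
    time is what lets you prove integrity after every later copy."""
    return {
        "projectId": project_id,
        "guid": guid,
        "lastModified": last_modified,
        "taskCount": task_count,
        "assignmentCount": assignment_count,
        "fileBytes": len(file_bytes),
        "sha256": hashlib.sha256(file_bytes).hexdigest(),
        "exportedAt": datetime.now(timezone.utc).isoformat(),
    }

def append_manifest(path: str, entry: dict) -> None:
    """Append as JSON Lines so a crashed run still leaves a usable manifest."""
    with open(path, "a", encoding="utf-8") as f:
        f.write(json.dumps(entry) + "\n")
```

The append-per-project pattern matters: if the maintenance window ends early, the manifest tells you exactly where to resume.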

Method 3, Third-party extractor (for 500+ projects, or low-touch)

Several vendors ship Project Online extractors that wrap the OData + CSOM APIs in a single CLI or GUI. The trade-off is licence cost vs engineering time. For a tenant with 500+ projects, an extractor typically pays for itself in the script-authoring time it saves. Vet the tool against your security team before granting tenant-admin credentials, and confirm it can produce both .mpp and MSPDI XML outputs (some only do one).

Whichever method you use, validate by opening 5% of exported files in a clean Project Professional install and comparing the task count, finish date, and total work to the source. Deviations >1% mean the export configuration needs revisiting.
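The 1% tolerance check is mechanical enough to script. A sketch of the comparison, assuming you have already extracted the three spot-check metrics from both the source project and the re-opened export (the dictionary keys are hypothetical names for illustration):

```python
def within_tolerance(source: dict, exported: dict, tol: float = 0.01) -> bool:
    """Finish dates must match exactly; task count and total work may
    drift by at most `tol` (1%) before the export config needs revisiting."""
    if source["finish_date"] != exported["finish_date"]:
        return False
    for key in ("task_count", "total_work_hours"):
        src, exp = source[key], exported[key]
        if src == 0:
            if exp != 0:
                return False
        elif abs(exp - src) / src > tol:
            return False
    return True
```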

Test your export against a real destination

Upload one of your exported .mpp files to the Migration Preview tool. See exactly how it would land: task fidelity, dependencies, custom fields, all of it, before you commit.

Try Migration Preview

Free. No signup. Runs in your browser.

Step-by-step: exporting the resource pool

The enterprise resource pool is one of the most under-appreciated assets in Project Online. Years of resource definitions, custom resource fields, calendar exceptions, and cost rates often live nowhere else. Export it in two passes: a structural pass via PWA, then a data pass via OData.

  1. In PWA, open Resources. Filter by Generic, Active, Inactive separately; export each filter as CSV using the built-in Export to Excel button. This gives you the resource roster in three clean slices.
  2. For each resource, capture the calendar exceptions: open the resource → Resource Information → Change Working Time. Export the exception list to CSV manually (PWA does not bulk-export exceptions).
  3. Pull the OData feed at /_api/ProjectData/Resources?$expand=ResourceCustomFields. This returns every resource with all their enterprise custom field values in a single relational shape. Save as JSON or convert to CSV per resource.
  4. Cost rates: open each resource that has cost rate tables → Costs tab. Capture the five rate tables (A through E) and their effective dates. PWA has no bulk cost-rate export, so this is manual or via CSOM EnterpriseResource.CostRateTables.
  5. Validate: total resource count from CSV exports should match OData resource count exactly. Any drift indicates a permission or filter issue with the OData service account.
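The step 5 reconciliation is a one-liner worth automating so it runs after every pull. A sketch, assuming the Generic / Active / Inactive filters partition the roster with no overlap (verify that assumption in your tenant before trusting the sum):

```python
def reconcile_resource_counts(csv_slices: dict, odata_count: int) -> list:
    """Compare the summed CSV slice counts against the OData resource count.
    Returns a list of problems; an empty list means the export reconciles."""
    problems = []
    csv_total = sum(csv_slices.values())
    if csv_total != odata_count:
        problems.append(
            f"CSV total {csv_total} != OData count {odata_count}; "
            "check the OData service account's permissions and filters"
        )
    return problems
```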

Export the resource pool BEFORE the project plans. If you do project plans first and discover a resource permission issue when you get to the resource step, you may need to re-export plans to pick up resource GUIDs that resolve correctly in the destination.

Once exported, the re-modelling and capacity-view migration into the target platform have their own playbook. See the Resource Capacity Planning After Project Online pillar for the platform-by-platform model coverage, the five-phase migration process, and post-migration validation checks.

Step-by-step: exporting custom fields & lookup tables

Enterprise Custom Fields (ECFs) are the silent killers of half-finished migrations. The field values ship inside .mpp and MSPDI XML. The field definitions (data type, lookup table, formula, rollup behaviour) live at the tenant level and do not. Export them separately, in their own pass, and treat them as schema rather than data.

  1. In PWA → Server Settings → Enterprise Custom Fields and Lookup Tables, capture screenshots of the full list for visual reference. (PWA has no native export-all button at this level.)
  2. For each lookup table, click into it and either Export to Excel from the table view or pull it via OData at /_api/ProjectData/LookupTables('<name>')/Entries. Save each table as a separate CSV with a stable filename.
  3. For each ECF, document: name, data type (Text / Number / Date / Cost / Duration / Flag), entity (Project / Task / Resource), lookup table reference, formula expression, rollup behaviour, default value. A simple spreadsheet column-per-attribute works.
  4. Also pull the OData ECF index at /_api/ProjectData/CustomFields for completeness. This returns the same metadata as a single relational dump, useful as the canonical machine-readable record.
  5. For ECFs with formulas, verify the formula expression in plain text. The formula syntax is Microsoft Project specific; you will need to translate it to your destination tool’s formula syntax during import.

A practical sequencing tip: pull the lookup tables first, then the field definitions, then the per-project values. Lookup tables are essentially controlled vocabularies; they almost never change without governance approval, so an early snapshot is safe. Field definitions reference the lookup tables, so they need to be exported with the table snapshot in hand. Per-project values reference the definitions, so they come last. If you reverse the order and pull values first, you risk an inconsistency window where a lookup row gets renamed mid-export and half your value snapshots reference the old token while the other half reference the new one. The export is still intact, but the reconciliation costs hours you did not budget.
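The consistency check implied by that ordering can be automated: after all three passes, confirm every field definition that names a lookup table actually has a table snapshot. A sketch with hypothetical record shapes (a `name` and optional `lookup_table` per definition; the real OData records carry more attributes):

```python
def check_lookup_references(field_defs: list, lookup_tables: list) -> list:
    """Return the names of ECF definitions whose lookup table is missing
    from the snapshot. A non-empty result means the tables changed
    mid-export and the pass should be re-run from the tables down."""
    snapshotted = set(lookup_tables)
    return [f["name"] for f in field_defs
            if f.get("lookup_table") and f["lookup_table"] not in snapshotted]
```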

Pay particular attention to ECFs that use formulas. Formula expressions are not portable: the syntax is Microsoft Project specific, and even within the Microsoft ecosystem the formulas do not round-trip cleanly between PWA and Project for the Web. Plan to translate every formula by hand into the destination tool’s expression language. For destinations that do not support formulas, decide up front whether the formula values are needed at import time (in which case capture the computed values alongside the source inputs and bring them across as static fields) or whether they are derived metrics you can recompute post-import.

Detailed coverage of the per-field migration mechanics, including how Onplana’s field mapper handles ECF translation, is in our custom fields migration deep-dive.

Step-by-step: exporting reporting & timesheet data

Reporting and timesheets are the most likely categories to be needed years after the tenant goes dark, for audits, for capitalisation reviews, for HR queries. Export them with that retention horizon in mind.

OData reporting feed

  1. Identify the OData entities you depend on. The full set is documented at ReportData OData reference. Common picks: Projects, Tasks, Assignments, Resources, TimeSet, TimesheetLines, IssuesAndRisks.
  2. Use $top=5000 + $skiptoken paging for each entity. The OData service throttles at the tenant level, plan for one entity at a time and do not run more than two endpoints concurrently.
  3. Save each entity as JSON (lossless) AND as CSV (for human + Excel access). Two formats double your storage but cut your future regret in half.
  4. Capture the OData metadata document at /_api/ProjectData/$metadata. This XML schema is your decode ring for any future re-import attempt.
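The paging loop in step 2 is the same for every entity, so write it once. A sketch independent of any HTTP library: `fetch_page` stands in for your authenticated GET, and the response shape is normalised to a hypothetical `"nextLink"` key; real OData responses carry the continuation link under a version-specific key (`@odata.nextLink` in OData v4, `__next` in older verbose JSON), so adapt the key to what your feed actually emits:

```python
def pull_entity(fetch_page, first_url: str) -> list:
    """Drain one OData entity by following continuation links until the
    service stops returning one. One request per iteration keeps the
    tenant-level throttling predictable."""
    rows, url = [], first_url
    while url:
        page = fetch_page(url)        # parsed JSON: {"value": [...], "nextLink": "..."}
        rows.extend(page["value"])
        url = page.get("nextLink")    # absent on the last page -> loop ends
    return rows
```

Running this for one entity at a time, with at most two entities in flight, matches the throttling guidance above.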

Timesheet history

  1. Identify the time period you need to retain. Most regulators require 7 years; finance often wants 10. Default to the longest of the applicable retention requirements.
  2. Pull /_api/ProjectData/TimesheetPeriods?$expand=Timesheets($expand=Lines) in monthly batches. This is the only path that gives you the line-item time entries with project + task + resource GUIDs intact.
  3. Save per-period as a single JSON file plus a flattened CSV. Hash each file (SHA-256) and store the hash alongside the file in a separate manifest.
  4. If you have a Reporting Database (on-prem Project Server only), additionally back up the MSP_TimesheetHistory_* tables via SSMS → Tasks → Generate Scripts (Schema and Data). Useful when the OData TimeSet entity has been thinned out by tenant retention policy.
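The monthly batching in step 2 needs correct month boundaries, which are fiddly enough to get wrong by hand. A self-contained sketch of the window generator (the function name is ours; feed each window into your `$filter` clause):

```python
from datetime import date, timedelta

def monthly_batches(start: date, end: date):
    """Yield (first_day, last_day) pairs covering [start, end] one calendar
    month at a time, clipped to the requested range at both ends."""
    y, m = start.year, start.month
    while (y, m) <= (end.year, end.month):
        ny, nm = (y + 1, 1) if m == 12 else (y, m + 1)
        first = date(y, m, 1)
        last = date(ny, nm, 1) - timedelta(days=1)  # last day of this month
        yield max(first, start), min(last, end)
        y, m = ny, nm
```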

Status reports + Power BI

  1. Status reports stored in PWA have no bulk export. Either save each report to PDF manually, or query the underlying tables via OData if your tenant exposes them, or accept the loss and recreate the reporting cadence in your destination tool.
  2. For Power BI dashboards built on the Project Online OData feed, download the .pbix file from Power BI Service → ... → Download this report. The connection string inside the .pbix points at your old tenant and will need rewiring after migration, but the visuals, measures, and bookmarks all survive.
  3. Document the dashboard inventory: report name, owner, last refresh date, OData endpoints it consumes. This is what you hand to whoever rebuilds the reports in the destination.

Validation: did your export actually capture everything?

Exports look fine until you try to read one back six months later and discover a silent truncation. Run the following validation checklist BEFORE you decommission the tenant, while you still have the source to reconcile against.

  • Project count: total .mpp files = total Projects entity rows in OData = total project rows in PWA Project Center.
  • Per-project task count: sample 10% of projects, open the .mpp, count tasks, compare to the OData Tasks count for that project GUID.
  • Resource count: CSV roster total = OData Resources count = PWA Resources page total.
  • Custom field schema: every ECF in PWA appears in your ECF spreadsheet AND in the OData CustomFields dump.
  • Lookup tables: per-table row count in PWA = per-table CSV row count.
  • Timesheet completeness: for each retained month, OData TimesheetPeriods row exists AND Lines export is non-empty for periods with submitted timesheets.
  • File hashes: SHA-256 of every export file recorded in the manifest. Re-hash after copy to long-term storage to confirm no transit corruption.
  • Round-trip test: open 5% of exported .mpp files in a clean Project Professional install. Confirm task count, finish date, and total work match source within 1%.
  • Round-trip test (destination): import 5% of exports into the chosen destination tool. Confirm dependencies and custom field values land cleanly.

21 weeks until deadline. Validation typically uncovers 2–3 issues per 100 projects. Allow a full week for re-export and re-validation after the first pass.

Storage & retention: where to keep exports

Exports are static, infrequently read, and need to outlive the tools that produced them. That points at cold object storage, not a SharePoint library. The right answer for most PMOs is a tiered approach: a hot copy for the first 6–12 months (the active migration window), and an archive copy for the regulatory retention horizon.

  • Hot tier (active migration): the storage attached to your destination tool, plus a working copy on a managed file share or shared drive your migration team can read.
  • Archive tier (long retention): Azure Blob (cool/archive), AWS S3 Glacier, or your existing on-prem records management system. All three give you 7+ year retention at pennies per GB-month.
  • Geographic redundancy: at least one copy in a different region from your primary. Cheap insurance against a regional outage when you need the records years from now.
  • Manifest + checksums: alongside the exports, store a CSV manifest of every file, its source project ID, last-modified date, and SHA-256 hash. This is what an auditor will ask for.
  • Encryption + access control: exports contain proprietary PMO data. Enforce server-side encryption at rest and restrict read access to a documented role.

Two patterns to avoid: keeping the only copy on a single team member’s laptop (one stolen device away from a permanent loss), and storing exports inside the same SharePoint or OneDrive tenant the source data lived in (this couples the records you need to retain to the platform you are decommissioning, which is exactly the situation that motivated the export in the first place). Cold object storage in a separately-billed account is the boring, correct answer. If your organisation already operates a records-retention platform with legal hold support, push the exports through that channel rather than spinning up a new bucket; the documented chain of custody is what auditors care about, not the storage tier.

Plan to revisit the archive once a year, even after migration is complete. Run a spot-check restore against 1% of the manifest, confirm files are still readable, confirm hashes still match. This is cheap to do and catches silent storage-tier corruption (rare but documented in long-tail Glacier and Blob archive incidents) before a real audit makes it expensive.
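The annual spot-check is a few lines once the manifest exists. A sketch, assuming manifest entries carry at least a `path` and a `sha256`, and with `read_file` standing in for whatever retrieves bytes from your archive tier (local share, Blob download, Glacier restore); none of these names come from a real storage SDK:

```python
import hashlib
import random

def spot_check(manifest: list, read_file, sample_rate: float = 0.01, seed=None) -> list:
    """Re-read a random ~1% of archived files and confirm each SHA-256
    still matches the hash recorded at export time. Returns the paths of
    entries that fail, i.e. silent corruption candidates."""
    rng = random.Random(seed)          # seed only to make runs reproducible
    k = max(1, round(len(manifest) * sample_rate))  # always check at least one file
    failures = []
    for entry in rng.sample(manifest, k):
        if hashlib.sha256(read_file(entry["path"])).hexdigest() != entry["sha256"]:
            failures.append(entry["path"])
    return failures
```

A non-empty result is the trigger to widen the sample and, if corruption is confirmed, restore from the geographically redundant copy.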

After export: migration paths

Export is the prerequisite, not the migration. Once you have validated exports in cold storage, the next decision is the destination. The honest options are: (a) the new Microsoft Planner / Project for the Web stack, (b) a third-party PM platform that reads .mpp natively, or (c) an interim archival posture with no live destination.

Onplana sits in option (b). The migration wizard accepts .mpp, MSPDI XML, and live OData connections directly from your Project Online tenant. If you want the full destination-agnostic playbook, see the Project Online Migration Complete Guide (the sibling pillar covering timeline, the five real migration paths, and post-migration validation), or the parent migration overview. If you are weighing Onplana against the other dedicated Project Online replacement vendor, the Onplana vs OnePlan comparison covers the architecture, pricing, and feature-coverage differences. If you are still building the executive case, the CFO-proof business case walks you through the pitch.

For broader context on the deadline and what is changing, see our Project Online end-of-life primer or the focused "Project Online retiring" announcement post. Deadline-specific tactics live in the data-export deadline canonical post.

Custom fields and lookup tables warrant their own playbook because they live at the tenant level and need to be exported separately from project-level data. Our custom fields migration deep-dive covers the schema export, formula re-implementation in the destination platform, and the per-project value reconciliation pass.


Don’t leave it to the last week

147 days left until Project Online retires. Inventory now, export by mid-summer, validate by Labor Day. You will sleep better than the PMOs that wait.

The date will not move.