Protocol Architecture
Methexis combines on‑chain coordination with off‑chain storage and compute.
Ethereum smart contracts keep the canonical record for datasets, model versions, rewards, slashing, and governance; heavy data storage and ML compute run off-chain on community hardware and decentralized storage networks.
1) On‑Chain Components (smart contracts)
Token Contract ($MTHX): ERC-20 with additional hooks for staking and reward distribution. Locking stake gives access to validator roles and enables slashing when rules are violated.
Data Registry Contract: Stores metadata only (e.g., IPFS/Arweave content hash, submitter, status flags). Each dataset is registered as pending and later marked accepted or rejected by the validation mesh.
Training Coordination Contract: Orchestrates training rounds. It forms validator committees via VRF, accepts model-update proposals (hash + metrics + signatures), verifies stake-weighted thresholds, finalizes the new currentModelHash, and triggers reward distribution.
Governance (DAO) Contract: Token-weighted proposals, voting, timelocks, and parameter updates (e.g., committee size, stake thresholds, reward split). Initial governance parameters include proposal thresholds, quorum, approval/supermajority requirements, voting period, and execution timelock.
Design goal: Keep on‑chain state minimal—store content hashes, model hashes, statuses, events; keep raw data and weights off‑chain for cost and scalability.
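To make the design goal concrete, the records below sketch roughly what the registry and training contracts persist; field names are illustrative, and Python is used only as notation for the sketch:

from dataclasses import dataclass
from enum import Enum

class Status(Enum):
    PENDING = 0
    ACCEPTED = 1
    REJECTED = 2

@dataclass
class DatasetRecord:
    content_hash: bytes    # IPFS/Arweave content hash; the raw data stays off-chain
    submitter: str         # contributor address
    status: Status         # Pending until the validation mesh accepts or rejects it

@dataclass
class RoundRecord:
    round_id: int
    prev_model_hash: bytes     # hash the next update must chain from
    current_model_hash: bytes  # set at finalization; the weights themselves stay off-chain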
2) Off‑Chain Components (storage & compute)
Decentralized Storage: Datasets and model artifacts are stored on IPFS/Arweave. The Data Registry keeps their content IDs so any node can fetch them and verify integrity against the on-chain hash.
Training Validators (compute providers): Community-run nodes that listen for Training Coordination Contract events, fetch approved data and model checkpoints, run training jobs, then co-sign a model update and submit it on-chain. Staking aligns incentives (a simplified node loop is sketched after this list).
Validation Workers (data-validation mesh): Stake-selected committees run automated pre-filters (license, anomaly, duplication, policy checks), produce a quality score, and write acceptance results back to the Data Registry. Randomized selection plus stake makes Sybil attacks costly.
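The node loop below is a simplified sketch of that training-validator flow. The storage, trainer, wallet, and chain objects, the event fields, and the assumption that the registered content hash is a plain SHA-256 digest of the artifact bytes are all illustrative, not part of the specified protocol:

import hashlib

def handle_round_started(event, storage, trainer, wallet, chain):
    """Illustrative reaction to a RoundStarted event; all helper objects are hypothetical clients."""
    # 1. Fetch the current model and the approved datasets referenced on-chain.
    model_bytes = storage.fetch(event["prev_model_hash"])
    data = [storage.fetch(h) for h in chain.approved_dataset_hashes(event["round_id"])]

    # 2. Integrity check: the fetched bytes must hash to the on-chain commitment
    #    (a plain SHA-256 digest is assumed here for simplicity).
    assert hashlib.sha256(model_bytes).digest() == event["prev_model_hash"], "model hash mismatch"

    # 3. Run the training job off-chain and hash the resulting checkpoint.
    new_model_bytes, metrics = trainer.train(model_bytes, data)
    new_model_hash = hashlib.sha256(new_model_bytes).digest()

    # 4. Publish the weights, co-sign the candidate update, and submit it to the
    #    Training Coordination Contract.
    storage.pin(new_model_bytes)
    signature = wallet.sign(event["round_id"].to_bytes(8, "big") + new_model_hash)
    chain.submit_model_update(event["round_id"], new_model_hash, signature, metrics)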
3) End‑to‑End Workflow
Contribute — Submit dataset to IPFS/Arweave and register its hash in the Data Registry as pending. A small stake/deposit can be required.
Validate — A randomized, stake‑weighted committee evaluates the dataset; status becomes accepted/rejected on‑chain.
Train — When a round starts, validators fetch the latest model & approved data, run training, and co‑sign a candidate update.
Finalize — The Training Contract verifies signatures/stake thresholds, updates currentModelHash, emits an event, and distributes rewards; slashing applies for misbehavior or non‑participation.
A (configurable) challenge window can be enabled as a safety hatch: if someone proves a violation (e.g., collusion or invalid update), the proposal is rejected and offenders are slashed.
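A sketch of how finalization and the optional challenge window can interact; the threshold, window length, and state names below are placeholders rather than decided parameters:

from enum import Enum

class ProposalState(Enum):
    PENDING = 0
    FINALIZED = 1
    REJECTED = 2

def try_finalize(proposal, current_block, committee_stake,
                 threshold_bps=6700, challenge_window_blocks=0):
    """Illustrative finalization rule; the threshold and window are governed parameters."""
    # A proven violation (e.g., collusion or an invalid update) rejects the proposal;
    # slashing of the offenders is handled separately.
    if proposal.get("proven_violation", False):
        return ProposalState.REJECTED
    # Stake-weighted threshold: signers must hold enough of the committee's stake.
    signed_stake = sum(proposal["signer_stakes"])
    if signed_stake * 10_000 < committee_stake * threshold_bps:
        return ProposalState.PENDING
    # Optional challenge window: the update stays contestable until the window closes.
    if current_block < proposal["proposed_at_block"] + challenge_window_blocks:
        return ProposalState.PENDING
    return ProposalState.FINALIZED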
4) Staking, Selection & Slashing
Staking thresholds (illustrative): 100,000 $MTHX to register as a validator; smaller bonds for dataset submission. Final values are set by governance.
Committee selection: randomized via VRF; to sway outcomes an attacker must control a majority of the stake in the committee, which is costly and unreliable (a selection sketch follows this list).
Slashing: penalties for false computation, chronic downtime, or malicious data; slashed tokens can be burned, redistributed, or sent to treasury per policy.
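One concrete way to derive such a committee (a sketch only; the production VRF and weighting scheme may differ): hash the public VRF output into a deterministic seed and draw validators without replacement with probability proportional to stake.

import hashlib
import random

def select_committee(vrf_output: bytes, stakes: dict, committee_size: int) -> list:
    """Deterministic, stake-weighted committee draw from a public VRF output (illustrative)."""
    # Anyone can re-derive the same committee from the same public VRF output.
    rng = random.Random(hashlib.sha256(vrf_output).digest())
    candidates = dict(stakes)  # address -> staked MTHX
    committee = []
    while candidates and len(committee) < committee_size:
        addrs = list(candidates)
        weights = [candidates[a] for a in addrs]
        picked = rng.choices(addrs, weights=weights, k=1)[0]  # probability proportional to stake
        committee.append(picked)
        del candidates[picked]  # draw without replacement
    return committee

# Example: an attacker with a minority of stake rarely dominates the draw.
stakes = {"0xA": 100_000, "0xB": 250_000, "0xC": 150_000, "0xD": 500_000}
print(select_committee(b"vrf-proof-bytes", stakes, committee_size=3))

Because the draw is reproducible from public inputs, committee membership can be audited after the fact.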
5) Rewards & Emissions (where executed)
When a round is accepted, contracts compute rewards on-chain and transfer $MTHX to training validators, data providers, and validation workers; a small maintenance/treasury cut (e.g., ~3%) may fund shared costs such as pinning and audits. The emission curve is encoded and controlled per tokenomics v2.
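For illustration, a round's reward computation could look like the sketch below. Only the ~3% treasury cut comes from the text above; the remaining split percentages are placeholders pending tokenomics v2 and governance:

def split_round_rewards(round_emission: float,
                        treasury_bps: int = 300,    # ~3% maintenance/treasury cut (example from the text)
                        trainer_bps: int = 6000,    # placeholder; the real split is governed
                        data_bps: int = 2500,       # placeholder
                        validation_bps: int = 1200):  # placeholder
    assert treasury_bps + trainer_bps + data_bps + validation_bps == 10_000
    return {
        "training_validators": round_emission * trainer_bps / 10_000,
        "data_providers":      round_emission * data_bps / 10_000,
        "validation_workers":  round_emission * validation_bps / 10_000,
        "treasury":            round_emission * treasury_bps / 10_000,
    }

# Example: a round that emits 100,000 MTHX
print(split_round_rewards(100_000))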
6) Security Model (summary)
Sybil resistance: staking + randomized committees.
Consensus & integrity: stake‑weighted signature thresholds on model updates; strict round/prev‑hash checks to avoid replays.
Challenge path (optional): short fraud‑proof window to contest invalid updates.
Governance safeguards: proposal thresholds, quorum, supermajority for critical changes, timelocks, and gradual decentralization.
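A minimal sketch of that replay protection, assuming the committee signs a digest that binds the round number and the previous model hash; the exact encoding is an implementation detail:

import hashlib

def update_digest(round_id: int, prev_model_hash: bytes, new_model_hash: bytes) -> bytes:
    # The committee signs a digest that binds the round number and the previous
    # model hash, so a once-valid signature cannot be replayed in a later round.
    return hashlib.sha256(round_id.to_bytes(8, "big") + prev_model_hash + new_model_hash).digest()

def extends_current_head(chain_state: dict, round_id: int, prev_model_hash: bytes) -> bool:
    # Strict round / prev-hash check: the proposal must extend the current head exactly.
    return (round_id == chain_state["current_round"]
            and prev_model_hash == chain_state["current_model_hash"])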
7) Example Interfaces (illustrative)
On-chain (Solidity-style sketches):
// Data Registry
enum Status { Pending, Accepted, Rejected }
function registerDataset(bytes32 contentHash, string calldata metaURI) external returns (uint256 datasetId);
function setDatasetStatus(uint256 datasetId, Status status) external;   // set to Accepted/Rejected by the validation mesh
// Training Coordination
function startRound(uint256 roundId) external;
function submitModelUpdate(uint256 roundId, bytes32 modelHash, bytes[] calldata signatures) external;
function finalizeRound(uint256 roundId) external;   // distributes rewards and updates currentModelHash
// Staking / Slashing
function stake(uint256 amount) external;
function unstake() external;
function slash(address offender, uint256 amount, Reason reason) external;
// Governance (DAO)
function propose(bytes calldata callData, string calldata description) external returns (uint256 proposalId);
function vote(uint256 proposalId, bool support, uint256 weight) external;
function queue(uint256 proposalId) external;
function execute(uint256 proposalId) external;
Events (minimal schema):
event DatasetRegistered(uint256 indexed datasetId, bytes32 contentHash, address submitter);
event DatasetStatusChanged(uint256 indexed datasetId, Status status);
event RoundStarted(uint256 indexed roundId, bytes32 prevModelHash);
event ModelProposed(uint256 indexed roundId, bytes32 modelHash, address[] signers, string metricsURI);
event ModelFinalized(uint256 indexed roundId, bytes32 modelHash);
event RewardsDistributed(uint256 indexed roundId, uint256 total, string breakdownURI);
event Slashed(address indexed account, uint256 amount, Reason reason);
These interfaces illustrate how we keep hashes and statuses on‑chain and large artifacts off‑chain (URIs/hashes).
8) Telemetry & Observability
Validators expose basic metrics (uptime, participation, result agreement). A simple on‑chain/off‑chain dashboard lets stakers and governance monitor reliability, latency, and throughput. (Implements the “transparent performance” principle from the whitepaper.)
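An illustrative shape for the per-validator telemetry snapshot; field names are placeholders, not a finalized schema:

from dataclasses import dataclass

@dataclass
class ValidatorMetrics:
    """Illustrative per-validator telemetry snapshot."""
    uptime_pct: float         # share of health checks answered over the window
    participation_pct: float  # rounds joined divided by rounds the node was eligible for
    agreement_pct: float      # submissions matching the finalized outcome
    avg_latency_ms: float     # time from the RoundStarted event to the submitted signature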
9) Future: Cryptographic Verification (zk‑ML)
The roadmap contemplates zk-proofs or lighter spot-check schemes (VRF-chosen minibatches, committed intermediates) so a proposer can prove "the update was computed correctly" without everyone re-running training. Adoption depends on the maturity of zk-ML.
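A sketch of the lighter spot-check variant, assuming the proposer commits to an intermediate state after every minibatch: a public VRF output selects a handful of minibatches, and checkers re-execute only those and compare against the commitments.

import hashlib

def spot_check_indices(vrf_output: bytes, num_minibatches: int, k: int) -> list:
    """Derive k distinct pseudo-random minibatch indices from a public VRF output (illustrative)."""
    k = min(k, num_minibatches)
    indices, counter = [], 0
    while len(indices) < k:
        digest = hashlib.sha256(vrf_output + counter.to_bytes(4, "big")).digest()
        idx = int.from_bytes(digest, "big") % num_minibatches
        if idx not in indices:
            indices.append(idx)
        counter += 1
    return indices

def passes_spot_checks(committed_intermediates: list, recompute, indices: list) -> bool:
    # The checker re-runs only the sampled minibatches and compares each result
    # against the intermediate commitment the proposer published for that step.
    return all(recompute(i) == committed_intermediates[i] for i in indices)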
Appendix: Parameter Surface (governed)
Committee size; signature threshold (M‑of‑N / stake‑weight %).
Stake thresholds; slashing percentages; cooldowns.
Reward split targets; maintenance/treasury rate; emission curve.
Governance thresholds: proposal, quorum, supermajority; timelock durations.
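For reference, the surface above can be collected into a single governed configuration. Every default in the sketch below is an illustrative placeholder, not a decided value; reward split targets and the emission curve would live alongside these fields:

from dataclasses import dataclass

@dataclass
class GovernedParams:
    """The parameter surface above as one config; all defaults are illustrative placeholders."""
    committee_size: int = 21
    signature_threshold_bps: int = 6700    # stake-weight %, or an M-of-N equivalent
    validator_stake_mthx: int = 100_000    # illustrative figure from section 4
    submission_bond_mthx: int = 1_000      # placeholder
    slashing_bps: int = 1_000              # share of stake slashed per offense (placeholder)
    unstake_cooldown_days: int = 7         # placeholder
    treasury_bps: int = 300                # ~3% maintenance/treasury rate from section 5
    proposal_threshold_bps: int = 100      # share of supply needed to open a proposal (placeholder)
    quorum_bps: int = 400                  # placeholder
    supermajority_bps: int = 6700          # for critical changes (placeholder)
    voting_period_days: int = 5            # placeholder
    timelock_days: int = 2                 # placeholder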