
Methodology

The Electrum Observatory uses an active-measurement approach to analyze the structure, behavior, and privacy properties of the Electrum server ecosystem. All scanning and data collection interact only with publicly reachable servers, and no user data is generated or accessed at any point.

1. Overview

The methodology combines network discovery, metadata extraction, behavioral fingerprinting, certificate analysis, and clustering techniques. The goal is to detect centralization patterns, unusual server behavior, and potential surveillance or honeypot indicators across the Electrum ecosystem.

Seed Server Selection

Electrum servers do not maintain a peer-to-peer structure among themselves. Instead, each server publishes a list of recommended peers through the server.peers.subscribe RPC method. Network discovery therefore begins by querying a single seed server, which returns the initial set of peers.
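As a concrete illustration of this step, the sketch below builds the newline-delimited JSON-RPC request for server.peers.subscribe and parses the reply, in which each peer is reported as [ip, hostname, [features...]] with features such as "s50002" (SSL port) and "t50001" (TCP port). The function names are our own; transport, error handling, and the bare "s"/"t" default-port features are omitted for brevity.

```python
import json

def peers_request(request_id: int = 0) -> bytes:
    """Build a newline-terminated JSON-RPC request for server.peers.subscribe."""
    msg = {"id": request_id, "method": "server.peers.subscribe", "params": []}
    return (json.dumps(msg) + "\n").encode()

def parse_peers(raw: bytes) -> list[dict]:
    """Parse a peer-list reply. Each entry is [ip, hostname, [features...]];
    features like "s50002" / "t50001" advertise SSL and TCP ports."""
    reply = json.loads(raw)
    peers = []
    for ip, host, features in reply["result"]:
        entry = {"ip": ip, "host": host, "ssl_port": None, "tcp_port": None}
        for f in features:
            if f.startswith("s") and f[1:].isdigit():
                entry["ssl_port"] = int(f[1:])
            elif f.startswith("t") and f[1:].isdigit():
                entry["tcp_port"] = int(f[1:])
        peers.append(entry)
    return peers
```

In practice the request is written to an SSL (or plain TCP) socket and the response read until a newline, since the Electrum protocol frames messages one JSON object per line.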

The Electrum Observatory uses a recursive discovery crawler that starts from one seed server, connects to each returned peer, and continues the process up to a predetermined depth. The seed host acts as the entrypoint into the public Electrum network.

Seeds Used in This Study

  • Primary Seed: electrum3.bluewallet.io:50002
  • Fallback Seed: electrum.blockstream.info:50002

Choosing well-maintained and stable seed servers ensures consistent discovery and avoids topological bias caused by unreliable or misconfigured nodes.

[Figure: Electrum network discovery flow diagram]

2. Network Discovery

Once the seed server is selected, the crawler retrieves its peer list and recursively connects to each peer. This process yields the widest reachable view of the Electrum network without relying on external or crowdsourced server lists.

  • Retrieve peers via server.peers.subscribe
  • Connect using SSL first; fall back to TCP if necessary
  • Extract basic metadata on connection
  • Queue each discovered peer until depth limit is reached
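The steps above amount to a breadth-first crawl with a depth cutoff. A minimal sketch follows; `get_peers` stands in for the network I/O (SSL-first connection plus the peer query) and is a hypothetical parameter, not part of the Electrum protocol itself.

```python
from collections import deque
from typing import Callable

def crawl(seed: str, get_peers: Callable[[str], list[str]], max_depth: int = 3) -> set[str]:
    """Breadth-first discovery starting from a single seed server.

    `get_peers` is a caller-supplied function that connects to a host
    and returns the peers it advertises via server.peers.subscribe.
    """
    seen = {seed}
    queue = deque([(seed, 0)])
    while queue:
        host, depth = queue.popleft()
        if depth >= max_depth:  # stop expanding at the depth limit
            continue
        for peer in get_peers(host):
            if peer not in seen:
                seen.add(peer)
                queue.append((peer, depth + 1))
    return seen
```

Tracking `seen` before enqueueing prevents re-scanning servers that appear in many peer lists, which is common for large, well-connected operators.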

[Figure: Global Network Map]

3. Metadata Scanning

Once connected to a server, the scanner records protocol information, TLS certificate details, latency, feature flags, and capabilities. All metadata is stored with timestamps for longitudinal analysis.

  • Banner string & server software version
  • Protocol negotiation level
  • SSL certificate fingerprint & issuer
  • Supported features (e.g., mempool status)
  • Latency and timing characteristics
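One of the recorded fields, the certificate fingerprint, can be derived directly from the DER-encoded certificate that `ssl.SSLSocket.getpeercert(binary_form=True)` returns. A small sketch (the function name is ours):

```python
import hashlib

def cert_fingerprint(der_cert: bytes) -> str:
    """SHA-256 fingerprint of a DER-encoded certificate, rendered in the
    colon-separated uppercase form common in TLS tooling."""
    digest = hashlib.sha256(der_cert).hexdigest().upper()
    return ":".join(digest[i:i + 2] for i in range(0, len(digest), 2))
```

Because the fingerprint is stable across scans, it doubles as a key for detecting certificate reuse across otherwise unrelated hosts in later analysis stages.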

[Figure: Methodology cycle]

4. Behavioral Fingerprinting

The crawler performs controlled probes to evaluate how each server responds to various address formats, malformed queries, and repetitive requests. Deviations from standard Electrum behavior may indicate modified implementations, analytics filters, or honeypot-like behavior.

  • Standard queries (balance, history, headers)
  • Address type variation (P2PKH, P2WPKH, Taproot)
  • Malformed RPC injection
  • Xpub-derived scripthash queries
  • Timing variance and latency fingerprinting
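The address-type and xpub-derived probes above all key on the Electrum protocol's scripthash: the SHA-256 of an output script, byte-reversed before hex encoding, as used by methods such as blockchain.scripthash.get_balance. A minimal sketch (address-to-script conversion is omitted):

```python
import hashlib

def electrum_scripthash(script_pubkey: bytes) -> str:
    """Electrum protocol scripthash: SHA-256 of the scriptPubKey,
    with the digest byte-reversed before hex encoding."""
    return hashlib.sha256(script_pubkey).digest()[::-1].hex()
```

Generating scripthashes for P2PKH, P2WPKH, and Taproot outputs lets the crawler issue identical query shapes across address types and compare how each server responds.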

5. Data Normalization & Storage

Raw server responses are merged, standardized, and converted into structured datasets for reproducible analysis. TLS fields, behavioral measurements, and GeoIP attributes are normalized across all runs.

  • Deduplication of multiple scan passes
  • Certificate field canonicalization
  • Behavioral hash computation
  • GeoIP attribution
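The behavioral hash mentioned above can be computed by canonicalizing the measurement record before hashing, so that logically identical results from different scan passes deduplicate to the same value. A sketch under that assumption:

```python
import hashlib
import json

def behavioral_hash(measurements: dict) -> str:
    """Hash a server's normalized behavioral measurements into a stable
    fingerprint. Sorting keys and stripping whitespace makes the JSON
    canonical, so key order in the source record does not matter."""
    canonical = json.dumps(measurements, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode()).hexdigest()
```

Two scan passes that observed the same behavior then produce the same hash, which makes deduplication a simple set operation.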

6. Honeypot Indicators

Servers exhibiting suspicious patterns — such as certificate reuse, malformed JSON signatures, unusual timing behavior, or selective failure cases — are flagged for deeper analysis. The goal is not to accuse operators, but to identify anomalous infrastructure that warrants investigation.
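The flagging logic can be expressed as a simple indicator count over those patterns. The sketch below is hypothetical: the field names and thresholds are illustrative placeholders, and real cutoffs would be derived from the measured population baseline.

```python
def honeypot_score(server: dict, shared_cert_counts: dict) -> int:
    """Count how many honeypot indicators a server trips.

    `server` holds normalized scan fields; `shared_cert_counts` maps a
    certificate fingerprint to the number of hosts presenting it.
    Thresholds here are illustrative, not calibrated values.
    """
    score = 0
    # Certificate reused across many unrelated hosts
    if shared_cert_counts.get(server.get("cert_fingerprint"), 0) > 5:
        score += 1
    # JSON responses that deviate from stock implementations
    if server.get("malformed_json"):
        score += 1
    # Timing behavior far outside the population norm
    if server.get("latency_stddev_ms", 0) > 500:
        score += 1
    # Selective failures for particular address types or query shapes
    if server.get("selective_failures"):
        score += 1
    return score
```

Servers exceeding a chosen score are queued for manual review rather than labeled outright, consistent with the non-accusatory framing above.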

[Figure: Honeypot Indicators]

7. Output & Reporting

Final outputs include the global server dataset, behavioral fingerprints, certificate analysis, clustering results, and privacy-risk assessments. All reports are available as Jupyter notebooks and visual summaries.

8. Ethical Standards

All scanning adheres to strict ethical guidelines. Only publicly reachable Electrum servers are queried. No deanonymization techniques are used, and no wallet or user data is collected. The objective is transparency — not exploitation.