Claude built me a scanner to find perfect wallets for copy trading. here's what we got
i described what i needed to Claude: scan any Polymarket market, pull every wallet that traded on it, analyze each wallet's full history, classify it by trading type, and score it for copy-trading potential. we had a working scanner with a visual UI in one day
but the interesting part isn't the tool. it's what we found when we actually ran it - 90% of wallets on the Polymarket leaderboard will lose you money if you copy them
Polymarket does over $1B in monthly volume right now. thousands of wallets trade every day. the leaderboard shows top performers by PnL - and the first instinct of any copy-trader is to go to the leaderboard, find the wallet with the biggest profit and start copying
i did that. result - minus $340 in two weeks
the problem isn't that the leaderboard lies. the numbers are real. the problem is that PnL by itself tells you nothing about whether you should copy that wallet. $500K in profit could be one lucky bet on elections. or it could be 200 consistent positions at $2-5K each
these are two completely different traders. and if you copy the first one - you'll copy his next $300K bet into a single market. good luck
i decided to approach this systematically and wrote a script that analyzes wallets through the Polymarket API and classifies them by trading type. in this article - three types i found, why only one of them is worth copying, and the code that does all the work for you
how to tell traders apart: 5 metrics
before classifying - you need to know what to look at. i use 5 metrics for every wallet:
1. win rate - percentage of profitable positions out of all closed ones
2. position sizing CV - coefficient of variation of position sizes. that's standard deviation divided by the mean. CV = 0.5 means the trader bets roughly equal amounts. CV = 3.0 means one trade is $50, the next one is $50,000
CV = std(position_sizes) / mean(position_sizes)
- CV < 0.5 -> very consistent sizing
- CV 0.5-1.5 -> normal spread
- CV > 2.0 -> chaotic, bets are random
3. number of markets - how many different markets the wallet trades on. 3 markets = concentrated bets. 30+ markets = diversification
4. max concentration - what percentage of the portfolio is in the single largest position. 60% in one market - that's not trading, that's a casino
5. PnL % - profit as a percentage of total invested. $10K PnL on a $1M portfolio is +1%. $10K PnL on $50K is +20%. the second trader is 20x more efficient
all of this data can be pulled through the public Polymarket API - no keys, free
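of the five metrics, sizing CV is the least familiar, so here's a quick standalone sketch with toy numbers (not real wallets) showing how it separates consistent sizing from chaotic sizing:

```python
from statistics import mean, pstdev

def sizing_cv(position_sizes):
    """Coefficient of variation: population std dev divided by the mean."""
    return pstdev(position_sizes) / mean(position_sizes)

# consistent sizer: every bet between $2K and $6K
consistent = [2000, 3000, 4000, 5000, 6000]
# chaotic sizer: $50 one day, $50,000 the next
chaotic = [50, 200, 1000, 8000, 50000]

print(round(sizing_cv(consistent), 2))  # low CV -> predictable sizing
print(round(sizing_cv(chaotic), 2))     # high CV -> bets are all over the place
```

note that CV is scale-free: a trader betting a consistent $200 and one betting a consistent $20,000 both get a low CV, which is exactly what you want when comparing wallets of different sizes.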
type 1: whale gambler
you open the leaderboard and see a wallet with PnL +$200K. you go to the profile - 5 positions. average size $150,000. one market takes up 60% of the portfolio
- few positions (usually < 15)
- huge amounts ($50K+ per trade)
- max concentration > 30% in one market
- win rate is irrelevant (with 5 trades statistics are meaningless)
- sizing CV is high or irrelevant
here's how it's detected in code:
def classify_trader(stats):
    # whale gambler: few markets, high concentration, big amounts
    if stats["max_concentration"] > 0.3 and stats["positions"] < 15 and stats["total_invested"] > 10000:
        return "whale_gambler"
real example from the Russia x Ukraine ceasefire market ($12M volume):
wallet: 0xa53e...
positions: 5
markets: 5
invested: $156,390
PnL: -$30,269 (-19.4%)
win rate: 0%
max position: $94,734 (60% of portfolio)
sizing CV: 1.08
5 positions, zero wins, 60% of the portfolio in one bet. a classic whale gambler. if you had started copying him a month ago, you'd have lost 19% of your bankroll
why you can't copy him: a whale gambler has no edge. he has money and an opinion. sometimes the opinion turns out to be right and he shows up on top of the leaderboard. but this is survivorship bias - you see those who guessed right, not the hundreds of identical ones who lost everything
expected result of copying a whale gambler:
- you repeat his $50K bet into one market (proportional to your bankroll)
- 50% of the time it works, 50% it doesn't
- but you can't survive a series of 2-3 losses because positions are too large
- drawdown of 40-60% after the very first loss
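the drawdown here is just compounding. a sketch assuming you mirror his concentration, i.e. the position is a fixed share of your bankroll and a loss wipes the whole position:

```python
def bankroll_after_losses(start, position_frac, n_losses):
    """Bankroll left after n consecutive total losses at a fixed fraction per bet."""
    bankroll = start
    for _ in range(n_losses):
        bankroll -= bankroll * position_frac
    return bankroll

# mirroring a whale who puts 50% of the portfolio into one market:
print(bankroll_after_losses(1000, 0.50, 1))  # 500.0 -> 50% drawdown after one loss
print(bankroll_after_losses(1000, 0.50, 2))  # 250.0 -> 75% after two
# vs a systematic trader's 5% position:
print(round(bankroll_after_losses(1000, 0.05, 2), 2))  # 902.5 -> 9.75% after two
```

two losses at whale sizing and you need a +300% run just to get back to even. two losses at systematic sizing is a bad week, nothing more.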
type 2: scalper
different pattern - a wallet with 500+ positions but average size $20-50. lots of small trades, fast turnover. in the profile you see hundreds of rows, often on the same markets - bought, sold, bought, sold
- many positions (50+)
- small average size (< $100 per trade)
- frequently buys and sells on the same market
- sizing CV is usually high (> 2.0) because scaling is chaotic
- PnL can be positive but margins are thin
    # scalper: many positions but small average size
    if stats["positions"] > 50 and stats["total_invested"] / stats["positions"] < 100:
        return "scalper"
why you can't copy him: a scalper makes money on speed. he sees the market moving, buys, waits 5-10 minutes for the price to catch up, sells. his edge is timing
when you copy a scalper through a bot - you enter with a delay. even 3-5 seconds on a copy-trading bot is already edge loss:
example: scalper buys Yes at 52c, edge = 8c (target price 60c)

delay       entry price   your edge   edge loss
0 sec       52c           8c          0%
3-5 sec     53c           7c          -12.5%
10-15 sec   56-57c        3-4c        -50%+

the scalper makes $4 on a $100 trade. you make $2 after the delay. minus fees, you're at zero or negative
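the arithmetic behind those rows is one subtraction and one ratio. a sketch (the price drift per second is an illustrative assumption, not measured data):

```python
def remaining_edge(entry_price, target_price, delayed_price):
    """Edge left for a copier who fills at delayed_price instead of entry_price."""
    original_edge = target_price - entry_price
    copier_edge = target_price - delayed_price
    loss_pct = (original_edge - copier_edge) / original_edge * 100
    return copier_edge, loss_pct

# scalper buys Yes at 52c targeting 60c; the copier fills at 53c a few seconds later
edge, lost = remaining_edge(0.52, 0.60, 0.53)
print(f"copier edge: {edge:.2f}, edge lost: {lost:.1f}%")  # copier edge: 0.07, edge lost: 12.5%
```

the key variable is how fast the price drifts relative to the size of the edge. scalper edges are small and the drift is fast, so even seconds matter.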
a scalper is like a Formula 1 driver. he can do the track in 1:20. but if you repeat his line with a half-second delay on every turn - you'll finish last or crash
type 3: systematic trader
and there's the third type. a wallet with 30-200 positions, all roughly the same size ($2-10K), across 20+ different markets. win rate 55-70%. PnL grows smoothly, without one giant spike
- 20+ positions (enough data to evaluate)
- 10+ different markets (diversification)
- win rate 55%+ (real edge exists)
- sizing CV < 1.5 (consistent position sizes)
- max concentration < 40% (no all-in bets)
- PnL grows steadily, not in one jump
    # systematic: good win rate + consistent sizing + diversification
    if stats["win_rate"] > 55 and stats["size_cv"] < 1.5 and stats["markets"] > 10:
        return "systematic"
why this one is worth copying: a systematic trader has edge that doesn't depend on speed. he analyzes the market, enters a position and holds it for days or weeks. a 3-5 second delay when copying doesn't eat his edge because he trades on a timeframe of days, not minutes
example: a systematic trader buys Yes at 40c on "ceasefire by end of 2026". he puts the real probability at 55%, so edge = 15c. position size: $5,000

delay       entry price   your edge   edge loss
0 sec       40c           15c         0%
3-5 sec     40.01c        14.99c      ~0%
10-15 sec   40.05c        14.95c      ~0.3%

the copy-trader's edge is nearly identical to the original
delay when copying doesn't matter when the trader holds positions for weeks. this is the key difference
how i find systematic traders: the script
i wrote a script in Python that searches for wallets through the public Polymarket API and filters them by systematic trader criteria
step 1: get market data
the Polymarket API is public and free. two endpoints give you everything you need:
import requests

GAMMA_URL = "https://gamma-api.polymarket.com"
DATA_URL = "https://data-api.polymarket.com"

# get market by slug (from the URL on polymarket.com)
def get_market_data(slug):
    resp = requests.get(f"{GAMMA_URL}/events?slug={slug}")
    events = resp.json()
    markets = events[0]["markets"]
    return [{
        "question": m["question"],
        "conditionId": m["conditionId"],
        "volume": m["volume"],
    } for m in markets]
the market slug is the last part of the URL. for example, for polymarket.com/event/russia-x-ukraine-ceasefire-before-2027 the slug = russia-x-ukraine-ceasefire-before-2027
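if you'd rather paste the whole URL than copy the slug by hand, pulling it out is a one-liner. a small helper (my addition, not part of the original script):

```python
from urllib.parse import urlparse

def slug_from_url(url):
    """Return the last path segment of a polymarket.com event URL."""
    path = urlparse(url).path.rstrip("/")
    return path.split("/")[-1]

url = "https://polymarket.com/event/russia-x-ukraine-ceasefire-before-2027"
print(slug_from_url(url))  # russia-x-ukraine-ceasefire-before-2027
```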
step 2: find wallets that trade on the market
# download the last trades on the market (500 by default)
def get_trades(condition_id, limit=500):
    resp = requests.get(
        f"{DATA_URL}/trades",
        params={"market": condition_id, "limit": limit},
    )
    return resp.json()

# aggregate by wallet
def extract_wallets(trades):
    wallets = {}
    for t in trades:
        addr = t["proxyWallet"]
        if addr not in wallets:
            wallets[addr] = {"trades": 0, "volume": 0}
        wallets[addr]["trades"] += 1
        wallets[addr]["volume"] += t["size"]
    return sorted(wallets.items(), key=lambda x: x[1]["volume"], reverse=True)
every trade contains proxyWallet - the wallet address, size - size in USDC, side - BUY or SELL. we aggregate trades by wallet and sort by volume
step 3: analyze each wallet
the most important endpoint is positions. it returns all positions of a wallet across all markets:
def get_wallet_positions(address):
    resp = requests.get(
        f"{DATA_URL}/positions",
        params={
            "user": address,
            "limit": 500,
            "sortBy": "CASHPNL",
            "sortOrder": "desc",
        },
    )
    return resp.json()
each position includes:
- initialValue - how much was invested
- currentValue - how much it's worth now
- cashPnl - profit/loss in USDC
- avgPrice - average entry price
- conditionId - which market
- title - market name
from this we calculate all 5 metrics:
def analyze_wallet(address):
    positions = get_wallet_positions(address)

    wins = sum(1 for p in positions if p["cashPnl"] > 0)
    losses = sum(1 for p in positions if p["cashPnl"] < 0)
    win_rate = wins / (wins + losses) * 100 if wins + losses else 0

    sizes = [p["initialValue"] for p in positions]
    avg_size = sum(sizes) / len(sizes)
    std_dev = (sum((x - avg_size) ** 2 for x in sizes) / len(sizes)) ** 0.5
    size_cv = std_dev / avg_size  # coefficient of variation

    markets = len(set(p["conditionId"] for p in positions))
    max_concentration = max(sizes) / sum(sizes)
    total_pnl = sum(p["cashPnl"] for p in positions)

    return {
        "win_rate": win_rate,
        "size_cv": size_cv,
        "markets": markets,
        "max_concentration": max_concentration,
        "total_invested": sum(sizes),
        "total_pnl": total_pnl,
        "positions": len(positions),
    }
step 4: filtering
# criteria for a "good" wallet for copying
MIN_POSITIONS = 20 # minimum positions for evaluation
MIN_MARKETS = 10 # minimum different markets
MIN_WIN_RATE = 55.0 # win rate %
MAX_SIZE_CV = 1.5 # sizing consistency
MIN_INVESTED = 1000 # filter out bots with $1
MAX_CONCENTRATION = 0.4 # max 40% in one position
def is_good_wallet(stats):
return (
stats["positions"] >= MIN_POSITIONS
and stats["markets"] >= MIN_MARKETS
and stats["win_rate"] >= MIN_WIN_RATE
and stats["size_cv"] <= MAX_SIZE_CV
and stats["max_concentration"] <= MAX_CONCENTRATION
)
out of 25 wallets on a single market usually 2-4 pass all filters. most fail on win rate or sizing CV
scoring: who to copy first
when you have 3-4 wallets that passed the filters - you need to pick the best one. i use a score from 0 to 100:
def compute_score(stats):
    score = 0
    # win rate (0-30 points)
    if stats["win_rate"] >= 70: score += 30
    elif stats["win_rate"] >= 60: score += 25
    elif stats["win_rate"] >= 55: score += 20
    # PnL % (0-25 points)
    pnl_pct = stats["total_pnl"] / stats["total_invested"] * 100
    if pnl_pct > 20: score += 25
    elif pnl_pct > 10: score += 20
    elif pnl_pct > 5: score += 15
    # consistency (0-20 points)
    if stats["size_cv"] < 0.5: score += 20
    elif stats["size_cv"] < 1.0: score += 15
    elif stats["size_cv"] < 1.5: score += 10
    # diversification (0-15 points)
    if stats["markets"] >= 30: score += 15
    elif stats["markets"] >= 20: score += 12
    elif stats["markets"] >= 10: score += 8
    # concentration penalty
    if stats["max_concentration"] > 0.5: score -= 15
    elif stats["max_concentration"] > 0.3: score -= 10
    # bonus for data quantity
    if stats["positions"] >= 50: score += 10
    elif stats["positions"] >= 30: score += 7
    return max(0, min(100, score))
- score 70+ - excellent candidate for copying
- score 50-70 - can copy with caution
- score below 50 - better to skip
the full pipeline
putting it all together. enter a market slug - get a list of wallets with classification and score:
def scan_market(slug, top_n=25):
    # 1. get the market
    markets = get_market_data(slug)

    # 2. collect all trades and find unique wallets
    all_wallets = {}
    for m in markets:
        trades = get_trades(m["conditionId"])
        for addr, stats in extract_wallets(trades):
            if addr not in all_wallets:
                all_wallets[addr] = stats

    # 3. take the top N by volume
    top = sorted(all_wallets.items(), key=lambda x: x[1]["volume"], reverse=True)[:top_n]

    # 4. analyze each one
    results = []
    for addr, _ in top:
        stats = analyze_wallet(addr)
        stats["address"] = addr
        stats["type"] = classify_trader(stats)
        stats["score"] = compute_score(stats)
        results.append(stats)

    # 5. filter and sort
    good = [r for r in results if is_good_wallet(r)]
    good.sort(key=lambda x: x["score"], reverse=True)
    return good
the full script (with multi-market scanning, rate limiting and clean output) is available as a separate file - link at the end
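the rate limiting doesn't need to be anything fancy - a minimum gap between requests is enough. a minimal sketch (the 0.5s interval is my guess at a polite pace; as far as i know Polymarket doesn't publish a hard limit):

```python
import time

class Throttle:
    """Enforce a minimum interval between consecutive API calls."""
    def __init__(self, min_interval=0.5):
        self.min_interval = min_interval
        self._last = 0.0

    def wait(self):
        # sleep only if the previous call was less than min_interval ago
        elapsed = time.monotonic() - self._last
        if elapsed < self.min_interval:
            time.sleep(self.min_interval - elapsed)
        self._last = time.monotonic()

# usage: create one Throttle and call throttle.wait() before every requests.get(...)
```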
you can scan multiple markets at once:
slugs = [
    "russia-x-ukraine-ceasefire-before-2027",
    "presidential-election-winner-2024",
    "fed-decision-in-january",
]

for slug in slugs:
    results = scan_market(slug, top_n=20)
the more markets you scan - the more unique wallets you find. the same systematic trader can be trading on 10+ markets simultaneously
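one easy way to surface those cross-market traders is to merge the per-slug results by wallet address. a sketch, assuming each result dict carries the wallet address under an "address" key:

```python
from collections import defaultdict

def merge_scans(per_slug_results):
    """Group scan results by wallet address across markets, most markets first."""
    by_wallet = defaultdict(list)
    for slug, results in per_slug_results.items():
        for r in results:
            by_wallet[r["address"]].append(slug)
    return sorted(by_wallet.items(), key=lambda kv: len(kv[1]), reverse=True)

# toy data in the same shape as {slug: scan_market(slug)}:
per_slug = {
    "market-a": [{"address": "0xaaa"}, {"address": "0xbbb"}],
    "market-b": [{"address": "0xaaa"}],
}
for addr, slugs in merge_scans(per_slug):
    print(addr, slugs)  # 0xaaa first - it passed filters on two markets
```

a wallet that independently passes the filters on several unrelated markets is stronger evidence of real edge than one that passes on a single market.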
what to do after you find a wallet
found 3-4 wallets with a high score. next steps:
- paper trading - put them on tracking and record trades in a spreadsheet without real money. observe for a week or two
- check on hashdive - go to hashdive.com, enter the address, look at the PnL charts. you need a smooth upward curve, not a single spike. hashdive shows history that's not easy to pull through the API
- copy on small amounts - start with $20-50 per trade. if after 2 weeks the result matches what the script showed - scale up
- monitoring - a systematic trader can stop being systematic. win rate can drop, sizing can become chaotic. you need to run the script regularly
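for the paper-trading step, a spreadsheet works, but it's just as easy to append rows to a CSV from the same script. a minimal sketch - the filename and columns are my choice, adjust to taste:

```python
import csv
from datetime import datetime, timezone
from pathlib import Path

LOG = Path("paper_trades.csv")
FIELDS = ["time", "wallet", "market", "side", "price", "size"]

def log_trade(wallet, market, side, price, size):
    """Append one observed trade to the paper-trading log."""
    write_header = not LOG.exists()
    with LOG.open("a", newline="") as f:
        w = csv.DictWriter(f, fieldnames=FIELDS)
        if write_header:
            w.writeheader()
        w.writerow({
            "time": datetime.now(timezone.utc).isoformat(),
            "wallet": wallet, "market": market,
            "side": side, "price": price, "size": size,
        })

# record a trade you observed, without putting real money on it
log_trade("0xa53e...", "ceasefire-before-2027", "BUY", 0.40, 50)
```

after a week or two you can compare the logged entry prices against where those markets ended up and see whether the wallet's edge survives your delay before risking a dollar.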
i'm not publishing the wallets that passed my filters here for obvious reasons - the more people copy one wallet the less edge each copy-trader has. but the script is open - you can run it and find your own
summary
the only type worth copying is systematic. his edge doesn't depend on entry speed, his sizing is predictable, his results are reproducible
the script does all the work: finds wallets, analyzes, classifies, filters, scores. all you need to do is pick a market and run it
once i find the wallets i want to copy - i track and execute through Kreo. it monitors wallet activity in real time and copies trades automatically the moment a position opens. i've been using it for 3 weeks now and the latency is consistently under 5 seconds which is exactly what you need for systematic traders
Kreo bot: https://t.me/KreoPolyBot?start=ref-danko
all code works through the public Polymarket API - no keys, no registration
built with Claude - from idea to working scanner in one session