RankBoost
Find the best App Store keywords to rank higher and get more downloads
The fatal flaw: a product entirely dependent on scraping a platform that actively fights scrapers. Enterprise ASO competitors spend six figures on scraping infrastructure; I was spending $300/month and hoping for the best.
2/10 revival potential
The story
What was built
RankBoost was a web-based App Store Optimization (ASO) tool for indie iOS developers. You'd enter your app's name and category, and it would analyze keyword competition, suggest better keywords, show ranking estimates, and track your position over time. The tool scraped App Store search results daily to build a keyword difficulty database. There was a free tier (5 keyword lookups) and a paid tier ($49.99 one-time for unlimited lookups and position tracking). I targeted indie developers who couldn't afford enterprise ASO tools like Sensor Tower or App Annie.
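A keyword difficulty score like this is typically a weighted heuristic over a few competition signals. Here is a minimal sketch in Python, under stated assumptions: the inputs (result count, average rating count of the top-10 apps, and the fraction of top-10 titles containing the keyword) and the weights are illustrative, not RankBoost's actual model, which isn't described in detail.

```python
import math

def keyword_difficulty(num_results: int,
                       avg_top10_ratings: float,
                       title_match_rate: float) -> float:
    """Score a keyword from 0 (easy) to 100 (hard).

    Hypothetical heuristic, not RankBoost's real model:
    - num_results: apps returned for the search term
    - avg_top10_ratings: mean rating count of the top-10 results,
      a proxy for how entrenched the incumbents are
    - title_match_rate: fraction of top-10 titles containing the
      exact keyword (0.0-1.0)
    """
    # Log-scale both volume signals so a jump from 10 to 100
    # competitors matters more than a jump from 10,000 to 10,090.
    competition = min(math.log10(max(num_results, 1)) / 5, 1.0)
    entrenchment = min(math.log10(max(avg_top10_ratings, 1)) / 6, 1.0)
    # Weighted blend; the weights are illustrative guesses.
    score = 100 * (0.3 * competition
                   + 0.5 * entrenchment
                   + 0.2 * title_match_rate)
    return round(score, 1)

# A niche keyword: few results, small incumbents, weak title matches.
easy = keyword_difficulty(40, 120, 0.2)
# A head term: thousands of results, huge incumbents, strong matches.
hard = keyword_difficulty(8000, 500_000, 0.9)
```

The useful property of a blend like this is that it degrades gracefully: even when one scraped signal is missing or noisy, the ordering of easy versus hard keywords tends to survive.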
Why they built it
When I launched my own iOS app, I spent hours manually searching App Store keywords and guessing which ones to target. Enterprise ASO tools cost $200+/month — way too expensive for a solo developer with a $2.99 app. I figured there was a market for a cheap ASO tool aimed at indie developers. I knew the App Store algorithm favored keyword optimization, and most indie devs were guessing instead of using data.
What worked
The keyword difficulty scoring was useful and indie devs appreciated having data they couldn't get elsewhere for free. My blog post on 'ASO for Indie Developers' ranked well on Google and drove consistent organic traffic. The one-time pricing was attractive to indie devs who hated subscriptions. I got some genuine thank-you emails from developers who said RankBoost helped them find keywords they hadn't considered.
What failed
Three things killed it. First, Apple changed their App Store search algorithm in a mid-year update, and my keyword difficulty scores became wildly inaccurate overnight. Rankings that my tool predicted as 'easy' turned out to be impossible, and vice versa. Recalibrating took weeks because I had to re-scrape and re-model everything. Second, Apple started rate-limiting and blocking the scraping infrastructure I used to gather keyword data. I burned through IP addresses and proxies, which added cost and fragility. Third, my data was always stale — I could afford to update rankings once daily, while enterprise competitors had near-real-time data. Paying users noticed that tracked rankings were often 12-24 hours behind actual results, which undermined trust.
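The rate-limiting arms race usually reduces to retry discipline. A minimal sketch of the pattern, assuming a hypothetical `fetch` callable and a `RateLimited` error as stand-ins for whatever HTTP client and block signal the scraper actually used: exponential backoff keeps pressure off the server, but as the comment notes, it also explains why a daily crawl budget runs out.

```python
import time
from typing import Callable, Optional

class RateLimited(Exception):
    """Raised by fetch() when the server pushes back, e.g. HTTP 429."""

def fetch_with_backoff(fetch: Callable[[str], str],
                       url: str,
                       max_attempts: int = 5,
                       base_delay: float = 1.0,
                       sleep: Callable[[float], None] = time.sleep) -> Optional[str]:
    """Retry fetch(url) with exponential backoff on rate limits.

    `fetch`, `RateLimited`, and the delay schedule are hypothetical;
    the injected `sleep` makes the helper testable without waiting.
    """
    for attempt in range(max_attempts):
        try:
            return fetch(url)
        except RateLimited:
            # 1s, 2s, 4s, 8s... doubling is polite to the server, but
            # a few blocked keywords can stall the whole daily crawl.
            sleep(base_delay * (2 ** attempt))
    return None  # give up; this keyword's data goes stale today
```

Every retry and every burned proxy adds latency to the crawl, which is exactly how a once-daily update cadence, and the 12-24 hour staleness users noticed, becomes a hard ceiling.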
Key lesson
The key mistake was building a product entirely dependent on scraping a platform that actively fights scrapers. Apple doesn't provide a public API for search rankings, so every ASO tool is built on scraped data. As a solo dev, I couldn't sustain the scraping infrastructure: cost, rate limits, and anti-bot measures made it a constant arms race. Enterprise competitors spend six figures on this infrastructure; I was spending $300/month and hoping for the best.
Lessons
What the founder learned
Building on scraped data from a platform that doesn't want to be scraped is rented land with an eviction notice. Apple can change their algorithm, block your scrapers, or launch their own keyword tools (App Store Connect already shows some keyword data) at any time. Your product breaks, and you have no recourse. Also, competing with enterprise tools on data quality as a solo developer is structurally impossible — they have budgets for infrastructure, data science teams, and direct relationships with Apple. The indie developer ASO market wants cheap tools, but cheap tools can't afford the data infrastructure needed to be accurate. That's a fundamental economic mismatch.
What they’d do differently
I wouldn't build an ASO tool. The data dependency problem is structural and unsolvable at indie scale. If I wanted to serve indie iOS developers, I'd build tools that use data the developer already has — App Store Connect analytics (Apple provides this via API), crash reports, user feedback analysis, or A/B testing for screenshots and descriptions. Those don't require scraping and Apple can't break them by changing an algorithm.
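The first-party-data direction can be made concrete. A minimal sketch, assuming a hypothetical analytics CSV export with `date`, `impressions`, and `units` columns (real App Store Connect reports use different headers per report type), that computes product-page conversion rate per day; nothing here depends on scraping:

```python
import csv
import io

def conversion_by_day(report_csv: str) -> dict:
    """Compute downloads-per-impression from a first-party export.

    Column names ('date', 'impressions', 'units') are assumptions for
    illustration; adapt them to the actual report you download.
    """
    rates = {}
    for row in csv.DictReader(io.StringIO(report_csv)):
        impressions = int(row["impressions"])
        units = int(row["units"])
        rates[row["date"]] = units / impressions if impressions else 0.0
    return rates

# Tiny made-up export: did the new screenshots lift conversion?
sample = """date,impressions,units
2024-03-01,1200,48
2024-03-02,900,54
"""
rates = conversion_by_day(sample)
```

Because the developer downloads this data themselves, Apple changing its search algorithm doesn't break the tool, which is the structural difference being argued for.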
Editorial scorecard
How viable is rebuilding this today?
Did real users or customers want this?
How well was it built and shipped?
Did they have a path to reach users?
Was the business model viable?
How useful is this postmortem for other builders?
Scores are assigned by App Graveyard editors after review. They are directional, not scientific.
Rebuild opportunity
2/10. ASO as a category is consolidating around a few well-funded players. The indie developer tool opportunity is in areas Apple actively supports: App Store Connect API-based analytics, TestFlight automation, review management, and subscription optimization. Tools that work with Apple's data rather than scraping against Apple's wishes are structurally more defensible.
Built something that didn't work out?
Every failed app has a lesson. Submit yours and help the next builder avoid your mistake. Anonymous submissions welcome.