Results Are In – State of Mobile Jobs Survey 2022!

See how you compare to your mobile developer peers by viewing the results of our State of Mobile Jobs Survey 2022! By Lauren Elizabeth.

Earlier this year, we started a project that was brand new to us: a survey designed to demystify what mobile developers are worth on the job market, based on real data provided by the experts (that's you!).

We’re happy to announce that the survey results are in — and we even have an interactive dashboard here where you can play around with different views of the data and filter to see where you fit in the industry.

How the Survey Started

We embarked upon this survey (see original article here) to answer a simple question most of our developers were already asking:

“If I were offered a job tomorrow, how would I know my worth as a mobile app developer according to my location, demographic, and experience level? What would a fair total compensation and benefits package look like?”

As it turns out, there were no sources specific to mobile developers; the sources that did exist were so broad as to be nearly meaningless for mobile dev. And so we created our State of Mobile Jobs Survey! By contacting leaders in the community, posting in niche developer spaces, and spreading the word across different social hubs, we wound up collecting responses from 1,257 people.

Now, here’s what went into creating the survey, for those who are curious…

How We Built The Survey

We had a lot of questions to answer as we created this survey, including how we, as a company, would use it. To help the most developers possible, and to avoid bias, we decided not to monetize or incentivize this survey. That meant no paywall to view the responses, and no advertisements for our services on the dashboard. It also meant that we wouldn't be able to offer incentives for filling out the survey. That's right, those who answered our survey did so for the betterment of the community! 👏👏👏

Choosing which questions went into the survey, the phrasing, the answers offered, and the final design was a whole-company project, with team members from all different disciplines and nationalities contributing.

While the survey was collecting results, we relied on our wonderful team of creators, well-known personalities in the space, our marketing team, and individual posts on various social channels to get the word out.

Finally, once the survey closed, we worked with a freelance data visualizer to build the interactive portion of our survey on QuickSight, which we chose as the best vehicle for the results.

Creating, promoting, and dashboarding a survey is labor intensive! We’ve been thrilled to learn a few lessons along the way, which we hope to apply to future surveys in the coming years to continue to benefit our amazing community.

Lessons Learned

Any project, even one in a well-understood domain, can bring delightful train wrecks along the way. We weren't immune to those in this project. Here are some interesting caveats to keep in mind as you view the dashboard:

  • The mobile developer community is small (or shy): While we know the mobile dev profession is somewhat niche, we hoped for a larger sample size than the one we gathered. 1,257 responses is a great start, but the lack of incentives came up in the feedback and might have contributed to a lower response rate. We carefully weighed the ability of survey incentives to attract more responses against the likelihood of skewing the results toward developers gaming the system to win a prize, though, and felt relying on goodwill was a worthy gamble. Hopefully, if we offer this survey next year, we can build on this number.
  • International language is difficult: In phrasing our questions, offering answers, or even deciding which topics to ask about, we found a notable divide in what was acceptable to ask depending on the country it was being asked in. During creation and after release, we received feedback like: only Americans really care about race and ethnicity, the correct term for being physically disabled varies by country, and the way we phrased education levels was US-centric.
  • Stock options = very important: Even in a team of developers, we missed this aspect of compensation until after release. The question wasn't added to the survey until 300 responses had already been recorded, so those early respondents never saw it (and we're glad we added it!). Comparing the overall totals with the dashboard results averaged over only the responses that included this question, we found that the missing early answers did not significantly change the results.
  • Personal identifiers are complicated: We chose to include questions about whether respondents were transgender, their mental health, their sexual orientation, and other personal aspects. We received a lot of feedback on this, including: transgender people may no longer identify as such after they transition (so this isn't the best question to ask), those in the LGBTQIA+ community are tired of being targeted for surveying (very fair), and those on the autism spectrum are increasingly self-diagnosed and experience enough imposter syndrome that they question whether they should check the box or not (we're not sure what we'd do in this situation either).
  • Douglas Adams was right about deadlines: Originally, we had hoped to post this right on September 15th, but we felt that deadline whoosh past us, to our intense dismay. Our data handling stage took a bit longer than we expected, and we prioritized making sure the dashboard was accessible and clear for our members over hitting 'publish' as soon as we could.
  • Offering open answer fields might not have been the best decision: For most questions, we offered an 'other' field in addition to the set responses. We wound up having to exclude some of those answers because they were too varied and too far outside the scope of the other responses to represent in charts and graphs; including them could also have made individual respondents identifiable. That's why you may see some discrepancy in the total responses in the results. Our favorite was when someone listed their total years in the industry as 50,000!
  • Not every question got an answer: Of the 1,225 respondents whose answers made it into our dashboard, only 289 chose to answer every question. That's a completion rate of 23.6%! (A quick sketch of this kind of math follows this list.)
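
For the curious, here's roughly what that clean-up and completion-rate math looks like in code. This is a minimal sketch only, assuming the responses live in a pandas DataFrame with one row per respondent; the column name and bounds are hypothetical stand-ins for illustration, not our actual pipeline.

```python
import pandas as pd

def summarize(responses: pd.DataFrame) -> None:
    # Drop implausible open-field answers, like "50,000 years in the industry".
    # "years_in_industry" is a hypothetical column name for this sketch.
    cleaned = responses[responses["years_in_industry"].between(0, 60)]

    # A respondent "completed" the survey if no question was left blank.
    complete = cleaned.dropna()
    rate = len(complete) / len(cleaned)
    print(f"{len(complete)} of {len(cleaned)} answered every question ({rate:.1%})")
```

With 1,225 cleaned rows, of which 289 have no blanks, this prints a completion rate of 23.6%.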

This is not a full list of everything we would do differently, but we hope you’ve at least enjoyed the storytelling so far. Now, on to the interesting takeaways we spotted! 👇