Building smart contract security tools using MythX

This section of the guide is aimed at developers who want to build security tools using the MythX API.


What Tools to Build?

Using the MythX API, you can build security tools that find bugs in smart contracts for Ethereum or compatible blockchains such as Tron and Quorum. Currently, we support Solidity code and EVM bytecode, but we are working on support for additional bytecode formats (e.g. eWASM) and languages.

Some examples of potential MythX tools are:

  • Plugins for dapp development environments, such as Truffle, Remix and Embark;
  • Plugins for code editors (Sublime Text, Atom, vim,…);
  • Apps and CI hooks for code repositories (GitHub, Gitlab);
  • Command-line tools for security auditors;
  • Integrations into dapp browsers and wallets;
  • Standalone web interfaces.

Feel free, however, to experiment with the API in any way you want!

API Specification

Besides whatever you might find in this guide, the MythX OpenAPI Spec is the ultimate authority. Beyond that, there be dragons.

Language Bindings

In most cases you’ll want to use an existing client library that abstracts the low-level details of interacting with MythX.

API Walkthrough

The MythX API curl scripts demonstrate interaction with the MythX API at the most basic level. The scripts show you the HTTP requests that get sent as well as the JSON output returned as the result of each request.

The process for analyzing a smart contract works as follows:

  • Authenticate with Ethereum address and password to retrieve an access token;
  • Submit a contract for analysis, creating a job run with a UUID;
  • See the status of job using the UUID of a previously submitted analysis;
  • Get the results of a previously finished analysis using the UUID.

Let’s run through a basic example. Make sure that curl is installed and clone the GitHub repository to get started:

$ git clone
$ cd mythx-api-curl

To verify that you can connect to the API, run the api-version script:

$ ./
  Running: curl -v GET
  curl completed successfully. See /tmp/curljs.err53890 for verbose logs.
  Processed output from /tmp/curljs.out53890 follows...
  {
    "api": "v1.3.3",
    "maru": "0.3.4",
    "mythril": "0.20.0",
    "harvey": "0.0.7"
  }


Authentication

MythX uses JSON Web Token (JWT) authentication. In this authentication scheme, the user submits their username and password to the login endpoint. Upon successful login the server returns the following:

  • A timed access token. This token needs to be sent by the client with every request to access the API.
  • A refresh token that can be used to request a new access token once the current one times out.

To execute the login process in the shell, set the MYTHX_ETH_ADDRESS and MYTHX_PASSWORD environment variables to your Ethereum address and API password:

$ export MYTHX_PASSWORD=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
$ export MYTHX_ETH_ADDRESS=0x.............

Then run the login script as follows:

$ . ./
Successfully logged into MythX

If authentication succeeds, the login script will set the MYTHX_ACCESS_TOKEN and MYTHX_REFRESH_TOKEN environment variables.


With these variables set you can submit security analysis jobs.

Submitting an Analysis Job

To launch a security analysis job the client needs to submit the Solidity source code and EVM bytecode via a POST request to the analyses endpoint. The input format is similar to the JSON output generated by solc and Truffle. Here is an example:

   {
      "clientToolName": "mythx-api-curl",
      "data": {
         "contractName": "Token",
         "deployedBytecode": "6080605482019055600192(...)",
         "deployedSourceMap": "25:503:0:-;;;;6202:0;(...)",
         "bytecode": "608060405234801561001(...)",
         "sourceMap": "25:503:0:-;;;;8:9:-1(...)",
         "analysisMode": "quick",
         "sourceList": [ "Token.sol" ],
         "sources": {
            "Token.sol": {
               "source": "pragma solidity ^0.5.0;\n\ncontract Token {\n\n  mapping(address => uint) balances;\n  uint public totalSupply;\n\n  constructor(uint _initialSupply) public {\n    balances[msg.sender] = totalSupply = _initialSupply;\n  }\n\n  function transfer(address _to, uint _value) public returns (bool) {\n    require(balances[msg.sender] - _value >= 0);\n    balances[msg.sender] -= _value;\n    balances[_to] += _value;\n    return true;\n  }\n\n  function balanceOf(address _owner) public view returns (uint balance) {\n    return balances[_owner];\n  }\n}\n"
            }
         },
         "mainSource": "Token.sol"
      }
   }

The input JSON contains the following fields:

  • clientToolName: A name that uniquely identifies your MythX tool.
  • data: A dictionary containing the data of the contract to be analyzed.
  • contractName: The name of the main contract class to be analyzed.
  • deployedBytecode: The runtime bytecode of the contract.
  • deployedSourceMap: The source mapping generated by solc for the runtime bytecode.
  • bytecode: The creation bytecode of the contract.
  • sourceMap: The source mapping generated by solc for the creation bytecode.
  • analysisMode: The type of analysis (“quick” or “full”).
  • sourceList: A list of source files referred to by the source maps.
  • sources: A dictionary containing the original source code of each code file.
  • mainSource: The filename of the Solidity file containing the main contract class.
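As a sketch, the request body can be assembled from compiler artifacts like this. The field names follow the list above; the function and argument names themselves are hypothetical glue code, not part of any MythX client library:

```python
# Sketch: assemble the analysis request body from compiler artifacts.
# Field names follow the list above; the function is hypothetical glue
# code, not part of any MythX client library.

def build_analysis_request(tool_name, contract_name, artifacts,
                           sources, source_list, main_source,
                           mode="quick"):
    # `artifacts` holds the four solc outputs; `sources` maps each file
    # name in `source_list` to its Solidity source text.
    return {
        "clientToolName": tool_name,
        "data": {
            "contractName": contract_name,
            "bytecode": artifacts["bytecode"],
            "sourceMap": artifacts["sourceMap"],
            "deployedBytecode": artifacts["deployedBytecode"],
            "deployedSourceMap": artifacts["deployedSourceMap"],
            "analysisMode": mode,
            "sourceList": source_list,
            "sources": {name: {"source": text}
                        for name, text in sources.items()},
            "mainSource": main_source,
        },
    }
```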

Note that both source code and bytecode must be submitted to receive complete results.

The analysisMode field is used to select the type of analysis. Currently two modes are supported:

  • quick: Performs static analysis, shallow symbolic analysis and input fuzzing. Returns a result within 90 seconds.
  • full: Performs static analysis, deep symbolic analysis and input fuzzing. May run for up to 2 hours.

About sourceList and sources

The sourceList must contain all files that were used for compiling the contract - i.e. the main Solidity file as well as imports (and imports of imports, etc.). Each filename listed in sourceList must have a matching entry in the sources dict. Note that the order of the files in the sourceList field is crucial for receiving correct issue locations and should match the order that the compiler used to build the source maps.

Assembling the sourceList correctly is currently difficult for complex projects - we’re working on an easier way that allows ASTs to be submitted instead. In the meantime, you can check out the code of Sabre, which implements support for imports.
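If you compile through solc's standard-JSON interface, a correctly ordered list can be derived from the compiler output itself: each entry under its "sources" output carries an "id" that is exactly the file index used in the source maps. A sketch under that assumption:

```python
# Sketch: derive a correctly ordered sourceList from solc standard-JSON
# output, where each entry under "sources" carries an "id" equal to the
# file index used in the source maps. Other compilation pipelines will
# need their own ordering logic.

def source_list_from_solc_output(solc_output):
    items = solc_output["sources"].items()
    return [name for name, info in sorted(items, key=lambda kv: kv[1]["id"])]
```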

If successful, the API returns a UUID that can then be used to retrieve the status and results of the analysis.

$ ./ sample-json/Token.json
  (with MYTHX_API_KEY and EVM bytecode)
curl completed successfully. Output follows...
HTTP/1.1 200 OK
{
  "result": "Queued",
  "uuid": "bf9fe267-d322-4641-aae2-a89e62f40770"
}

Polling the API to Obtain Job Status

You can determine the status of your analysis by sending a GET request to /analyses/<UUID>:

$ ./ bf9fe267-d322-4641-aae2-a89e62f40770
Issuing HTTP GET
  (with MYTHX_API_KEY)
curl completed successfully. Output follows...
HTTP/1.1 200 OK
{
  "result": "Finished",
  "uuid": "bf9fe267-d322-4641-aae2-a89e62f40770"
}

The result field can take the following values:

  • Queued: Your job is in the queue but has not been started yet. Note that you can queue up to 10 jobs at a time.
  • In Progress: Your job is currently running. In quick mode, the job will remain in running state for approximately one minute. In full mode, the analysis may take up to two hours depending on the complexity of the code.
  • Finished: The job has been completed successfully and the results can be retrieved.

Estimating Analysis Duration

How long an analysis takes to complete depends on the analysis mode (“quick” or “full”) and overall API load.

Analysis mode

  • In “quick” mode, the analysis takes between 30 seconds and 5 minutes to complete after entering the “in progress” state.
  • In “full” mode, the analysis may take between 2 and 5 hours. Note that “full” mode is still highly experimental.

Overall API load

We aim to process all incoming requests immediately. However, in times of high load, our jobs might remain in the queue for some time before a worker becomes available.

Polling Recommendations

In order to help users keep below their rate limits (and not overload the API too much), we recommend implementing the following polling algorithm:

  • Set an initial delay before sending the first poll after submitting an analysis. Even in quick mode, results are rarely ready in less than 45 seconds. In full mode, results will usually only be ready after 2 hours or more.
  • After the initial delay has passed, poll at reasonably regular intervals (e.g. every 8 seconds in “quick” mode and once per minute in “full” mode). Alternatively, start with a short interval that increases geometrically over time.
  • Also set an overall timeout. If a job has been in “in progress” state for more than 12 hours, it is reasonable to assume that there’s a problem on the API side and return an error message to the user. Include the job UUID in the error message. API bugs can be submitted via one of our support channels.
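A sketch of that polling algorithm, with the numbers mirroring the recommendations above. `check_status` and `sleep` are injected so the logic can be tested without a live job:

```python
# Sketch of the recommended polling loop: an initial delay, then
# geometrically growing intervals, bounded by an overall timeout.
# `check_status` returns the job's result string; `sleep` is injected
# (e.g. time.sleep) so the logic can be unit-tested with a fake.

def poll_until_finished(check_status, sleep,
                        initial_delay=45.0, first_interval=8.0,
                        factor=1.5, max_interval=60.0,
                        overall_timeout=12 * 3600):
    waited = 0.0
    sleep(initial_delay)            # results are rarely ready earlier
    waited += initial_delay
    interval = first_interval
    while waited < overall_timeout:
        if check_status() == "Finished":
            return True
        sleep(interval)
        waited += interval
        interval = min(interval * factor, max_interval)
    return False  # probably an API-side problem; report the job UUID
```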

Obtaining Analysis Results

Once the job is in “finished” state, the results can be obtained as follows:

$ ./ 8a15c859-3245-4d73-bdef-77bf53c5b9b2
Issuing HTTP GET

curl completed sucessfully. See /tmp/curljs.err54326 for verbose logs.
Processed output from /tmp/curljs.out54326 follows...
  {
    "issues": [
      {
        "swcID": "SWC-101",
        "swcTitle": "Integer Overflow and Underflow",
        "description": {
          "head": "The binary subtraction can underflow.",
          "tail": "The operands of the subtraction operation are not sufficiently constrained. The subtraction could therefore result in an integer underflow. Prevent the underflow by checking inputs or ensuring that the underflow is caught by an assertion."
        },
        "severity": "High",
        "locations": [
          { "sourceMap": "296:29:0" }
        ],
        "extra": {}
      }
    ],
    "sourceType": "solidity-file",
    "sourceFormat": "text",
    "sourceList": [ "Token.sol" ],
    "meta": {
      "coveredInstructions": 213,
      "coveredPaths": 10,
      "error": "",
      "selected_compiler": "0.5.0",
      "warning": []
    }
  }

The output contains a list of issues, each with a title, a short and a long description, and source mappings, as well as additional information:

  • The swcID field contains a reference to the Smart Contract Weakness Classification Registry.
  • The description field describes the found issue in two parts.
    • head is the summary of the found issue.
    • tail contains details on what causes the issue, as well as any possible remediation.
  • The locations list contains one or more solc-style sourceMap entries with bytecode offsets into the provided source code files. This source mapping format is described in the solc documentation. The notation is s:l:f, where s is the byte offset of the start of the range in the source file, l is the length of the range in bytes, and f is the index of the source code file in the sourceList.
  • The meta field contains meta information about the analysis run.
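For editor and CI integrations it is usually necessary to turn an s:l:f entry into a file name plus line and column. A sketch of that conversion (`decode_location` is a hypothetical helper, not an API call):

```python
# Sketch: convert a solc-style `s:l:f` location (e.g. "296:29:0") into a
# file name plus 1-based line and 0-based column, as an editor plugin
# would need. `decode_location` is a hypothetical helper, not an API call.

def decode_location(source_map, source_list, sources):
    start, length, file_idx = (int(x) for x in source_map.split(":"))
    filename = source_list[file_idx]
    text = sources[filename]
    line = text.count("\n", 0, start) + 1           # newlines before offset
    col = start - (text.rfind("\n", 0, start) + 1)  # offset into that line
    return filename, line, col, length
```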

We recommend that users submit both bytecode and source code to obtain a full analysis. If only the creation bytecode is given, and not the source code, MythX will return a result like the following:

  {
    "issues": [ ... ],
    "sourceType": "raw-bytecode",
    "sourceFormat": "evm-byzantium-bytecode",
    "sourceList": [ "0x98..." ],
    "meta": {
      "coveredInstructions": 111,
      "coveredPaths": 5
    }
  }

In this instance, sourceList: [0x98...] refers to the Keccak256 hash of the runtime bytecode within which the issue(s) were found.

API Details

Token Expiration Times

Validity times for the JSON Web Tokens are set as follows:

  • Access tokens are valid for 10 minutes;
  • Refresh tokens are valid for 4 weeks.
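Since a JWT carries its expiry time in the `exp` claim of its base64url-encoded payload, a client can check locally whether a refresh is due rather than waiting for a rejected request. A sketch follows; note it decodes without verifying the signature, which is acceptable for scheduling a refresh but nothing else:

```python
# Sketch: decide client-side whether an access token is about to expire
# by reading the `exp` claim from the JWT payload. This decodes WITHOUT
# verifying the signature -- fine for scheduling a refresh, never for
# trusting the token's contents.
import base64
import json
import time

def token_expired(jwt, leeway=30, now=None):
    payload_b64 = jwt.split(".")[1]
    payload_b64 += "=" * (-len(payload_b64) % 4)  # restore base64 padding
    claims = json.loads(base64.urlsafe_b64decode(payload_b64))
    if now is None:
        now = time.time()
    return claims["exp"] <= now + leeway
```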

Rate Limits

API rate limits need to be considered when designing MythX tools as sending excessive requests may cause API errors. Currently the following rate limits apply:

  • The client can submit up to 2 requests per second.
  • The API can queue up to 10 analysis jobs per client. However, a maximum of four workers will be allocated to a single client. It is therefore recommended to limit the number of parallel analysis jobs to four.
  • The client can perform up to 10,000 API requests within 24 hours.
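To stay under the 2-requests-per-second limit, a client can throttle itself, for example with a small token bucket. A sketch with an injectable clock so the behaviour can be tested deterministically:

```python
# Sketch: client-side token-bucket throttle for the 2-requests-per-second
# limit. The clock is injectable for deterministic testing; the default
# uses the monotonic system clock.
import time

class Throttle:
    def __init__(self, rate=2.0, burst=2, clock=time.monotonic):
        self.rate = rate            # tokens replenished per second
        self.burst = burst          # maximum tokens held at once
        self.clock = clock
        self.tokens = float(burst)
        self.last = clock()

    def allow(self):
        # Refill proportionally to elapsed time, then spend one token
        # if available; on False the caller should wait and retry.
        now = self.clock()
        self.tokens = min(float(self.burst),
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False
```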

Compiler Settings

It is recommended to activate optimization when compiling source code for analysis. This reduces the complexity of the bytecode, allowing for better performance in the fuzzing and symbolic analysis steps and increasing code coverage.

For example, when using solcjs, add the following to the compiler settings:

settings: {
    optimizer: {
      enabled: true,
      runs: 200
    }
}

Example Code

Sabre is a minimal MythX CLI written in JavaScript. It shows how to compile a Solidity file using solc-js and submit the compiler output to MythX using the armlet JavaScript library.

Revenue Sharing Program

Once paid subscription plans for MythX go live, we’ll share some of the revenue from subscription fees with tool builders.

The amount of revenue share you receive will depend on the number of daily active paying users of your tool. In your tool, set the clientToolName field to a unique name of your choice when submitting analysis requests so that we can keep track of usage statistics.

More details about this program will be announced during the beta in 2019.