
Create Metadata



This page assumes a previous deployment to Rinkeby already exists, which we access via AdvancedCollectible[-1].
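As a quick sanity check (a minimal sketch, not part of the course script), you can confirm the recorded deployment from the Brownie console before running anything:

# brownie console --network rinkeby
from brownie import AdvancedCollectible

# Brownie keeps a per-network history of deployments; [-1] is the most recent instance
advanced_collectible = AdvancedCollectible[-1]
print(advanced_collectible.address)
print(advanced_collectible.tokenCounter())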

Setup

  • Create a new folder, metadata, in the Brownie project root.

  • Within metadata, create a new file: sample_metadata.py (imported below as metadata.sample_metadata).

  • Within metadata, create a new folder named rinkeby; the generated JSON files will be written here (the folder name matches network.show_active()).

sample_metadata.py
metadata_template = {
    "name": "",
    "description": "",
    "image": "",
    "attributes": [{"trait_type": "cuteness", "value": 100}]
}

create_metadata.py

In our scripts folder (here, scripts/advcollectible/), create a new file: create_metadata.py.

We will use this script to assign the URI for each NFT.

An NFT's token URI points to a JSON metadata file (name, description, image, attributes) that marketplaces and wallets read to display the token; this script generates those JSON files, using the template above as the base, and uploads them to IPFS.

create_metadata.py
from brownie import AdvancedCollectible, network
from metadata.sample_metadata import metadata_template
from pathlib import Path

breed_mapping = {0: "PUG", 1: "SHIBA_INU", 2: "ST_BERNARD"}

def get_breed(breed_number):
    return breed_mapping[breed_number]


def main():
    advanced_collectible = AdvancedCollectible[-1]
    number_of_NFTS = advanced_collectible.tokenCounter()
    print(f"The number of NFTs minted so far is {number_of_NFTS}")

    for tokenID in range(number_of_NFTS):
        breed = get_breed(advanced_collectible.tokenIdToBreed(tokenID))
        metadata_filename = f"./metadata/{network.show_active()}/{tokenID}-{breed}.json"
    
        collectible_metadata = metadata_template
        if Path(metadata_filename).exists():
            print(f"{metadata_filename} exists! Delete to overwrite")
        else:
            print(f"Creating Metadata file: {metadata_filename}")
            collectible_metadata["name"] = breed
            collectible_metadata["description"] = f"An adorable {breed} pup!"
            print(collectible_metadata)

            image_path = "./img" + breed.lower().replace("_", "-") + ".png"
            image_uri = upload_to_ipfs()
            collectible_metadata["image_uri"] = image_uri
  • Call on the previous Rinkeby deployment via AdvancedCollectible[-1].

  • Check the number of NFTs minted so far with tokenCounter().

  • For each tokenID, get the breed via get_breed(), and structure the file name and path as metadata_filename.

  • collectible_metadata takes metadata_template as the base to modify.

  • We then check whether metadata_filename already exists, using the Path library; if it does, we skip it (delete the file to overwrite).

  • Otherwise, we create a new metadata file from the template, filling in the following (an example of the resulting file is sketched below):

    • name

    • description

    • image URI -> collectible_metadata["image_uri"] = image_uri
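For reference, once the image has been uploaded (next section), the generated ./metadata/rinkeby/0-PUG.json would look roughly like this (the IPFS hash is a placeholder, not a real upload):

# Illustrative contents of ./metadata/rinkeby/0-PUG.json
{
    "name": "PUG",
    "description": "An adorable PUG pup!",
    "image": "",                                                 # left empty; this script fills image_uri instead
    "attributes": [{"trait_type": "cuteness", "value": 100}],
    "image_uri": "https://ipfs.io/ipfs/<image-hash>?filename=pug.png"
}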

image-uri

How do we get the image URI? Currently the images sit in our local img folder; we need them hosted on IPFS so we can pass their IPFS URIs into collectible_metadata["image_uri"].

To do that, install the IPFS CLI and start a local node with ipfs daemon (the command works in both the terminal and the command prompt). On running ipfs daemon we will see the node start up, including the local API address (http://127.0.0.1:5001) used below.

We then write an upload_to_ipfs() helper that posts the file to the node's HTTP API:

def upload_to_ipfs(filepath):
    # open image as binary - open(rb)
    with Path(filepath).open("rb") as fp:
        image_binary = fp.read()
        ipfs_url = "http://127.0.0.1:5001"     #get from WebUI 
        endpoint = "/api/v0/add"
        response = requests.post(ipfs_url + endpoint, files={"file":image_binary}) #post request
        ipfs_hash = response.json()["Hash"]     #response returns dictionary
        # "./img/0-PUG.png"  -> split on /, grab last part of array, which is "0-PUG.png"
        filename = filepath.split("/")[-1:][0]
        image_uri = f"https://ipfs.io/ipfs/{ipfs_hash}?filename={filename}"
        print(image_uri)
        return image_uri
def main():
    advanced_collectible = AdvancedCollectible[-1]
    number_of_NFTS = advanced_collectible.tokenCounter()
    print(f"The number of NFTs minted so far is {number_of_NFTS}")

    for tokenID in range(number_of_NFTS):
        breed = get_breed(advanced_collectible.tokenIdToBreed(tokenID))
        metadata_filename = f"./metadata/{network.show_active()}/{tokenID}-{breed}.json"
    
        collectible_metadata = metadata_template
        if Path(metadata_filename).exists():
            print(f"{metadata_filename} exists! Delete to overwrite")
        else:
            print(f"Creating Metadata file: {metadata_filename}")
            collectible_metadata["name"] = breed
            collectible_metadata["description"] = f"An adorable {breed} pup!"
            print(collectible_metadata)
            # convert underscores to dashes to be URI compatible
            image_path = "./img/" + breed.lower().replace("_", "-") + ".png"
            image_uri = upload_to_ipfs(image_path)
            collectible_metadata["image_uri"] = image_uri
  • ipfs_url -> the local node's API address, retrieved from running ipfs daemon (also shown in the WebUI).

  • endpoint -> /api/v0/add, as per the IPFS HTTP API reference.

  • response -> structure and send a POST request (this could also be done with curl).

  • ipfs_hash -> a successful call returns a JSON string; decode it into a Python dict via .json() and read the "Hash" field.

  • Extract the filename from the filepath and, together with the IPFS hash, construct the image URI.
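For context, a successful call to /api/v0/add returns a small JSON object (field names per the IPFS HTTP API reference linked at the bottom); a rough sketch of the parsing, with placeholder values:

# Example shape of response.json() from /api/v0/add (values are placeholders):
# {"Name": "0-PUG.png", "Hash": "QmSomeContentIdentifier", "Size": "123456"}
ipfs_hash = response.json()["Hash"]                    # the content identifier (CID) we need
image_uri = f"https://ipfs.io/ipfs/{ipfs_hash}?filename=0-PUG.png"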

print(image_uri) will return an IPFS link, like so:

https://ipfs.io/ipfs/QmUPjADFGEKmfohdTaNcWhp7VGk26h5jXDA7v3VtTnTLcW?filename=st-bernard.png

The image will be available as long as your IPFS node is running.

To make the data highly available without needing to run a local IPFS daemon 24/7, you can request that a remote pinning service store a copy of your IPFS data on their IPFS nodes.

Creating metadata files

Now that we have filled in all the fields, we write the metadata out as a JSON file into our metadata directory, and upload that file to IPFS as well.

        with open(metadata_filename, "w") as file:
            json.dump(collectible_metadata, file)
        upload_to_ipfs(metadata_filename)

This will create a JSON file in our metadata/rinkeby folder when we run:

brownie run scripts/advcollectible/create_metadata.py --network rinkeby

Final code at this point:

from brownie import AdvancedCollectible, network
from metadata.sample_metadata import metadata_template
from pathlib import Path
import requests, json

breed_mapping = {0: "PUG", 1: "SHIBA_INU", 2: "ST_BERNARD"}

def get_breed(breed_number):
    return breed_mapping[breed_number]


def main():
    advanced_collectible = AdvancedCollectible[-1]
    number_of_NFTS = advanced_collectible.tokenCounter()
    print(f"The number of NFTs minted so far is {number_of_NFTS}")

    for tokenID in range(number_of_NFTS):
        breed = get_breed(advanced_collectible.tokenIdToBreed(tokenID))
        metadata_filename = f"./metadata/{network.show_active()}/{tokenID}-{breed}.json"
    
        collectible_metadata = metadata_template
        if Path(metadata_filename).exists():
            print(f"{metadata_filename} exists! Delete to overwrite")
        else:
            print(f"Creating Metadata file: {metadata_filename}")
            collectible_metadata["name"] = breed
            collectible_metadata["description"] = f"An adorable {breed} pup!"
            print(collectible_metadata)
            # convert underscores to dashes to be URI compatible
            image_path = "./img/" + breed.lower().replace("_", "-") + ".png"
            image_uri = upload_to_ipfs(image_path)
            collectible_metadata["image_uri"] = image_uri
            
            with open(metadata_filename, "w") as file:
                json.dump(collectible_metadata, file)
            upload_to_ipfs(metadata_filename)


def upload_to_ipfs(filepath):
    # open image as binary - open(rb)
    with Path(filepath).open("rb") as fp:
        image_binary = fp.read()
        ipfs_url = "http://127.0.0.1:5001"     #get from WebUI 
        endpoint = "/api/v0/add"
        response = requests.post(ipfs_url + endpoint, files={"file": image_binary})       #post request
        ipfs_hash = response.json()["Hash"]     #response returns dictionary. 
        # "./img/0-PUG.png"  -> split on /, grab last part of array, which is "0-PUG.png"
        filename = filepath.split("/")[-1:][0]
        image_uri = f"https://ipfs.io/ipfs/{ipfs_hash}?filename={filename}"
        print(image_uri)
        return image_uri   

11:33 - Refactor to check whether a file has already been uploaded to IPFS, so we don't re-upload it on every run (check GitHub for the code).
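One possible approach (a sketch under my own assumptions, not the exact code from the repo): keep a small local JSON map of file paths to IPFS URIs and consult it before uploading.

# Hypothetical caching wrapper around the existing upload_to_ipfs(); the cache location is an assumption
import json
from pathlib import Path

UPLOAD_CACHE = "./metadata/uploaded.json"   # hypothetical cache file

def upload_once(filepath):
    cache = json.loads(Path(UPLOAD_CACHE).read_text()) if Path(UPLOAD_CACHE).exists() else {}
    if filepath in cache:
        print(f"{filepath} already uploaded: {cache[filepath]}")
        return cache[filepath]
    uri = upload_to_ipfs(filepath)          # the function defined above
    cache[filepath] = uri
    Path(UPLOAD_CACHE).write_text(json.dumps(cache))
    return uri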

Alternative: Pinata

Instead of serving files from our own IPFS node, we can pin them to Pinata -> upload_pinata.py (a sketch follows below).

  • Create an account at Pinata and generate an API key.

  • 11:20 - 11:31
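A minimal sketch of what upload_pinata.py could look like, assuming the API key and secret live in environment variables (endpoint and header names follow the Pinata pin-file docs referenced at the bottom; treat this as an outline, not the course's exact script):

# upload_pinata.py (sketch): pin a local file to Pinata instead of a local IPFS node
import os
import requests
from pathlib import Path

PINATA_BASE_URL = "https://api.pinata.cloud/"
ENDPOINT = "pinning/pinFileToIPFS"
headers = {
    "pinata_api_key": os.getenv("PINATA_API_KEY"),             # assumed env var names
    "pinata_secret_api_key": os.getenv("PINATA_API_SECRET"),
}

def upload_to_pinata(filepath):
    filename = filepath.split("/")[-1]
    with Path(filepath).open("rb") as fp:
        image_binary = fp.read()
        response = requests.post(
            PINATA_BASE_URL + ENDPOINT,
            files={"file": (filename, image_binary)},
            headers=headers,
        )
        ipfs_hash = response.json()["IpfsHash"]                # Pinata returns the pinned CID
        image_uri = f"https://ipfs.io/ipfs/{ipfs_hash}?filename={filename}"
        print(image_uri)
        return image_uri

def main():
    upload_to_pinata("./img/pug.png")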


Reference:

https://docs.ipfs.io/how-to/command-line-quick-start/
https://docs.ipfs.io/reference/http/api/#http-rpc-commands
https://www.usna.edu/Users/cs/nchamber/courses/forall/s20/lec/l25/#:~:text=JSON%20at%20its%20top%2Dlevel,%2C%20other%20dictionaries%2C%20and%20lists.
https://docs.pinata.cloud/api-pinning/pin-file