[EN/PT-BR] Python on holiday and joy in advancing even further in programming with Hive!
<div class="text-justify">

<center>![image.png](https://files.peakd.com/file/peakd-hive/shiftrox/23uQP16uRLtTXkoTUiCauF7UuvNFDqGC3deZRaRD53WSwX1Tp86FNrT2iRLuq1RCoxs4d.png)
[Brasilcode](https://www.brasilcode.com.br/13-motivos-para-aprender-python/)</center>

![divisor.jpeg](https://files.peakd.com/file/peakd-hive/shiftrox/23x1ckYW5T1yBx8C6Rd18NYtNsYsNdTGDvWitn6Bsn1q5DyNyNy6dyVhLcGh4zUN1vwui.jpeg)

It's been a while since I've written any code to help me out or just to play around with Hive. My last project was an offline tool that formatted short texts to the standards of InLeo, Threads (from Meta) and X, each one with its own way of using tags or text-formatting commands.

On top of that, it also translated from Brazilian Portuguese to English, making it easy to copy and paste wherever needed. I stopped working on the project because I ended up losing the source code along with one of my SSDs, but right now I can even do better; after all, that program, called "Free Translator", was practically offline, it just hit an API or URL (I don't remember which) that did the translation.

<center>![image.png](https://files.peakd.com/file/peakd-hive/shiftrox/EobzjubTQXb55Eibc8rYHmsniX6NTHoxvucDDLAkMUMHcGv9WmWfMrhatzuZfMYe2dm.png)</center>

Taking advantage of today's holiday, I wanted to play around a bit and build something I've been meaning to do for a while: access HiveSQL and Hive Engine to get my total dividends (everything that came in as HIVE) and, from that, calculate the APR on the L2 tokens.
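
The APR itself is nothing fancy: take what a token paid in HIVE over the last 7 days, multiply by 52 weeks, and divide by the HIVE value of the holding. A minimal sketch of that calculation (the same formula the main script below ends up using):

```
def weekly_apr(week_income_hive: float, holding: float, price_in_hive: float) -> float:
    """Annualize one week of dividends against the HIVE value of the position."""
    position_value = holding * price_in_hive  # token holding priced in HIVE
    if position_value == 0:
        return 0.0
    return (week_income_hive * 52) / position_value * 100  # percent per year

# e.g. 0.5 HIVE/week on 1,000 tokens priced at 0.12 HIVE ≈ 21.67% APR
print(f"{weekly_apr(0.5, 1000, 0.12):.2f}%")
```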

It was a crazy ride, but now I'm more relaxed, because I finally managed to do it! Just three Python files, a lot of ChatGPT and help from two great friends, @mengao and @gwajnberg, and with that I can already say I have a first version done.

<center>![image.png](https://files.peakd.com/file/peakd-hive/shiftrox/23swibVedaRCRaXyDguxPykaaL8sbxcbmZskYLqef7CjT6cN5fHWLv9QF3Ey4TmwWBZGr.png)</center>

As I said, there's no turning back when it comes to using artificial intelligence. I probably would never have pulled this off just by studying and searching on Google. I didn't keep track of the hours, but I probably spent about 3 hours last night, from 11 pm to 2 am, messing around with no success. Today it was probably another 7 or 8 hours, starting at 9 am and finishing now around 3 pm; I didn't even have lunch, to tell you the truth, haha.


I started by doing:

```
pip install hiveengine
pip install beem
pip install nectarengine

```

And nothing, none of them would install; the error was a problem with OpenSSL, even though I already had version 3.5 installed and everything else in place. Oddly enough, I had to lean on ChatGPT with all my strength, and that's how I discovered that I needed OpenSSL 1.1.1 to be able to install any of these packages/libraries.
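
If you run into the same wall, a quick way to check which OpenSSL your Python build is actually linked against (which is what matters here, not the system-wide install) is:

```
python -c "import ssl; print(ssl.OPENSSL_VERSION)"
```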

After that, phew, I took a deep breath and was left with just this command:

```
pip install hiveengine

```

With that sorted, everything was quite simple: a main file to kick things off, call the functions and then print the table in the terminal:

```
import Conexao as conexao
import CmdHiveEngine as hiveengine

from tabulate import tabulate

# Mapeamento de perfil_envio para token
perfil_para_token = {
    "armero": "ARMERO",
    #"bee.drone": "BEE",
    "beeswap.fees": "BXT",
    "dab-treasury": "DAB",
    "dcityfund": "SIM",
    "duo-sales": "DUO",
    "eds-pay": "EDSI",
    "lgndivs": "LGN",
    "pakx": "PAKX",
    "tokenpimp": "PIMP",
    "vetfunding": "CAV"
}

try:
    
    Campos = "UPPER([FROM]) AS perfil_envio," + "\r\n"
    Campos += "SUM(amount) AS week_income" + "\r\n"
    Tabelas = "TxTransfers tt" + "\r\n"
    Criterio = "[TO] = 'shiftrox'" + "\r\n"
    Criterio += "AND [type] NOT IN ('transfer_from_savings','transfer_to_savings')" + "\r\n"
    Criterio += "AND tt.[timestamp] >= DATEADD(day, -7, GETDATE())" + "\r\n"
    Criterio += "AND tt.amount_symbol = 'HIVE'" + "\r\n"
    Criterio += "AND tt.[FROM] <> 'shiftrox'" + "\r\n"
    Criterio += "GROUP BY [FROM]" + "\r\n"
    Ordem = "UPPER([FROM])"+ "\r\n"

    rsResultado = conexao.RetReg(Campos, Tabelas, Criterio, Ordem)

    if rsResultado and isinstance(rsResultado[0], dict):
        
        dblMenorPreco : float = 0.0
        dblTotal : float = 0.0
        intQtdeSemanas : int = 52

        # Extrair os cabeçalhos (nomes das colunas) do primeiro dicionário
        headers = list(rsResultado[0].keys())  # Converte dict_keys para lista
        
        # Adicionar a nova coluna que será manipulada manualmente
        headers.append("HOLDING")
        headers.append("HIVE_PRICE")
        headers.append("HIVE_VALUE")
        headers.append("APR")

        for item in rsResultado:
            
            dblTokenBalance = 0
            dblMenorPreco = 0

            #buscando holding, stake e menor preco
            token = perfil_para_token.get(item["perfil_envio"].lower(), "nenhum")

            if token != "nenhum":
                
                dblTokenBalance = hiveengine.get_token_balance("shiftrox", token)

                dblMenorPreco = hiveengine.obter_menor_preco_venda(token)

            else:
                dblTokenBalance = 0
            #buscando holding e stake

            # Calcular valores numéricos
            holding = float(dblTokenBalance)
            hive_price = float(dblMenorPreco)
            hive_value = holding * hive_price

            # Guardar valores numéricos para cálculos
            item["HOLDING"] = holding
            item["HIVE_PRICE"] = hive_price
            item["HIVE_VALUE"] = hive_value

            item["APR"] = ((float(item["week_income"]) * intQtdeSemanas) / hive_value) * 100 if hive_value != 0 else 0.0

            # Agora formatar os valores apenas para exibição (ex: tabulate)
            item["HOLDING"] = f"{holding:,.8f}"
            item["HIVE_PRICE"] = f"{hive_price:,.8f}"
            item["HIVE_VALUE"] = f"{hive_value:,.8f}"
            item["APR"] = f"{item['APR']:.2f}%"

            dblTotal += float(item["week_income"])

        # Criar linha de total
        total_row = {
            "perfil_envio": "Total:",
            "week_income": f"{dblTotal:.3f}",  # formatado com 3 casas decimais
            "APR": ""
        }

        # Adicionar a linha ao final da lista
        rsResultado.append(total_row)

        print(tabulate(rsResultado, headers="keys", tablefmt="grid"))
    
    else:
        print("Nenhum resultado retornado ou formato não suportado.")

except Exception as erro:
    print("Erro Principal =>", erro)
```
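
The Conexao module doesn't appear in the post, but since HiveSQL is a regular SQL Server database, RetReg probably just assembles the SELECT from the four fragments and returns the rows as dictionaries. A minimal sketch of what it might look like, assuming pyodbc and credentials in environment variables (the connection details and names here are my guesses, not the original file):

```
# Conexao.py - hypothetical sketch, not the author's original module
import os
import pyodbc

def RetReg(campos, tabelas, criterio, ordem):
    """Build a SELECT from the four SQL fragments and return the rows as dicts."""
    sql = f"SELECT {campos} FROM {tabelas} WHERE {criterio} ORDER BY {ordem}"
    conn_str = (
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=vip.hivesql.io;DATABASE=DBHive;"
        f"UID={os.environ['HIVESQL_USER']};PWD={os.environ['HIVESQL_PWD']}"
    )
    with pyodbc.connect(conn_str) as conn:
        cursor = conn.cursor()
        cursor.execute(sql)
        columns = [col[0] for col in cursor.description]
        return [dict(zip(columns, row)) for row in cursor.fetchall()]
```

Note that GROUP BY is already embedded at the end of the Criterio string, so it lands in the right spot, before the ORDER BY.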

And with that, some APRs got calculated. I thought it was great to finally be able to reach more layers of Hive and build a program that will really help me understand how my dividend earnings are doing. Of course, it still needs work: it only looks at liquid token balances, and some of these tokens pay out based on what is staked, so the next step will be to improve that.
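
For the staked part, the same Hive Engine balances table that get_token_balance() already queries also carries a stake field, so one option is to sum the liquid and staked amounts when pricing the position. A hedged sketch, reusing the exact payload from the function shown further down:

```
import requests

def get_total_balance(account: str, token: str) -> float:
    """Liquid + staked balance for one token (assumes the balances table's 'stake' field)."""
    payload = {
        "jsonrpc": "2.0", "id": 1, "method": "find",
        "params": {
            "contract": "tokens", "table": "balances",
            "query": {"account": account, "symbol": token.upper()},
            "limit": 1,
        },
    }
    response = requests.post("https://api.hive-engine.com/rpc/contracts", json=payload)
    response.raise_for_status()
    rows = response.json().get("result", [])
    if not rows:
        return 0.0
    liquid = float(rows[0].get("balance", 0) or 0)
    staked = float(rows[0].get("stake", 0) or 0)
    return liquid + staked
```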

Some formatting details also need improvement: I want to show everything with 8 decimal places, which is the maximum the wallets display, so the output lines up with them as closely as possible. I also still need to translate everything into English to make it more presentable and polish a few small points.

<center>![image.png](https://files.peakd.com/file/peakd-hive/shiftrox/EnyoXowYFPrm7hVQLp5XrK8mTW9zev7PYiMC3jhX76AC6iY2TcYHDPyexodeSbZh7de.png)</center>

There is also a totalizer, so in this case, over the last 7 days I earned a total of 20.75 HIVE just from dividends and transfers from other profiles. I'm not interested in building anything around HBD at the moment, especially because none of the tokens I hold pays in HBD.

One thing that has always left me in doubt is the filter date. In my head I should always take 7 days, but I don't think today should count. Still, I took a look at https://hivestats.io/@shiftrox and saw that their 7-day option filters exactly (today - 7 days), so they take, for example, everything that happened from 04/25/2025 until today, 05/01/2025.

```
Criterio += "AND tt.[timestamp] >= DATEADD(day, -7, GETDATE())" + "\r\n"

```

I don't know if that's ideal for dividends; maybe I'll change it to consider (yesterday - 7 days), that is, work only with days that are already "closed", but anyway, there's still a lot to improve.
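
If I do switch to closed days, the filter would change to something along these lines (a hedged T-SQL sketch: casting GETDATE() to a date makes the window run from midnight seven days ago up to, but not including, today):

```
Criterio += "AND tt.[timestamp] >= DATEADD(day, -7, CAST(GETDATE() AS date))" + "\r\n"
Criterio += "AND tt.[timestamp] < CAST(GETDATE() AS date)" + "\r\n"
```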

Finally, the code that looks up how many of each token I hold and its lowest price on the market was:

```
import requests

from hiveengine.api import Api
#from hiveengine.market import Market

def obter_menor_preco_venda(token_symbol):
    try:

        api = Api()

        sell_orders = api.find("market", "sellBook", query={"symbol": token_symbol})
        
        if not sell_orders:
            #return f"Não há ordens de venda para o token {token_symbol} no momento."
            return 0

        menor_preco = min(sell_orders, key=lambda x: float(x['price']))

        dblMenorPreco = str(menor_preco['price'])

        return dblMenorPreco
    
    except Exception as e:
        print(f"Ocorreu um erro: {e}")
        return 0
    
def get_token_balance(usuario: str, token: str) -> float:
    
    url = "https://api.hive-engine.com/rpc/contracts"

    payload = {
        "jsonrpc": "2.0",
        "id": 1,
        "method": "find",
        "params": {
            "contract": "tokens",
            "table": "balances",
            "query": {
                "account": usuario,
                "symbol": token.upper()
            },
            "limit": 1
        }
    }

    try:
        response = requests.post(url, json=payload)
        response.raise_for_status()
        result = response.json().get("result", [])

        if result:
            return float(result[0].get("balance", 0.0))
        else:
            return 0.0

    except requests.RequestException as e:
        print(f"Erro ao acessar a API: {e}")
        return 0.0

```

get_token_balance() ended up as a raw URL request; later I want to see if I can do the same lookup through the hiveengine library, the way obter_menor_preco_venda() was written, so everything is centralized in the same API.
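
Since the hiveengine Api object already exposes the generic find() call used in obter_menor_preco_venda(), the balance lookup can most likely go through it as well. A hedged sketch of that version:

```
from hiveengine.api import Api

def get_token_balance(usuario: str, token: str) -> float:
    """Same balances query, but routed through the hiveengine Api wrapper."""
    api = Api()
    result = api.find("tokens", "balances",
                      query={"account": usuario, "symbol": token.upper()})
    if result:
        return float(result[0].get("balance", 0.0))
    return 0.0
```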

Anyway, the first steps have been taken; now it's time to burn some neurons and do more! First you start, then you improve!

![sun_divisor.webp](https://images.hive.blog/DQmXEDLx7R7Jy8fHF1tA7sPyb4A9QBvmj3y4FjFhephcPCS/sun_divisor.webp)

<center>![image.png](https://files.peakd.com/file/peakd-hive/shiftrox/23uQP16uRLtTXkoTUiCauF7UuvNFDqGC3deZRaRD53WSwX1Tp86FNrT2iRLuq1RCoxs4d.png)
[Brasilcode](https://www.brasilcode.com.br/13-motivos-para-aprender-python/)</center>

![divisor.jpeg](https://files.peakd.com/file/peakd-hive/shiftrox/23x1ckYW5T1yBx8C6Rd18NYtNsYsNdTGDvWitn6Bsn1q5DyNyNy6dyVhLcGh4zUN1vwui.jpeg)


Já faz um bom tempo que não programo nada em código para me auxiliar ou apenas brincar juntamente com a Hive. Meu último projeto foi um projeto offline que formatava pequenos textos para o padrão da InLeo, do Threads (da Meta) e do X, cada um com sua respectiva forma de usar tags ou comandos de formatação de texto.

Além de que ele já fazia a tradução do português do Brasil para o inglês, facilitando assim poder copiar e colar onde fosse preciso. Parei de mexer no projeto porque acabei perdendo o código-fonte juntamente com um SSD meu, mas no momento posso até fazer melhor; afinal, esse programa, chamado de "Tradutor Livre", era praticamente offline, apenas acessava uma API ou URL (não lembro) que fazia a tradução.

<center>![image.png](https://files.peakd.com/file/peakd-hive/shiftrox/EobzjubTQXb55Eibc8rYHmsniX6NTHoxvucDDLAkMUMHcGv9WmWfMrhatzuZfMYe2dm.png)</center>

Aproveitando o feriado de hoje, quis brincar um pouco e criar algo que sentia necessidade de fazer há algum tempo: acesso ao HiveSQL e Hive Engine, para buscar o meu total de dividendos (tudo que entrou de token Hive) e juntamente com isso, calcular o APR em cima dos tokens da L2.

Muito maluco isso, mas agora, estou mais tranquilo, pois finalmente consegui fazer! Apenas 3 arquivos em Python, muito ChatGPT e ajuda de dois grandes amigos: @mengao e @gwajnberg e com isso, já posso dizer que tem uma primeira versão feita.

<center>![image.png](https://files.peakd.com/file/peakd-hive/shiftrox/23swibVedaRCRaXyDguxPykaaL8sbxcbmZskYLqef7CjT6cN5fHWLv9QF3Ey4TmwWBZGr.png)</center>

É como digo, não tem mais volta no uso de inteligência artificial. Eu provavelmente nunca teria feito isso aqui somente com estudo e buscando no Google. Então, não marquei as horas, mas provavelmente gastei umas 3 horas ontem à noite, fiquei das 11 da noite até as 2 da manhã mexendo e sem sucesso. E hoje provavelmente mais umas 7 ou 8 horas, começando às 9 da manhã e finalizando agora por volta das 3 da tarde; eu nem almocei, para falar a verdade, haha.

Comecei fazendo:

```
pip install hiveengine
pip install beem
pip install nectarengine

```

E nada, nenhum deles instalava; o retorno era um problema com o OpenSSL, que eu já havia instalado na versão 3.5 e tudo mais. Por incrível que pareça, precisei usar o ChatGPT com todas as forças e assim descobri que precisava da versão 1.1.1 do OpenSSL para poder instalar qualquer um destes pacotes/bibliotecas.

Depois disso, ufa, respirei fundo e fiquei apenas com este comando:

```
pip install hiveengine

```

Com isso era tudo bem simples, um arquivo principal para iniciar e chamar as funções e depois realizar o print no terminal:


```
import Conexao as conexao
import CmdHiveEngine as hiveengine

from tabulate import tabulate

# Mapeamento de perfil_envio para token
perfil_para_token = {
    "armero": "ARMERO",
    #"bee.drone": "BEE",
    "beeswap.fees": "BXT",
    "dab-treasury": "DAB",
    "dcityfund": "SIM",
    "duo-sales": "DUO",
    "eds-pay": "EDSI",
    "lgndivs": "LGN",
    "pakx": "PAKX",
    "tokenpimp": "PIMP",
    "vetfunding": "CAV"
}

try:
    
    Campos = "UPPER([FROM]) AS perfil_envio," + "\r\n"
    Campos += "SUM(amount) AS week_income" + "\r\n"
    Tabelas = "TxTransfers tt" + "\r\n"
    Criterio = "[TO] = 'shiftrox'" + "\r\n"
    Criterio += "AND [type] NOT IN ('transfer_from_savings','transfer_to_savings')" + "\r\n"
    Criterio += "AND tt.[timestamp] >= DATEADD(day, -7, GETDATE())" + "\r\n"
    Criterio += "AND tt.amount_symbol = 'HIVE'" + "\r\n"
    Criterio += "AND tt.[FROM] <> 'shiftrox'" + "\r\n"
    Criterio += "GROUP BY [FROM]" + "\r\n"
    Ordem = "UPPER([FROM])"+ "\r\n"

    rsResultado = conexao.RetReg(Campos, Tabelas, Criterio, Ordem)

    if rsResultado and isinstance(rsResultado[0], dict):
        
        dblMenorPreco : float = 0.0
        dblTotal : float = 0.0
        intQtdeSemanas : int = 52

        # Extrair os cabeçalhos (nomes das colunas) do primeiro dicionário
        headers = list(rsResultado[0].keys())  # Converte dict_keys para lista
        
        # Adicionar a nova coluna que será manipulada manualmente
        headers.append("HOLDING")
        headers.append("HIVE_PRICE")
        headers.append("HIVE_VALUE")
        headers.append("APR")

        for item in rsResultado:
            
            dblTokenBalance = 0
            dblMenorPreco = 0

            #buscando holding, stake e menor preco
            token = perfil_para_token.get(item["perfil_envio"].lower(), "nenhum")

            if token != "nenhum":
                
                dblTokenBalance = hiveengine.get_token_balance("shiftrox", token)

                dblMenorPreco = hiveengine.obter_menor_preco_venda(token)

            else:
                dblTokenBalance = 0
            #buscando holding e stake

            # Calcular valores numéricos
            holding = float(dblTokenBalance)
            hive_price = float(dblMenorPreco)
            hive_value = holding * hive_price

            # Guardar valores numéricos para cálculos
            item["HOLDING"] = holding
            item["HIVE_PRICE"] = hive_price
            item["HIVE_VALUE"] = hive_value

            item["APR"] = ((float(item["week_income"]) * intQtdeSemanas) / hive_value) * 100 if hive_value != 0 else 0.0

            # Agora formatar os valores apenas para exibição (ex: tabulate)
            item["HOLDING"] = f"{holding:,.8f}"
            item["HIVE_PRICE"] = f"{hive_price:,.8f}"
            item["HIVE_VALUE"] = f"{hive_value:,.8f}"
            item["APR"] = f"{item['APR']:.2f}%"

            dblTotal += float(item["week_income"])

        # Criar linha de total
        total_row = {
            "perfil_envio": "Total:",
            "week_income": f"{dblTotal:.3f}",  # formatado com 3 casas decimais
            "APR": ""
        }

        # Adicionar a linha ao final da lista
        rsResultado.append(total_row)

        print(tabulate(rsResultado, headers="keys", tablefmt="grid"))
    
    else:
        print("Nenhum resultado retornado ou formato não suportado.")

except Exception as erro:
    print("Erro Principal =>", erro)
```

E assim foram calculados alguns APRs. Achei sensacional conseguir finalmente acessar mais camadas da Hive para fazer um programa que irá me ajudar bastante a entender como estão os meus ganhos de dividendos. Claro que ainda preciso melhorar: está buscando apenas os tokens que estão líquidos e alguns são calculados apenas com eles em stake, então o próximo passo vai ser melhorar isso.

Algumas coisas relacionadas à formatação também precisam ser melhoradas: quero colocar tudo com 8 casas decimais, que é o máximo que aparece nas carteiras, e com isso já alinhar o máximo possível com elas. Falta passar tudo para o inglês também, para ficar mais chamativo, e melhorar alguns pequenos pontos.


<center>![image.png](https://files.peakd.com/file/peakd-hive/shiftrox/EnyoXowYFPrm7hVQLp5XrK8mTW9zev7PYiMC3jhX76AC6iY2TcYHDPyexodeSbZh7de.png)</center>

Já tem um totalizador também, então neste caso nos últimos 7 dias eu ganhei um total de 20,75 Hive apenas com dividendos e transferências de outros perfis. Não tenho interesse no momento de montar algo em HBD, até porque nenhum token que eu tenho paga em HBD.

Uma coisa que sempre me deixou em dúvida é em relação à data de filtro. Na minha mente devo pegar sempre 7 dias, mas acho que não deveria contar o dia de hoje. Porém, dei uma olhada no https://hivestats.io/@shiftrox e vi que a opção de 7 dias deles filtra exatamente (hoje - 7 dias), então eles pegam, por exemplo, tudo o que aconteceu no dia 25/04/2025 até hoje, 01/05/2025.

```
Criterio += "AND tt.[timestamp] >= DATEADD(day, -7, GETDATE())" + "\r\n"

```

Não sei se para dividendos isso é bom; talvez mude para considerar (ontem - 7 dias), ou seja, trabalhar com dias já "fechados", mas enfim, ainda tem muita coisa para melhorar.

Por fim, o código para buscar a quantidade de tokens que tenho e o seu menor valor de mercado foi:

```
import requests

from hiveengine.api import Api
#from hiveengine.market import Market

def obter_menor_preco_venda(token_symbol):
    try:

        api = Api()

        sell_orders = api.find("market", "sellBook", query={"symbol": token_symbol})
        
        if not sell_orders:
            #return f"Não há ordens de venda para o token {token_symbol} no momento."
            return 0

        menor_preco = min(sell_orders, key=lambda x: float(x['price']))

        dblMenorPreco = str(menor_preco['price'])

        return dblMenorPreco
    
    except Exception as e:
        print(f"Ocorreu um erro: {e}")
        return 0
    
def get_token_balance(usuario: str, token: str) -> float:
    
    url = "https://api.hive-engine.com/rpc/contracts"

    payload = {
        "jsonrpc": "2.0",
        "id": 1,
        "method": "find",
        "params": {
            "contract": "tokens",
            "table": "balances",
            "query": {
                "account": usuario,
                "symbol": token.upper()
            },
            "limit": 1
        }
    }

    try:
        response = requests.post(url, json=payload)
        response.raise_for_status()
        result = response.json().get("result", [])

        if result:
            return float(result[0].get("balance", 0.0))
        else:
            return 0.0

    except requests.RequestException as e:
        print(f"Erro ao acessar a API: {e}")
        return 0.0

```

A get_token_balance() saiu como busca por URL; depois quero ver se consigo fazer a mesma consulta via a biblioteca do hiveengine, assim como foi feita a função obter_menor_preco_venda(), para centralizar tudo no uso da API.

Enfim, primeiros passos dados, agora é queimar neurônio para fazer mais! Primeiro você começa e depois você melhora! 


![divisor.jpeg](https://files.peakd.com/file/peakd-hive/shiftrox/23x1ckYW5T1yBx8C6Rd18NYtNsYsNdTGDvWitn6Bsn1q5DyNyNy6dyVhLcGh4zUN1vwui.jpeg)


<center>[![banner_hiver_br_01.png](https://images.ecency.com/DQmcTb42obRrjKQYdtH2ZXjyQb1pn7HNgFgMpTeC6QKtPu4/banner_hiver_br_01.png)](https://discord.com/invite/kg6ee3vtrp)</center>

___

<center><div>

<center>***Follow me on [X (Formerly Twitter)](https://twitter.com/YanPatrick_)***</center>

🔹Hive Games🔹

**🔸[Splinterlands](https://splinterlands.com?ref=shiftrox)🔸[Holozing](https://holozing.com?ref=shiftrox)🔸[Terracore](https://www.terracoregame.com/?ref=shiftrox)🔸[dCrops](https://www.dcrops.com/?ref=shiftrox)🔸[Rising Star](https://www.risingstargame.com?referrer=shiftrox)🔸**

</div></center>


</div>