# Building a Hive Application as a Non-coder (AI use-case)

by @mightpossibly
In this [post](https://inleo.io/@leoglossary/leoglossary-post-hive) I'll break down how I was able to create an [application](https://inleo.io/@leoglossary/leoglossary-application) that interacts with the [Hive](https://inleo.io/@leoglossary/leoglossary-what-is-hive) [blockchain](https://inleo.io/@leoglossary/blockchain) using an LLM (ChatGPT in this particular case). The intention of the post is to bring attention to some of the possibilities currently at our feet, by showing a very specific, step-by-step approach to problem-solving using [AI](https://inleo.io/@leoglossary/leoglossary-artificial-intelligence). Hopefully you'll get some ideas of your own, or at the very least an interesting read out of it.

 https://img.inleo.io/DQmUezL4f5mN1s5T4hkhVjxwmghQPdcDv6cXHeadBeWh4uf/image.png 

---


## Background: Summaries, summaries, summaries


You may have noticed some of my [Threads](https://inleo.io/@leoglossary/loeglossary-threads) lately about summaries being out for various [podcasts](https://inleo.io/@leoglossary/leoglossary-podcast). The idea was born a little more than a month ago, after an interaction with @coyotelation, where he pointed out that, as a non-English speaker, he had a hard time following along with the ongoing Lion's Den [episode](https://inleo.io/@leoglossary/leoglossary-episode-television). Even though he really wanted to take part in the discussion and gain access to the information, the language barrier held him back.


This sparked the idea: what if I could somehow transcribe the episode and then feed the transcript to an LLM to generate a summary? This would solve @coyotelation's (and surely many others') problem, but it would also do a lot more than that: it would make valuable Hive content, previously available only in audiovisual format, indexable by search engines, not to mention LeoAI, LeoSearch and other search engines specialized in indexing Hive content.


And sure enough, with a bit of research I figured out how to download the recordings and run them through ASR (Automatic Speech Recognition) [software](https://inleo.io/@leoglossary/leoglossary-software) that, it turned out, I could run on my own system. Quite resource-intensive, but still. It worked! The software is called Whisper AI, and it takes my [computer](https://inleo.io/@leoglossary/leoglossary-computer) only about 15 minutes to transcribe a two-hour video/audio recording nearly flawlessly. I should mention, though, that it's a quite powerful computer with a correspondingly powerful graphics card.


With the first few transcriptions created, I needed to craft a prompt that would consistently generate good summaries. After hours of experimentation, I landed on something that was reasonably consistent. It still needs manual supervision and input, but it's close enough. Here's the full prompt if you're interested:


```
Your role is to analyze and summarize transcripts of Twitter Spaces live podcasts uploaded as .srt files. You will provide a detailed analysis of the entire transcript, crafting a comprehensive breakdown that includes:


1. A brief summary/abstract of the episode.
2. A bullet point list of the main topics discussed, with emphasis on the specifics, such as project names and key concepts.
3. A detailed article on each topic, elaborating on what speakers said, with a focus on specifics rather than a high-level overview, including discussions and names of specific projects mentioned during the episode. 


You should aim to present the information clearly and concisely, ensuring that the target audience, who could not attend or listen to the recording, gains a thorough understanding of the episode. Your responses should be tailored to provide insights and detailed accounts of the discussions, making the content accessible and informative.
```


No fewer than 27 podcast summaries have been posted to the @ai-summaries account since the project commenced a little over a month ago (no one can claim that Hive doesn't have an active podcasting community!).


Anyways, on to today's use-case...


---


## Summarizing a large number of [3Speak](https://inleo.io/@leoglossary/leoglossary-3speak-hive) [videos](https://inleo.io/@leoglossary/leoglossary-video)
After a few weeks of posting summaries, @taskmaster4450 approached me and asked whether it would be possible to somehow summarize his more than 2500 videos for the purpose of adding their [data](https://inleo.io/@leoglossary/leoglossary-data) to the [Hive Database](https://inleo.io/@leoglossary/leoglossary-hive-database), thus making their contents available to LeoAI.


Up until then, the summarization process was both manual and quite laborious, and my immediate thought was: forget it, no way, that's simply not viable. I have no coding experience or coding competence, and this is simply unachievable at my current level.


But then I started thinking: which specific steps would actually be required to accomplish this? And could I perhaps accomplish at least some of them with the help of AI? I tried breaking down the imagined necessary steps and came up with this*:
1. Scrape a list of all the videos in the channel, including their unique identifiers (Hive permlinks)
2. Automatically download the videos in the list
3. Run ASR on each video, outputting 1 transcript per video
4. Run each transcript through [ChatGPT](https://inleo.io/@leoglossary/leoglossary-chatgpt) using a pre-defined prompt to summarize the transcript, outputting 1 summary per transcript
5. Post each summary to its corresponding video as a [comment](https://inleo.io/@leoglossary/leoglossary-comment-hive)


*Actually, I initially only came up with the first two steps, doubting I would accomplish even those. More on the specifics of the process below.
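Sketching those five steps as stub functions makes the shape of the pipeline clear. The function bodies below are placeholders for the scripts described later in the post, not the author's actual code:

```python
# Hypothetical orchestration of the five steps above; each function stands in
# for one of the scripts developed later in this post.
def scrape_video_list(channel):          # step 1: list (video_id, title) pairs
    return [("jbonizwn", "Example video")]

def download_video(video_id):            # step 2: fetch the recording
    return f"{video_id}.mp4"

def transcribe(video_file):              # step 3: ASR (Whisper in this post)
    return video_file.replace(".mp4", ".srt")

def summarize(transcript_file):          # step 4: LLM-generated summary
    return f"Summary of {transcript_file}"

def post_comment(video_id, summary):     # step 5: reply to the video on-chain
    return {"parent_permlink": video_id, "body": summary}

def run_pipeline(channel):
    results = []
    for video_id, title in scrape_video_list(channel):
        video = download_video(video_id)
        transcript = transcribe(video)
        results.append(post_comment(video_id, summarize(transcript)))
    return results
```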


---


### Problem 1: Generate a list of the channel videos
Prompt:



> I have an Ubuntu WSL environment set up already, and node-js is all set up and ready to go, along with python3. I need help writing a [script](https://inleo.io/@leoglossary/leoglossary-script) that will scrape the entire 3speak.tv channel "https://3speak.tv/user/taskmaster4450" and list each video that has been posted to that account. The script will output a csv file that lists each video in reverse chronological order in the format video_url,video_name. An example of a specific video URL in the channel is https://3speak.tv/watch?v=taskmaster4450/jbonizwn



Output:
 https://img.inleo.io/DQmX6NDywroJYRoM5wextBQqDtcePaXAwKfijiXDt2JBdCd/image.png 


I tried running the script*, and to my surprise I was left with a shiny CSV file containing all the videos in the channel in chronological order, like so:


 https://img.inleo.io/DQmYAJgGs92rQKpc6djDQnnRdYKwkUdZCdB9eAUe2DvQDjU/image.png 
*I actually didn't know how to run the script, so I prompted for that as well: "How do I run a nodejs-script?"


But then I realized I would also need the video identifier, which is the Hive permlink, isolated, so I asked for some adjustments:


> Great. Let's build further on this. I need the outputted csv file adjusted. There must be an additional column: Before the two current columns (video_url,video_name) i need to isolate the unique video identifier part of the URL. So if we use the initial example, the first row (let's call it video_id) would simply display 'jbonizwn'


 https://img.inleo.io/DQmTsSfj2jqnYJvJsQDDNwB4C3x7eRvrJSjw763onXPYquY/image.png 


And then I realized I didn't really need the full URL of the video at all, only the identifier:
> this worked perfectly. And now that I think of it, I don't need the CSV to even display the video_url, I only need the video_id and the title. Rewrite the whole script to accommodate this


 https://img.inleo.io/DQmb21acQnmAMhcFeqvDhWCNQkcjh1po4PGUh6R8n1b5XuN/image.png 




And here's an excerpt of its final output:
 https://img.inleo.io/DQmb6BbSDmbJjvPYuec6L8SwrkhDnUnwikzRnCUpwEBk4Qw/image.png 


Problem 1 solved. On to...


---


### Problem 2: Auto-download the videos
From my initial manual summarizing, I was already familiar with the command-line tool `yt-dlp`, which I'd used to download the Twitter/X recordings in order to transcribe them. But up until this point, I'd handled one recording at a time. To handle thousands of videos, I needed to automate the process. First things first: how do I get the downloaded videos dynamically named after the links they were fetched from?


Prompt:
> using yt-dlp on Ubuntu WSL, how can I download the video on the following URL and save the video file named after the code between the two /'s, i.e. in this example "brildwtb.mp4". `https://threespeakvideo.b-cdn.net/brildwtb/default.m3u8`


 https://img.inleo.io/DQmNgykAwEiz8BprivcKaMxY6w9z9zakPgn9RX8bDrww4f6/image.png 
 https://img.inleo.io/DQma8FMKR1uTRa1XL5zECVMpjj8RjpUh9NN6x45rVoxTE64/image.png 


> Say I have a list of 100 such URLs. How could I feed this list into the script to download them all in one operation?


 https://img.inleo.io/DQmbThT2NBpzS6BM761EYU4PtajtjVGmF86JDVWJWB58D9h/image.png 


At this point I realized it would probably be a good idea to number the videos too, so it would be easy to resume downloading in case the script halted for some reason.


> I forgot I also need the output video to be prepended with a number and a dash. The number will increase by 1 for each line, but I need to be able to set in the script which number it should start counting from. So if we use our initial example and say that that example is the first line in urls.txt, and I select that the number it should start counting from is 37, our file output would be "37-brildwtb.mp4" 


 https://img.inleo.io/DQmdXZwYowzBQALBssponDEAiJRv4KTFmn49Bthi6QjTAAt/image.png 
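The numbering and naming logic described above can be sketched as follows. This only builds the `yt-dlp` command lines rather than running them, and it assumes the same CDN URL format as the example; it is not the exact script ChatGPT generated:

```python
def code_from_cdn_url(url):
    # ".../brildwtb/default.m3u8" -> "brildwtb": the segment before the file name
    return url.rstrip("/").rsplit("/", 2)[-2]

def build_download_jobs(urls, start=1):
    # One yt-dlp invocation per URL, output named "<n>-<code>.mp4",
    # counting up from a configurable start number (37 in the example).
    jobs = []
    for n, url in enumerate(urls, start=start):
        outname = f"{n}-{code_from_cdn_url(url)}.mp4"
        jobs.append(["yt-dlp", "-o", outname, url])
    return jobs
```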


And just like that, problem 2 was solved as well, resulting in a massive folder looking somewhat like this:


 https://img.inleo.io/DQmQvVZizQ44GWbo13NNX4MFTnQBtDJFnW6WFkrhNzGYdxx/image.png 


---


### Problem 3: Transcribe the videos with minimal manual input
This one turned out to be really simple. I already knew the command to transcribe a video, and I already had Whisper set up on my computer. I just needed to run the command on all the files in a single folder. Note that the command 'more' is just a simple placeholder for the correct command, which would be something like `whisper videofilename.mp4 --model medium --language en --output_format srt`


Prompt:


> On my [Windows](https://inleo.io/@leoglossary/leoglossary-windows-operating-system) computer, I have a folder containing a number of files. Using the CMD cli interface, I'd like help creating a single command that will run a specific command on each file in the folder that ends with .txt The command that should be run is "more"


 https://img.inleo.io/DQmeFphHYpwW1ZvidUft8hUgCzq9faL3b3HeLdwKpDiW3Bs/image.png 


Running this command on all the files is a different story, though. It took my computer approximately **24 hours of continuous work** to transcribe all 1700 videos.
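A Python equivalent of that batch loop might look like this. It only constructs one Whisper command per file instead of executing them (the actual solution was a one-line CMD loop, as shown in the screenshot):

```python
from pathlib import Path

def whisper_commands(folder, pattern="*.mp4"):
    # Build one Whisper CLI invocation per matching file; mirrors the single
    # command quoted above, but nothing is executed here.
    commands = []
    for f in sorted(Path(folder).glob(pattern)):
        commands.append(["whisper", str(f), "--model", "medium",
                        "--language", "en", "--output_format", "srt"])
    return commands
```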


---


### Problem 4: Generate summaries based on the transcripts


This was the problem I was most "afraid" of tackling, and it is also the step I probably spent the most time on. I had real trouble getting GPT to cooperate with me. My initial prompt was something like "I need to interact with the [OpenAI](https://inleo.io/@leoglossary/leoglossary-openai) API to automatically summarize a set of text files". But because GPT-4 is trained on data only as recent as 2021, it served me code using deprecated API calls, and I simply could not get it to work with the updated calls as per the documentation.


The solution turned out to be prompting from a completely different frame of reference: instead of "I need to interact with the OpenAI API", I approached it with "I need help developing this script based on the following function", after which I pasted in the correct API call from the OpenAI documentation. To my surprise, this worked almost flawlessly. But enough background, let's dive in:


Prompt:
>I need help with refining a python script


 https://img.inleo.io/DQmTaUQbZArHYzu6x4c82eZdGs3agnhpT5YDHW1fEX1Pt69/image.png 
 https://img.inleo.io/DQmdBQNj9uyPf6S1yqUBB9fQ6quN7HB8iq2w6WeYQ65HJep/image.png 
 https://img.inleo.io/DQmW8d97cURxXeCGMBjeRmxzSpoKHSoPwQ23h1ppkEqAkS4/image.png 


And what do you know, it worked perfectly. But the hard part was still to come. Note how I'm very particular about describing exactly what I need the script to do.


> Excellent, that worked perfectly. Let's continue. 
>
>In the folder where the script will run, there are two subfolders: "transcripts" and "summaries". The 'transcripts' folder contains an undefined number of .srt files with unique names and contents.
>
>The script will look for files in the transcripts folder  and run the response function once for each file found. These files will also serve as the input_transcript-variable  that will update each time the function is run.
>
>For each file in the transcripts folder, the output of the response function will create a .txt file in the "summaries" folder with an identical name (with the exception of the file extension), containing the output of the function


 https://img.inleo.io/DQmYZZJUmwiBLRFm3afSyLfwqLyXR952vWArLGZrgia6Ubp/image.png 


This script, too, worked exactly as intended. There were a few prerequisites I haven't mentioned, namely needing a paid OpenAI developer account with API [access tokens](https://inleo.io/@leoglossary/leoglossary-access-token) to be allowed to make the API calls. I'll note, though, that running the summaries on said 1700 transcripts only [cost](https://inleo.io/@leoglossary/leoglossary-cost) me about 5 [USD](https://inleo.io/@leoglossary/leoglossary-usd), which was a surprisingly low sum compared to what I expected.


It took about 3-4 hours for the script to complete all the summaries.
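The folder-to-folder loop described in the prompt can be sketched like this. The summarizer is stubbed out here, since the real script calls the OpenAI API with the prompt shown earlier:

```python
from pathlib import Path

def summarize_stub(text):
    # Stand-in for the real OpenAI chat-completion call (not shown in full
    # in the post); swap in the actual API function here.
    return "SUMMARY: " + text[:40]

def summarize_folder(transcripts_dir, summaries_dir, summarize=summarize_stub):
    # For every .srt in "transcripts", write an identically named .txt
    # to "summaries" containing the summary, as described above.
    out = Path(summaries_dir)
    out.mkdir(parents=True, exist_ok=True)
    for srt in sorted(Path(transcripts_dir).glob("*.srt")):
        summary = summarize(srt.read_text(encoding="utf-8"))
        (out / srt.with_suffix(".txt").name).write_text(summary, encoding="utf-8")
```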


---


### Problem 5: Posting the summaries to the blockchain
With a folder full of summaries, all that now remained was posting them to their corresponding videos. I wanted them posted as comments rather than as individual posts. It was also important to me to decline payout on the comments, to stay in tune with the stated project goals.


Prompt:
 https://img.inleo.io/DQmU6JLVmG7PaGe1dAM9DZZrjakQ3UeUywWMPKHqKYd3D2N/image.png 


 https://img.inleo.io/DQmYyvCHSmqTMSDhiuLfF77ukobLS6i9ZoBZRiETEYVnZxC/image.png 


I had a few back and forths on this one, but eventually I got it working. Which is when I posted this thread:
https://inleo.io/threads/view/mightpossibly/re-leothreads-ws2kqgt7


The script took about 2.5 hours to successfully post all the summaries to their corresponding videos. If you want to see some of the resulting summaries, check the comment section of the thread above (I dropped some links in there).
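The post doesn't show the posting script itself, but on Hive, declining payout is done by broadcasting a `comment_options` operation that caps `max_accepted_payout` at zero alongside the comment. A minimal sketch of the two operations a library such as beem or hive-js would broadcast; the permlink scheme and metadata values here are my assumptions, not the author's:

```python
def build_summary_ops(author, video_author, video_permlink, body):
    # A Hive "comment" op replying to the video, plus a "comment_options"
    # op that declines payout by capping max_accepted_payout at zero.
    permlink = f"re-{video_permlink}-summary"  # hypothetical permlink scheme
    comment = ["comment", {
        "parent_author": video_author,
        "parent_permlink": video_permlink,
        "author": author,
        "permlink": permlink,
        "title": "",
        "body": body,
        "json_metadata": "{}",
    }]
    options = ["comment_options", {
        "author": author,
        "permlink": permlink,
        "max_accepted_payout": "0.000 HBD",  # zero cap = payout declined
        "percent_hbd": 10000,
        "allow_votes": True,
        "allow_curation_rewards": True,
        "extensions": [],
    }]
    return [comment, options]
```

A signing library then broadcasts both operations in one transaction; the summaries land as replies under each video with no pending payout.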


---


## Closing remarks
If you made it all the way down here, you've probably realized I've left out a bunch of details, as this is not a tutorial on how to achieve this particular thing, but rather a demonstration of how AI can be used to solve problems like it. Breaking the problem down into simple steps is what makes it possible to actually get the program doing what you want it to do, which is what this whole article is intended to show.


If you're a coder, you're probably tearing your hair out watching such a cumbersome process play out at the hands of an amateur. And if you're a non-coder like me, you probably think this looks like pure magic... or at least somewhere in between.


I have plans to refine this whole process further and will attempt to tie several of the steps together to reduce manual input even more. But even in its current state, I have achieved results I never could have achieved manually.


Finally, if you have any questions about any of this, if you got any ideas from reading it, or anything else that may be related, I'd love to hear about it in the comment section below.


---

<center> If you found this interesting, feel free to leave a comment, upvote or reblog.

https://i.imgur.com/r815k0y.gif

Thank you for reading!</center>

---

### What is Hive?
To learn more about Hive, this article is a good place to start: [What is Hive?](https://inleo.io/@leoglossary/leoglossary-what-is-hive). If you don't already own a Hive account, [go here to get one](https://inleo.io/signup?referral=mightpossibly).

---

@leoglossary links added using [LeoLinker](https://inleo.io/@kendewitt/what-is-leolinker).


Posted Using [InLeo Alpha](https://inleo.io/@mightpossibly/building-a-hive-application-as-a-noncoder-ai-usecase)
---

## Comments
@coyotelation ·
That's why I value the importance of engagement. Through our conversation, look where we are now? You brought something very valuable to Hive and this should be widely publicized and promoted.

It is this type of work like yours that should be recognized and will undoubtedly be rewarded in different ways.

It had to be @taskmaster4450le to bring you a very laborious mission summarizing his thousands of videos. LOL

But I'm glad you managed to find an alternative to all of this.

Success on your journey my friend! You deserve all the good things that the Hive and Leo ecosystem can provide you.

May more ideas emerge from our interaction.
@mightpossibly ·
Thank you fren. This is how we grow! I have big plans to expand this concept, and more podcasts/channels have already requested summaries too. Currently processing @bradleyarrow's channel. After that @jongolson and then @selfhelp4trolls. Hopefully the list will keep growing.
@coyotelation ·
Glad to know that.  Here's to many more requests!
@old-guy-photos ·
Wow! I know nothing about coding, but it sounds like that AI is pretty amazing! I would say well done on your putting it all together as well!
@mightpossibly ·
Thank you! It's not necessary to know coding at all to use AI. If you have no experience interacting with LLMs, this is a pretty complicated use-case though – you can use it for much simpler stuff too. For instance:

https://inleo.io/threads/view/mightpossibly/re-leothreads-jkjf3zek

Highly recommend giving it a go! It's an experience worth having, however simple the use-case.
@shawnnft ·
$0.04
Great job. I'm not sure if I missed it while reading, but does it auto-post, or are you copy-pasting and posting it manually?
posted 2024-03-23 17:19:39
@mightpossibly ·
$0.02
The ones I do in the day-to-day, I post manually. The 1,700 videos described in this post are posted automatically by the script created in Problem 5.
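For readers curious what "posted automatically by a script" involves, here is a minimal sketch of the preparation step: building the fields a Hive post broadcast expects (title, body, permlink, JSON metadata). This is an illustration only – the function name, the `summary-bot/0.1` app string, and the example tag are hypothetical, and the actual broadcast call (e.g. via a Hive library such as beem) is assumed rather than shown in the original thread.

```python
import json
import re

def build_post(title: str, body: str, tags: list[str]) -> dict:
    """Assemble the fields a Hive post broadcast expects.

    Sketch only: the broadcasting step itself (signing and sending the
    comment operation with a Hive library) is assumed, not shown here.
    """
    # Hive permlinks are lowercase, with runs of other characters
    # collapsed into single dashes.
    permlink = re.sub(r"[^a-z0-9-]+", "-", title.lower()).strip("-")
    # Tags and the app identifier travel in the json_metadata field.
    json_metadata = json.dumps({"tags": tags, "app": "summary-bot/0.1"})
    return {
        "title": title,
        "body": body,
        "permlink": permlink,
        "json_metadata": json_metadata,
    }

post = build_post(
    "Episode 42: Summary",
    "Transcript summary goes here.",
    ["hive-167922"],
)
print(post["permlink"])  # episode-42-summary
```

A real script would loop this over each generated summary and hand the resulting fields to the signing/broadcast call of whatever Hive library it uses.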
posted 2024-03-23 18:07:51
@shawnnft ·
I see, that makes sense. I figured the 1,700 videos would be automatic. No way you would do all that on your own... or would you? :P
posted 2024-03-23 19:13:03
@stemsocial ·
<div class='text-justify'> <div class='pull-left'>
 <img src='https://stem.openhive.network/images/stemsocialsupport7.png'> </div>

Thanks for your contribution to the <a href='/trending/hive-196387'>STEMsocial community</a>. Feel free to join us on <a href='https://discord.gg/9c7pKVD'>discord</a> to get to know the rest of us!

Please consider delegating to the @stemsocial account (85% of the curation rewards are returned).

You may also include @stemsocial as a beneficiary of the rewards of this post to get a stronger support.&nbsp;<br />&nbsp;<br />
</div>
posted 2024-03-29 18:11:36