# Mixtral 8x7B, best open-source mixture-of-experts AI model tested, and how Mistral mocked the EU regulators. Full transcript.

by @vikisecrets

![opensourceai7.jpg](https://images.hive.blog/DQmaLKPGEdB8h4mRsiVPcoNF2NeUvhLdkodX6aLmmFLqpaz/open-source-ai-7.jpg)

*Open-Source LLMs. AI-generated illustration (Copilot Designer)*


Today we look at the larger of the two open-source language models from the successful European AI startup Mistral.

Mixtral, note the x in Mixtral, is a Mixture-of-Experts (MoE) model with 8 groups of experts at roughly 7B parameters each. In total, Mixtral comes to 46.7B parameters, less than 8 × 7B because the experts only replace the feed-forward blocks while the attention layers are shared.

Mixture-of-Experts is a trick for combining many small language models into one larger AI. Intuitively, you can picture each of the small models as an expert in a certain area, for example coding; depending on the prompt, the AI then uses the model that can best solve the task.

Mixtral activates two experts for each token. The advantage of this approach is that fewer resources are needed to answer a query (only about 13B of the 46.7B parameters are active at a time), which is especially important for local open-source models, since not everyone has a farm of NVIDIA A100 GPUs at home.
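
To make the routing idea concrete, here is a minimal sketch of top-2 routing over 8 experts in plain NumPy. This is my own illustration, not Mistral's actual implementation; the gate and the toy "experts" are random matrices, just to show the control flow.

```python
import numpy as np

rng = np.random.default_rng(0)

N_EXPERTS, TOP_K, D = 8, 2, 16            # 8 experts, 2 active per token
W_gate = rng.normal(size=(D, N_EXPERTS))  # router weights (toy values)
experts = [rng.normal(size=(D, D)) for _ in range(N_EXPERTS)]  # toy expert layers

def moe_forward(x):
    """Route one token vector x through the 2 best-scoring of 8 experts."""
    logits = x @ W_gate                # score every expert for this token
    top = np.argsort(logits)[-TOP_K:]  # indices of the 2 highest scores
    weights = np.exp(logits[top])
    weights /= weights.sum()           # softmax over the 2 chosen experts only
    # Only the selected experts are evaluated, so roughly 2/8 of the
    # expert compute is spent per token -- that is the resource saving.
    return sum(w * (x @ experts[i]) for w, i in zip(weights, top))

print(moe_forward(rng.normal(size=D)).shape)  # (16,)
```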

GPT-4 is also rumored to be a mixture-of-experts model.

In terms of performance, Mixtral does significantly better than the smaller Mistral 7B model. On the MMLU benchmark it scores 70.6 points, already clearly at GPT-3.5 level (70); GPT-4 scores 86.5.

And all of this runs locally on your own computer. In my test, Mixtral needs around 50 GB of RAM (and 12 GB of VRAM). Generation time is around 30 seconds per prompt for shorter responses.

And finally, a cool fact about how Mistral took the EU regulators for a ride: Mixtral was not first published on a central platform (Huggingface), as is usual for open-source models, but via BitTorrent. Decentralized. The link was even officially posted on Twitter/X. Brilliant, and a statement too: open source cannot be banned or censored.

What do you think?

## Mixtral 8x7B Instruct v0.1

https://huggingface.co/TheBloke/Mixtral-8x7B-Instruct-v0.1-GGUF

https://mistral.ai/technology/#models

https://mistral.ai/news/mixtral-of-experts/

https://twitter.com/MistralAI/status/1733150512395038967

Quantization: Q5_K_M (5-bit), file `mixtral-8x7b-instruct-v0.1.Q5_K_M.gguf`
Parameters: simple-1 preset
Prompting: zero-shot

## Interface (oobabooga/text-generation-webui)
https://github.com/oobabooga/text-generation-webui
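
For readers who want to reproduce this outside the webui, a minimal sketch using llama-cpp-python (an assumption on my side; I used the webui above, not this script) that loads the same GGUF file could look like this:

```python
from llama_cpp import Llama

# Path to the quantized Mixtral build downloaded from TheBloke's repo above.
llm = Llama(
    model_path="mixtral-8x7b-instruct-v0.1.Q5_K_M.gguf",
    n_ctx=4096,       # context window; illustrative value
    n_gpu_layers=8,   # offload a few layers to the GPU, 0 = CPU only
)

# Mixtral Instruct expects the [INST] ... [/INST] prompt format.
out = llm(
    "[INST] What is the capital city of Austria? Answer only the name of the city. [/INST]",
    max_tokens=32,
    temperature=0.7,
)
print(out["choices"][0]["text"].strip())
```

Parameter values here are illustrative, not the exact settings of my test run.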

## How good is Mixtral 8x7B? Full transcript

### Q

What is the capital city of Austria? Answer only the name of the city.

### A (Mixtral 8x7B)

Vienna

### Result

Correct.

### Q

Is Bitcoin a security or commodity?

### A (Mixtral 8x7B)

Bitcoin is considered a cryptocurrency, not a security or commodity, although its legal classification varies by country. In the United States, for example, the Commodity Futures Trading Commission (CFTC) classifies bitcoin as a commodity, while the Securities and Exchange Commission (SEC) has yet to make a definitive statement. However, it's important to note that this classification may change as regulatory frameworks evolve. Always consult with a legal expert for advice tailored to your specific situation.

### Result

Correct.

### Q

What is the hive blockchain?

### A (Mixtral 8x7B)

Hive Blockchain is a decentralized information sharing platform that utilizes blockchain technology and tokenized incentives to encourage users to contribute, curate, and consume content. It's built on the Delegated Proof of Stake (DPoS) consensus algorithm, which allows for faster transaction speeds and energy efficiency compared to other consensus mechanisms. Hive Blockchain was forked from the Steem Blockchain in 2020, and it supports various decentralized applications (DApps) for social media, gaming, and more. HIVE is the native cryptocurrency of the Hive Blockchain, used for rewarding content creators, curators, and node operators.

### Result

Correct.

### Q

What is 2^8?

### A (Mixtral 8x7B)

2^8 equals 256. Would you like me to calculate something else for you?

### Result

Correct.

### Q

Can you compute the 8th root of 2^8?

### A (Mixtral 8x7B)

Absolutely! The 8th root of (2^8) is equal to 2. Since (2^8) equals 256, taking the 8th root of it gives us back 2.

### Result

Correct.
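
For the record, the exponent arithmetic behind this: the 8th root of 2^8 is (2^8)^(1/8) = 2^(8/8) = 2^1 = 2.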

### Q

Solve the puzzle: When I was 6 my sister was half my age, now i'm 70. How old's my sister? Let's think step by step.

### A (Mixtral 8x7B)

Sure! Let's solve this step by step as requested.

1) At 6 years old, your sister was half your age, which means she was 6/2 = 3 years old.

2) Now that you are 70 years old, your sister would have grown by 70-3 = 67 years.

3) Therefore, your sister's current age is 3 (her age when you were 6) + 67 = 70 years old as well.

### Result

Incorrect. Step 1 is correct, though: the sister is 3 years younger, so she is now 70 − 3 = 67.
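
A quick sanity check of the intended arithmetic (my own addition):

```python
sister_age_then, my_age_then, my_age_now = 3, 6, 70
print(sister_age_then + (my_age_now - my_age_then))  # 67
```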

### Q (Bar test)

Answer the following exam question based on US law:

A father lived with his son, who was an alcoholic. When drunk, the son often became violent and physically abused his father. As a result, the father always lived in fear. One night, the father heard his son on the front stoop making loud obscene remarks. The father was certain that his son was drunk and was terrified that he would be physically beaten again. In his fear, he bolted the front door and took out a revolver. When the son discovered that the door was bolted, he kicked it down. As the son burst through the front door, his father shot him four times in the chest, killing him. In fact, the son was not under the influence of alcohol or any drug and did not intend to harm his father. At trial, the father presented the above facts and asked the judge to instruct the jury on self-defense. How should the judge instruct the jury with respect to self-defense?

(A) Give the self-defense instruction, because it expresses the defense’s theory of the case.
(B) Give the self-defense instruction, because the evidence is sufficient to raise the defense.
(C) Deny the self-defense instruction, because the father was not in imminent danger from his son.
(D) Deny the self-defense instruction, because the father used excessive force

What is the correct answer?

### A (Mixtral 8x7B)

This is indeed a complex legal question that involves the interpretation of US law, specifically regarding self-defense. It's important to note that I'm an AI language model and not a lawyer, so my response should not be used as legal advice.

That being said, let's break down the options:

(A) This option suggests giving the self-defense instruction because it expresses the defense's theory of the case. This could be a valid point, as it's the defense's role to present their interpretation of the events. However, the instruction should also align with the legal definition of self-defense.

(B) This option suggests giving the self-defense instruction because the evidence is sufficient to raise the defense. Given the father's belief that his son was drunk and violent, coupled with his history of violence, it could be argued that this meets the requirement of "reasonable belief" of imminent danger.

(C) This option suggests denying the self-defense instruction because the father was not in imminent danger from his son. This could be debatable, as the father believed he was in danger due to his son's history of violence when drunk. However, it turns out that the son was not drunk at the time.

(D) This option suggests denying the self-defense instruction because the father used excessive force. This could also be debatable. While four shots in the chest could be considered excessive, the father might argue that he feared for his life given his son's history of violence.

Based on this analysis, it seems that option B could be the most appropriate response, as it takes into account both the father's reasonable belief of imminent danger and the evidence presented in the case. However, this is a complex legal question that would likely require interpretation by a legal professional or a judge.

### Result

Correct. I am not a lawyer; I assume it could be justified in the US under the stand-your-ground principle.

### Q (GMAT test)

Select the best of the answer choices given.

The interview is an essential part of a successful hiring program because, with it job applicants who have personalities that are unsuited to the requirements of the job will be eliminated from consideration.

This argument logically depends on which of the following assumptions?

A: A Hiring program will be successful if it includes interviews.

B: The interview is a more important part of a successful hiring program than is the development of a job description.

C: Interviewers can accurately identify applicants whose personalities are unsuited to the requirements of the job.

D: The only purpose of an interview is to evaluate whether job applicants’ personalities are suited to the requirements of the job.

E: The fit of job applicants’ personalities to the requirements of the job was once the most important factor in making hiring decisions.

### A (Mixtral 8x7B)

The argument logically depends on assumption C: Interviewers can accurately identify applicants whose personalities are unsuited to the requirements of the job.

### Result

Correct.

### Q

Write a hello world program in Commodore 64 Basic that flickers the screen.

### A (Mixtral 8x7B)

Sure! Here's a Hello World program in Commodore 64 Basic that alternates between two colors on the screen for flickering effect:

```
10 PRINT "HELLO WORLD"
20 POKE 53280,PEEK(53280)+16
30 GOTO 20
```

When you run this program on your Commodore 64, it will display "HELLO WORLD" in alternating colors on the screen due to the loop in lines 20-30 that changes the color attribute of the screen border. Note that pressing the "STOP" key will halt this infinite loop.

### Result

Not bad, but adding 16 to the border color crashes with an "illegal quantity error in line 20": the C64 only has 16 colors, and because reading address 53280 returns the unused upper bits as 1s, PEEK(53280)+16 exceeds 255, which POKE rejects.
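
A variant that masks the color back into the valid 0–15 range should flicker through all 16 border colors without the overflow (a sketch; I have not re-run it on the C64):

```
10 PRINT "HELLO WORLD"
20 POKE 53280,(PEEK(53280)+1)AND15
30 GOTO 20
```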

## Conclusion

Once again I am super impressed: Mixtral got most of the tests right, and coding is significantly better than with the smaller Mistral model. How long until open source reaches GPT-4 level?

*Live your Secrets and Hive Prosper* 🍯

xx Viki @vikisecrets

Posted using [STEMGeeks](https://stemgeeks.net/@vikisecrets/mixtral-8x7b-best-open-source-mixture-of-expert-ai-model-and-how-mistral-mocked-the-eu-regulators-full-transcript)
## Comments
@apshamilton ·
I'm running it and it's OK, but I asked it which advanced country has the highest fertility rate, and it got it wrong. It said France, at a TFR of 1.8, when the answer is of course Israel, at 2.9.


![image.png](https://files.peakd.com/file/peakd-hive/apshamilton/23t79QiJaD6ZxoEw1za2nVMz7VZLWAwu5caJ4TYFbV6umwy2i4HZn6KfKnE7FGkt2muzT.png)

@vikisecrets ·
Haha, maybe the model is biased towards France, where it originates from 😂

@apshamilton ·
LOL. Maybe that explains it!

@chinito ·
Llamas unite! :)
👍  
properties (23)
authorchinito
permlinkre-vikisecrets-sb2smt
categoryhive-163521
json_metadata{"tags":["hive-163521"],"app":"peakd/2024.3.6"}
created2024-03-28 20:52:06
last_update2024-03-28 20:52:06
depth1
children0
last_payout2024-04-04 20:52:06
cashout_time1969-12-31 23:59:59
total_payout_value0.054 HBD
curator_payout_value0.053 HBD
pending_payout_value0.000 HBD
promoted0.000 HBD
body_length15
author_reputation189,016,225,553,791
root_title"Mixtral 8x7B, best open-source mixture of expert AI model tested, and how Mistral mocked the EU regulators. Full-transcript."
beneficiaries[]
max_accepted_payout1,000,000.000 HBD
percent_hbd10,000
post_id132,415,114
net_rshares219,690,953,483
author_curate_reward""
vote details (1)
@jemima2001 ·
This information is really great; Mistral is doing great.

@jfang003 ·
It sounds interesting, but the resources needed to run these are quite large. On a normal computer it's quite taxing on the system, as you won't be able to do much of anything else.

It does look like a decent AI.

@lichtblick ·
Mega impressive. Clever to simply sidestep the regulation ;-)

@memess ·
Llama NFT

@momins ·
The AI performed very well. Great picture.

@mxm0unite ·
It's still quite complex. Do any students use AI to do their homework?

@orionvk ·
"Mixtral was not first published on a central platform (Huggingface), as is usual, but via BitTorrent. Decentralized."

Awesome! 😁

@vikisecrets ·
That's really cool. On Huggingface or GitHub, the AI models could be deleted again at any time.

@precab ·
The US is really doing a great job in regulating work.

@rafzat ·
AI is doing very well.
Nice one!

@stayoutoftherz ·
Impressive, but for the legal question only US law was applied. Here, that would never in a million years count as self-defense, if only because of the 4 shots. If it is a European model, why the US tests?

@vikisecrets · (edited)
I'm not entirely sure either whether that would actually hold up in the US; it's a standard US bar exam question. I use the same questions so that I can compare the models. US self-defense law is probably broader than in the EU, though.

@stayoutoftherz ·
That's exactly my point: in the US the answer would be correct (I'm 100% sure of that, I know the self-defense laws), but not here (at least in Germany and Austria). Why gear the test questions only towards the US? We live here, after all; that was the actual intention of my comment.

@thebighigg ·
That is pretty cool! And all local on your PC! It may take a while, but it's decentralized, which is a huge plus in my book!

@tsnaks ·
Mistral seems to be doing great.

@youm ·
Hello, my brother. Unfortunately, I see many people admiring the high degree of artificial intelligence, but whoever designed the artificial intelligence is a human being, and even a fingernail of the human being had to be designed.