---
license: apache-2.0
datasets:
  - HuggingFaceFW/fineweb
  - PleIAs/YouTube-Commons
  - allenai/WildChat-1M
language:
  - de
  - en
  - ja
  - fr
library_name: mlx
tags:
  - moe
  - multimodal
  - j.o.s.i.e.
---

This will be the repository for J.O.S.I.E.v4o.

Like OpenAI's GPT-4o, it is natively multimodal. It builds on the NExT-GPT architecture combined with RoPE (rotary position embeddings), RMS normalization, and a Mixture-of-Experts (MoE) design, paired with OpenAI's GPT-4o tokenizer. This is a long-term project and will take its time.
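Since the model code is not published yet, here is a minimal MLX sketch of how those building blocks (RMSNorm, RoPE, a MoE feed-forward) can fit together in a pre-norm transformer block. All class names (`MoEBlock`, `Block`) and hyperparameters below are illustrative assumptions, not the project's actual implementation:

```python
# Illustrative sketch only -- not J.O.S.I.E.'s real architecture.
import mlx.core as mx
import mlx.nn as nn


class MoEBlock(nn.Module):
    """Toy Mixture-of-Experts feed-forward block with a softmax router."""

    def __init__(self, dims: int, num_experts: int = 4):
        super().__init__()
        self.router = nn.Linear(dims, num_experts)  # gating network
        self.experts = [
            nn.Sequential(
                nn.Linear(dims, 4 * dims),
                nn.GELU(),
                nn.Linear(4 * dims, dims),
            )
            for _ in range(num_experts)
        ]

    def __call__(self, x: mx.array) -> mx.array:
        gates = mx.softmax(self.router(x), axis=-1)  # (..., num_experts)
        # Dense mixture for readability; production MoE layers instead
        # dispatch each token sparsely to its top-k experts.
        out = mx.stack([e(x) for e in self.experts], axis=-1)
        return (out * mx.expand_dims(gates, -2)).sum(axis=-1)


class Block(nn.Module):
    """Pre-norm transformer block: RMSNorm + RoPE + attention + MoE FFN."""

    def __init__(self, dims: int, num_heads: int = 8):
        super().__init__()
        self.norm1 = nn.RMSNorm(dims)  # RMS normalization
        self.norm2 = nn.RMSNorm(dims)
        self.rope = nn.RoPE(dims)      # rotary position embeddings
        self.attn = nn.MultiHeadAttention(dims, num_heads)
        self.moe = MoEBlock(dims)

    def __call__(self, x: mx.array) -> mx.array:
        # RoPE is normally applied to per-head query/key vectors inside
        # attention; rotating the full hidden state here just shows the call.
        h = self.rope(self.norm1(x))
        x = x + self.attn(h, h, h)
        return x + self.moe(self.norm2(x))


x = mx.random.normal((1, 16, 512))  # (batch, sequence, hidden dims)
y = Block(512)(x)                   # same shape out: (1, 16, 512)
```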

A UI application built around the model is also planned.

Further updates coming soon!