Burn a Linux live CD ISO onto your USB drive, then boot from it.
Next, install grub-efi-amd64 or grub-efi-ia32, depending on whether your system's UEFI firmware is 64-bit or 32-bit.
sudo apt install grub-efi-amd64
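If you are not sure which package matches your firmware, the kernel exposes the UEFI word size under /sys/firmware/efi. A small sketch (the grub_pkg helper name is just for illustration):

```shell
# Map the UEFI word size (64 or 32) to the matching GRUB package.
# grub_pkg is a hypothetical helper, used only for this example.
grub_pkg() {
    case "$1" in
        64) echo grub-efi-amd64 ;;
        32) echo grub-efi-ia32 ;;
        *)  echo unknown ;;
    esac
}

# /sys/firmware/efi/fw_platform_size contains "64" or "32" when the
# machine is booted in UEFI mode; it is absent on legacy BIOS boots.
bits=$(cat /sys/firmware/efi/fw_platform_size 2>/dev/null || echo unknown)
echo "suggested package: $(grub_pkg "$bits")"
```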
#include <stdlib.h>
#include <stdbool.h>
#include <tgmath.h>
#define max(x,y) ((x>y)?x:y)
#define half __fp16

/* Increment the first element of data0 in place. */
void E_(int* data0) {
  int val0 = data0[0];
  data0[0] = (val0+1);
}
A guide on how to activate Windows 11 Pro for free.
Upgrading is worthwhile because Pro adds features such as BitLocker drive encryption and the ability to host your device as a Remote Desktop server that can be accessed over the internet.
The answer is yes! You can switch from almost any edition to Pro completely for free!
People who already have Pro, but have not activated it, can skip to this step.
First, open CMD (Command Prompt) as Administrator using this keyboard shortcut:
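From that elevated prompt, edition-switch guides of this kind typically use Microsoft's published generic Pro key. As a sketch (the key below is the documented KMS client setup key for Pro: it switches the edition but is not a license and does not activate anything by itself):

```bat
:: Switch the installed edition to Pro using the generic key.
:: Run from an elevated Command Prompt; Windows reboots to apply it.
changepk.exe /ProductKey W269N-WFGWX-YVC9B-4J6C9-T83GX

:: Afterwards, check the edition and activation status.
slmgr /dlv
```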
#include <sys/ioctl.h>
#include <stdio.h>
#include <unistd.h>

int
main(void) {
	struct winsize ws;

	/* Ask the terminal driver for the window size of stdin. */
	if (ioctl(STDIN_FILENO, TIOCGWINSZ, &ws) == -1) {
		perror("ioctl");
		return 1;
	}
	printf("lines %d\n", ws.ws_row);
	printf("columns %d\n", ws.ws_col);
	return 0;
}
scph5500.bin 26-Aug-2018 20:47 512.0K
scph5501.bin 26-Aug-2018 20:47 512.0K
scph5502.bin 26-Aug-2018 20:47 512.0K
import express from 'express';
import { OpenAI } from 'openai';

const app = express();
const openai = new OpenAI({
  apiKey: 'API_KEY' // replace with your OpenAI API key
});

app.use(express.json());
The problem with large language models is that you can’t run them locally on your laptop. Thanks to Georgi Gerganov and his llama.cpp project, it is now possible to run Meta’s LLaMA on a single computer without a dedicated GPU.
There are multiple steps involved in running LLaMA locally on an M1 Mac after downloading the model weights.
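As a rough sketch, the steps look like the following (script names, model file names, and flags have changed between llama.cpp releases, so treat this as illustrative rather than exact):

```shell
# Illustrative only: exact names and flags vary by llama.cpp version.
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
make                                  # plain CPU build; no GPU needed

# Convert the downloaded weights to llama.cpp's format and quantize
# them to 4 bits so the 7B model fits in laptop RAM.
python3 convert.py models/7B/
./quantize models/7B/ggml-model-f16.gguf models/7B/ggml-model-q4_0.gguf q4_0

# Run an interactive prompt against the quantized model.
./main -m models/7B/ggml-model-q4_0.gguf -p "Hello, LLaMA" -n 128
```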
Whenever I want to create pull requests to a repo that I don't have write access to, I:
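A common fork-based workflow for this situation looks like the following sketch, with placeholder names (ME, OWNER, REPO, my-fix) standing in for real ones:

```shell
# Fork the repo on GitHub first, then:
git clone https://github.com/ME/REPO.git                  # clone your fork
cd REPO
git remote add upstream https://github.com/OWNER/REPO.git # track the original
git checkout -b my-fix                                    # topic branch for the change
# ...edit files, git add, git commit...
git push -u origin my-fix                                 # push the branch to your fork
# finally, open a pull request from ME:my-fix against OWNER's default branch
```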