I did a small experiment after watching your tutorial and the tutorial by Brainxyz.
The idea is to convert each token (a word, in my case) into a sine signal.
I take context_length word tokens and sum up their signals (adding a phase shift to represent each token's position),
then I let the neural network predict the next signal.
I use a Fourier transform to recover the predicted token, which is the strongest frequency in my case.
Hope this code will help with something :)
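Here is a minimal sketch of the encode/decode round trip the idea relies on: a token id becomes the frequency of a sine over [0, 1], and the magnitude spectrum of the FFT recovers that id as the dominant bin. The token id `42` and position `3` are made-up values for the demo.

```python
import torch

d_model = 1024
t = torch.linspace(0, 1, d_model)

# Encode token id k as a sine of frequency k over [0, 1].
k = 42
signal = torch.sin(2 * torch.pi * k * t)

# Decode: the dominant FFT bin sits at frequency k.
spectrum = torch.abs(torch.fft.fft(signal))[: d_model // 2]
print(spectrum.argmax().item())  # expect 42

# The positional shift log(1 + position) only changes the phase,
# not the frequency, so the magnitude spectrum still peaks at k.
p = torch.log(torch.tensor(1.0 + 3))  # position = 3, an assumed value
shifted = torch.sin(2 * torch.pi * k * (t + p))
spec2 = torch.abs(torch.fft.fft(shifted))[: d_model // 2]
print(spec2.argmax().item())  # expect 42
```

This is why the position shift can be added without destroying the token information: a time shift of a sinusoid is a pure phase change.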
```python
import torch
import torch.nn as nn
import torch.nn.functional as F

d_model = 1024


class MyModel(nn.Module):
    def __init__(self, d_model):
        super().__init__()
        self.d_model = d_model
        self.w1 = nn.Linear(d_model, d_model)

    def forward(self, inputs):
        # One sine per token: frequency = token id, phase shifted by
        # log(1 + position) to encode word order. These tensors are
        # constants, so they don't require grad.
        t = torch.linspace(0, 1, self.d_model)
        signal = torch.zeros(self.d_model)
        for position, token in enumerate(inputs):
            p = torch.log(torch.tensor(1.0 + position))
            signal += torch.sin(2 * torch.pi * token * (t + p))
        x = self.w1(signal)
        return x


text = """In a quiet town where whispers play
Lives a creature night and day
A feline spirit soft and sly
Underneath the moonlit sky
With eyes like orbs of gleaming gold
Stories untold ancient and old
Paws that tread on silent ground
In their steps a mystery found
Whiskers twitch in the gentle breeze
Dancing lightly among the trees
Ears that listen to the night's song
In a world where they belong
Fur as soft as the morning's dew
In shades of black white or blue
They roam the streets without a care
Grace in each step light as air
In gardens lush and fields wide
Their elegant forms do glide
Masters of the shadow's dance
In their gaze you're caught in trance
By day they bask in sunlit beams
In slumber deep chasing dreams
Of mice that scamper in their play
In the realm of night and day
In ancient times they were revered
In pyramids their forms appeared
Guardians of the secrets old
In their eyes the stories told
In alleyways and on the fence
Their mystery makes perfect sense
A creature both wild and tame
Never twice quite the same
They purr like the rolling sea
A sound of peace and mystery
A lullaby for troubled hearts
In their presence warmth imparts
With agile leap and graceful bound
They traverse their hallowed ground
In every movement there's a poem
In every silence a hidden tome
In winter's chill or summer's heat
Their resilience is quite a feat
Adapting with such ease and grace
In every season they find their place
Some say they have nine lives to live
In each one love they freely give
Teachers of the art of being
In their gaze a deeper seeing
In their eyes a galaxy spins
A universe where wonder begins
Each whisker a line of a verse
In their world no need for rehearse
They play with yarn in sheer delight
In their joy the world turns bright
Chasing shadows pouncing on light
In their games a pure delight
At times they seem to ponder deep
Secrets in their hearts they keep
Sages in a furry guise
Wisdom old and worldly wise"""

text = text.lower()
tokens = text.split()  # split on any whitespace so line breaks don't fuse words
vocab = sorted(set(tokens))
int2char = {(index + 1): char for index, char in enumerate(vocab)}  # ids start at 1
char2int = {char: (index + 1) for index, char in enumerate(vocab)}
encoded = [char2int[char] for char in tokens]

context_size = 4
train = [encoded[i:i + context_size] for i in range(len(encoded) - context_size)]
targets = [encoded[i + context_size] for i in range(len(encoded) - context_size)]

for item in range(len(train)):
    print(f"{' '.join([int2char[c] for c in train[item]])} {train[item]} -> {targets[item]} {int2char[targets[item]]}")

# Turn each target id into the sine signal the model should predict.
t = torch.linspace(0, 1, d_model)
targets = [torch.sin(2 * torch.pi * torch.tensor(target) * t) for target in targets]

model = MyModel(d_model)
optimizer = torch.optim.Adam(model.parameters(), lr=0.001)
criterion = nn.MSELoss()

for epoch in range(100):
    for i in range(len(train)):
        y = model(train[i])
        target = targets[i]
        loss = criterion(y, target)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
    print(f"Epoch {epoch} | Loss {loss.item()}")

while True:
    sentence = input("Enter a sentence: ")  # note: every word must already be in the vocab
    for i in range(300):
        context = sentence.lower()
        tokens = context.split()
        context = tokens[-context_size:]
        encoded = [char2int[char] for char in context]
        y = model(encoded)
        # Read the predicted token off the spectrum: keep only the bins that
        # map to valid token ids (1..len(vocab)), then sample from them.
        fft = torch.abs(torch.fft.fft(y))[1:len(vocab) + 1]
        prob = F.softmax(fft * 0.5, dim=0)
        prediction = torch.multinomial(prob, num_samples=1).item() + 1  # back to 1-based ids
        sentence += " " + int2char[prediction]
    print(sentence)
```
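A quick sanity check on the encoding used above: summing several phase-shifted sines (one per context token) still leaves each token visible as its own spectral peak, so the context signal does not destroy the individual token ids. The token ids below are made up for the demo.

```python
import torch

d_model = 1024
t = torch.linspace(0, 1, d_model)

# Sum one sine per context token, phase-shifted by log(1 + position),
# mirroring the model's forward pass (token ids are arbitrary here).
context = [10, 50, 120, 300]
signal = torch.zeros(d_model)
for position, token in enumerate(context):
    p = torch.log(torch.tensor(1.0 + position))
    signal += torch.sin(2 * torch.pi * token * (t + p))

# Each token still shows up as its own peak in the magnitude spectrum.
spectrum = torch.abs(torch.fft.fft(signal))[: d_model // 2]
top = torch.topk(spectrum, k=len(context)).indices
print(sorted(top.tolist()))  # expect [10, 50, 120, 300]
```

This only holds while the token frequencies stay well below d_model / 2 (the Nyquist bin) and reasonably separated; with a large vocabulary relative to d_model, spectral leakage between nearby ids would start to blur the peaks.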