TensorFlow Multilayer perceptron


Re: TensorFlow Multilayer perceptron

Postby Antonio Linares » Fri Aug 04, 2017 5:31 am

Classifying Text with Neural Networks and TensorFlow

https://medium.com/@Synced/big-picture-machine-learning-classifying-text-with-neural-networks-and-tensorflow-da3358625601

First, create an index for each word. Then, create a matrix for each text, in which the values are 1 if a word is in the text and 0 otherwise.


https://github.com/dmesquita/understanding_tensorflow_nn
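
For example, the 0/1 matrix described above could be sketched like this in Harbour (a minimal sketch, assuming a plain space-separated split and an already built word-to-index hash; TextToVector() is just a name used here):
Code:
function Main()

   // toy word index, built by hand for the example
   local hWordToIndex := { "hi" => 1, "from" => 2, "brazil" => 3, "spain" => 4 }

   ? hb_ValToExp( TextToVector( "Hi from Brazil", hWordToIndex ) )   // { 1, 1, 1, 0 }
   ? hb_ValToExp( TextToVector( "Hi from Spain",  hWordToIndex ) )   // { 1, 1, 0, 1 }

return nil

// a 0/1 vector for one text: 1 if the vocabulary word appears in it, 0 otherwise
function TextToVector( cText, hWordToIndex )

   local aVector := Array( Len( hWordToIndex ) )
   local cWord

   AFill( aVector, 0 )

   for each cWord in hb_ATokens( Lower( cText ), " " )
      if hb_HHasKey( hWordToIndex, cWord )
         aVector[ hWordToIndex[ cWord ] ] := 1
      endif
   next

return aVector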
regards, saludos

Antonio Linares
www.fivetechsoft.com

Re: TensorFlow Multilayer perceptron

Postby Antonio Linares » Fri Aug 04, 2017 8:22 am

Creating a bitmap from a text:

https://github.com/dmesquita/understanding_tensorflow_nn

Turned into Harbour code:
Code:
#include "FiveWin.ch"

function Main()

   local hVocabulary := hb_Hash(), hWordToIndex
   local cText := "Hi from Brazil"
   local cWord, aMatrix

   // count how many times each word appears in the text
   for each cWord in TextSplit( cText )
      if hb_HHasKey( hVocabulary, Lower( cWord ) )
         hVocabulary[ Lower( cWord ) ] += 1
      else
         hVocabulary[ Lower( cWord ) ] = 1
      endif
   next

   XBrowser( hVocabulary )
   XBrowser( hWordToIndex := WordToIndex( hVocabulary ) )

   // one position per vocabulary word, initialized to zero
   aMatrix = Array( Len( hVocabulary ) )
   AEval( aMatrix, { | n, i | aMatrix[ i ] := 0 } )

   // increment the position of each word found in the text
   for each cWord in TextSplit( cText )
      aMatrix[ hWordToIndex[ Lower( cWord ) ] ] += 1
   next

   XBrowser( aMatrix )

return nil

// split a text into an array of words using the CT tokens functions
function TextSplit( cText )

   local n, aTokens := {}

   for n = 1 to NumToken( cText )
      AAdd( aTokens, Token( cText,, n ) )
   next

return aTokens

// assign a sequential index to every word of the vocabulary
function WordToIndex( hVocabulary )

   local hWordToIndex := hb_Hash()
   local n

   for n = 1 to Len( hVocabulary )
      hWordToIndex[ hb_HKeyAt( hVocabulary, n ) ] = n
   next

return hWordToIndex
regards, saludos

Antonio Linares
www.fivetechsoft.com

Re: TensorFlow Multilayer perceptron

Postby Antonio Linares » Fri Aug 04, 2017 8:28 am

Meet the Robot Writing ‘Friends’ Sequels

After I read this:
http://www.thedailybeast.com/meet-the-robot-writing-friends-sequels

I got quite curious to understand how he did it:
“It works by predicting the next letter to follow a given sequence of letters, and the predictions are determined by what it learned about language from the Friends dialogue provided,”


This seems different from what I previously posted in this thread, but it seems clear that words (and text) must be turned into a numeric "bitmap" so the neural network can process them.
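
To make that encoding concrete, here is a minimal Harbour sketch of how letters could be turned into such a "bitmap" (my assumption: a fixed alphabet, one position per letter, a single 1 marking the current letter; CharToOneHot() is just a name used here):
Code:
function Main()

   local cAlphabet := "ABCDEFGHIJKLMNOPQRSTUVWXYZ "   // assumed fixed alphabet
   local cText := "HOW YOU DOIN"
   local n

   // every prefix and its next letter is a potential training pair
   for n = 1 to Len( cText ) - 1
      ? SubStr( cText, 1, n ), "->", SubStr( cText, n + 1, 1 )
   next

   // the "bitmap" for a single letter: all zeros except one 1
   ? hb_ValToExp( CharToOneHot( "A", cAlphabet ) )

return nil

function CharToOneHot( cChar, cAlphabet )

   local aVector := Array( Len( cAlphabet ) )
   local nPos := At( Upper( cChar ), cAlphabet )

   AFill( aVector, 0 )
   if nPos > 0
      aVector[ nPos ] := 1
   endif

return aVector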

Imagine a neural network learning from existing code and writing the next code for you ;-)

I would appreciate it if you share your ideas about it :-)
regards, saludos

Antonio Linares
www.fivetechsoft.com

