Making a new programming language for education

Update…

Expression engine complete. It was a stupid mistake where I was treating the function name as an expression start while still tokenizing the opening ( as well.

So the problem WAS in the tokenizer after all. Finally, I can move on to actual full line processing/execution and variable assignments.

Though I also just belted out a couple of test constants: PI and Deg2Rad.

Deg2Rad=PI/180. You multiply degrees by that to turn them into radians… though since I’m NOT going to make SIN/COS/Arctan use radians, I guess that one’s kinda pointless.

Still, it’s really cool to see it finally evaluate:


Enter expression or blank line to quit >64*sin(45*pi/180)+160
result: 205.25483400

From here it should be smooth sailing; I actually consider that the hardest part of writing an interpreter. It even has error handling:


Enter expression or blank line to quit >160+64*sin(45*pi/180
processing: 160+64*sin(45*pi/180
Error in Expression: Closing Bracket Missing
  160+64*sin(45*pi/180
                     ^

That’s caught at the tokenizer, so the actual interpreter would never even get that code to try to run it. In the final version I plan for the EDITOR to not allow you to add such a line without fixing the error first.

I think people would hate that…but I would like it! (If that’s any consolation)
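For the curious, the bracket sanity check is about the simplest part of the whole tokenizer. Boiled down to a throwaway standalone toy in the implementation language it’s roughly this – NOT the real code (the real one works on the token stream and tracks a lot more than parentheses), just the general shape of it:

program BracketCheck;

{ Toy version of the tokenizer's bracket check: scan the line, track
  nesting depth, and point a caret at the problem.  A stray ')' gets
  flagged where it sits; a missing ')' gets flagged at the end of the
  line, since that's where we gave up waiting for it. }

var
  line: string;
  i, j, depth, errorAt: integer;

begin
  Write('Enter expression or blank line to quit >');
  ReadLn(line);
  while line <> '' do
  begin
    depth := 0;
    errorAt := 0;
    for i := 1 to Length(line) do
      if line[i] = '(' then
        Inc(depth)
      else if line[i] = ')' then
      begin
        Dec(depth);
        if (depth < 0) and (errorAt = 0) then
          errorAt := i;                     { first unmatched ')' }
      end;

    WriteLn('processing: ', line);
    if errorAt <> 0 then
    begin
      WriteLn('Error in Expression: Opening Bracket Missing');
      WriteLn('  ', line);
      Write('  ');
      for j := 1 to errorAt - 1 do Write(' ');
      WriteLn('^');
    end
    else if depth > 0 then
    begin
      WriteLn('Error in Expression: Closing Bracket Missing');
      WriteLn('  ', line);
      Write('  ');
      for j := 1 to Length(line) do Write(' ');
      WriteLn('^');
    end;

    Write('Enter expression or blank line to quit >');
    ReadLn(line);
  end;
end.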

The language seems pretty easy to learn. I like the keymap idea - keeping all key actions in one place would be really easy to keep organized. You could maybe define different keymaps for different sections of the code?

I think using some sort of bracket, semi-colon, etc. would be useful instead of just using carriage/line-breaks. It would also be useful as a segue into other languages that the students of your language would recognize later on.

~TehYoyo

The old line-numbered BASICs wouldn’t let you enter an invalid line, and would return:
SN ERROR?
or
Syntax Error
or even
??SNX

Depending on the flavor, so it’s not a new concept. With the old BASICs you had to re-enter the entire line; that sucked. In this case it will instead keep the line in the edit box and put an arrow or highlight at the point where the error was encountered, instead of throwing it away.

Was also thinking about hotkey functions and/or autocomplete. Personally, I hate autocomplete, but many older systems (like the Sinclairs or the TRS-80 MC10) had BASIC commands printed on the keyboard, and you hit FN+the key – or, depending on the entry state, just the key – to enter the entire command. This was the only way to enter code on a Sinclair, because with only 1K of RAM all lines were stored as just byte-sized tokens!

Notice the keymap is inside the loop – it doesn’t set the keys up as events; it’s just a replacement for a bunch of nested if statements.
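Behind the scenes that’s literally all it is: read one key, walk the entries in order, run the first match, and let anything else fall through to the next pass of the loop. If I were hand-writing the same thing on the Pascal side it’d boil down to something like this (purely illustrative – ReadKey comes from the Crt unit, and the keys/actions are just made up for the example):

program KeymapAsIfs;

{ What a keymap block amounts to at runtime: one key read per pass,
  entries tested in order, first match runs, unmatched keys fall
  through and we simply loop again. }

uses
  Crt;

var
  key: char;

begin
  repeat
    key := ReadKey;                       { grab one keypress }
    if key = ' ' then
      WriteLn('space pressed - start the game')
    else if (key = 'a') or (key = 'A') then
      WriteLn('move left')
    else if (key = 'd') or (key = 'D') then
      WriteLn('move right');
    { anything else falls through and we loop again }
  until (key = 'q') or (key = 'Q');
end.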

I’ve been ‘refining’ the syntax a bit as I start working on actual code lines and variables… this would be a more robust example:


  bufferVideo
  makeSprite player 32x32
  player.loadTiles "playerSprites.png"
  loadSound "thrust.wav",thrustSound
  
:menu
  player.hide
  clear
  stopSounds
  at 0,16
  writeCentered "Simple Game Demo"
  write
  writeCentered "Press <SPACE> to Start"
  writeCentered "or <Q> to quit"
  inputMode buffered
  renderFrame
  
:menuKeyLoop
  inputMap
    = ' '
      jump gamestart
    = 'q','Q'
      jump exitgame
  endInputMap
  jump menuKeyLoop
  
:gameStart
  clear
  player.moveTo 0,0
  player.setMomentum 0,0
  player.setAngularGravity 180,10
  player.setDrag 1
  player.show
  inputMode unbuffered
  
:mainLoop
  renderFrame
  inputMap
    = "q"
      jump menu
    = "a",arrowLeft,numberPad4,digitalLeft
      player.addMomentumX -1
    = "d",arrowRight,numberPad6,digitalRight
      player.addMomentumX 1
    = "w",arrowUp,numberPad8,digitalUp
      player.addMomentumY -1
      player.setAnimationRow 2
      call thrust
    = "s",arrowDown,numberPad2,digitalDown
      player.addMomentumY 1
      player.setAnimationRow 2
      call thrust
  endInputMap
  jump mainLoop
  
:thrust
  if thrustSound.stopped
    thrustSound.play
  else
    thrustSound.sustain
  endIf
  return

:exitGame
  stopSounds
  clear
  writeCentered "Thanks for Playing"

I’ve been arguing with myself about that one too. ROM BASIC doesn’t have them, and Python (the language the Pi folks are apparently pimping, somewhat to my surprise) and Ruby lack them too. (While I dislike both languages, that’s NOT part of why.)

Working one command per line, there’s really no reason to have ending semicolons – also I’m thinking of using the semicolon as ROM BASIC did, to separate strings, variables, etc. on print… where a comma or no semicolon was a line feed (in some versions).

test1=55
test2=60
print "test 1:";test1,"test 2:";test2

would output
test 1:55
test 2:60

It’s either that or adding something like “writeln” Pascal-style, or making them add a newline character…
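Mechanically the semicolon/comma version is trivial – the print handler just looks at the separator that FOLLOWED each item: semicolon means stay on the current line, comma (or the end of the statement) means drop down one. Quick mock-up of the idea in the implementation language, with the values hard-wired instead of pulled from the variable table:

program PrintSeparators;

{ Mock-up of the proposed print separator rules:
    ;  = stay on the current line
    ,  = line feed
  (end of statement = line feed as well).
  Values are hard-wired here; the real thing would pull them from
  the variable table. }

type
  TSep = (sepSemi, sepComma, sepEnd);
  TItem = record
    txt: string;
    sep: TSep;        { separator that FOLLOWED this item in the source }
  end;

const
  items: array[1..4] of TItem = (
    (txt: 'test 1:'; sep: sepSemi),
    (txt: '55';      sep: sepComma),
    (txt: 'test 2:'; sep: sepSemi),
    (txt: '60';      sep: sepEnd)
  );

var
  i: integer;

begin
  for i := 1 to 4 do
  begin
    Write(items[i].txt);
    if items[i].sep <> sepSemi then
      WriteLn;                    { comma or end of statement = new line }
  end;
end.

…which spits out the same two lines as the example above.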

I was even thinking of having semicolons be used as a ‘repeat last operation with different values’ delimiter, for things like:

color white
plot 2,3; 2,4; 4,4;

which would be the same as saying:
color white
plot 2,3
plot 2,4
plot 4,4

Still trying to figure out if I want to go that route, or if that’s too complex an idea. It would actually shrink the tokenized code and speed up execution of sections like that – but I could also have the tokenizer automatically condense them behind the scenes before feeding the bytecode to the interpreter.
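For what it’s worth, the expansion side of it is nearly free: grab the keyword off the front, then re-emit it once per semicolon-separated group. Throwaway sketch of that pass in the implementation language (string-level only – the real thing would operate on the token stream, and the input line is hard-wired here):

program RepeatExpand;

{ String-level sketch of the 'repeat last operation' expansion:
  'plot 2,3; 2,4; 4,4;' comes out as three separate plot statements.
  The real version would work on the tokenized form, not raw text. }

uses
  SysUtils;

var
  line, keyword, rest, part: string;
  spacePos, semiPos: integer;

begin
  line := 'plot 2,3; 2,4; 4,4;';

  { first word is the operation to repeat }
  spacePos := Pos(' ', line);
  keyword := Copy(line, 1, spacePos - 1);
  rest := Copy(line, spacePos + 1, Length(line));

  while rest <> '' do
  begin
    semiPos := Pos(';', rest);
    if semiPos = 0 then
    begin
      part := rest;
      rest := '';
    end
    else
    begin
      part := Copy(rest, 1, semiPos - 1);
      rest := Copy(rest, semiPos + 1, Length(rest));
    end;
    part := Trim(part);
    if part <> '' then
      WriteLn(keyword, ' ', part);    { plot 2,3 / plot 2,4 / plot 4,4 }
  end;
end.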

Anyone use CECIL when learning programming at school?

Looking back it was ridiculously simple, but the very basic concepts were there.

I did a bit of CECIL, then a bit of BASIC before doing COBOL (eek) at A-level - what a language to learn programming with! Sigh.

Raspberry-Pi is going with Python - is that not simple enough and sophisticated enough?

Regards,

Mike

Actually, I’ve just had another look at Python - I don’t like it for 2 reasons already:

  1. it’s type sensitive! No one should differentiate variable names by type, so why have it type sensitive?!

  2. using an equals sign for assignment - it is just plain wrong. a = a + 1 is incorrect. Clearly a does not equal a + 1 - the statement actually means assign the value of a + 1 to a. Pascal has an assignment operator: a := a + 1.

I withdraw my support for Python - use Pascal instead :slight_smile:

Regards,

Mike

About semicolons:

Huh. Maybe. I just think it’d get kind of confusing. Also, that’s a lot of whitespace - but I guess it doesn’t matter on a local level. I think if you have the enter key actively work like the end of an action, you’d have to have lines super short. Like, 10 characters per line.

~TehYoyo

That is my concern. Still playing with the idea; I’m looking at all sorts of different languages to cherry pick what I think works and eliminate what doesn’t.

Remember, given that composite video is one of my targets, 40 columns is the screen width… and to be honest I’ve never liked long lines or wrapping of code; it’s something BASIC allowed for with multiple operations per line separated by colons. I’m OK with it on things like strings or arrays, but not with logic code.

Probably why I piss so hard on the folks who stuff all their CSS properties onto one line. Illegible, error-prone mess.

Typing, particularly strict typing, is common to most advanced programming languages. Loosely typed languages are actually pretty rare; the most noteworthy being PHP.

C, Java, JavaScript, PHP, Perl – they all use = as the assignment operator, which is why such languages need == for equality tests. In fact, Wirth-family languages are the only ones I’m aware of that use a different assignment operator.

Only the latter is an algebraic use of =, and even that’s not an entirely accurate assessment, given it is an evaluation/conditional, NOT an expression. It’s a programmatic construct. Part of why I say conventional algebra and calculus may have outlived their usefulness.

Though for simple incrementing I do really prefer inc(a); a++; or a+=1; – but I’m NOT putting those in as they are ‘complex’ constructs and slightly cryptic.

… and I HATE cryptic programming languages; part of why I’m NOT a big fan of C syntax in the first place.

Besides, we all know C and Unix are a hoax.

I read your topic (about half, actually), but I find newer programming languages much easier to learn than ancient languages like C/BASIC/Fortran. Also, just like IT… it must change over time as well. For example, seeing my 18-month-old daughter unlocking an Android/iPhone (yes, both) is a wonder to see. Even though all she did is swipe a screen or press a few buttons to get to Angry Birds… it’s the equivalent of programming. Each button means something and results in an action. What I’m saying is that there should be NO new programming language for learning programming… existing programming languages are more than good enough to learn… trust me… in 20~30 years or so there will be a way of programming with Minority Report hand motions. Honestly, I believe a 12-year-old is capable of learning any programming language.

Also, for beginner learning I want to emphasize “reinventing the wheel”. I don’t believe in a monorail, or one single way, of learning programming languages. Just my 2 cents :wink:

Just because I don’t think it’s worth repeating something when it’s so well said by someone else… :wink:

It’s funny because, while I like that article, for the one he links to by Rachel Andrew about “stop solving problems you don’t yet have” my typical response is “don’t make problems” – which seems to be what her approach is about…

But then I think about “don’t make problems” and remember the tale on folklore.org of Make a Mess, Clean it up… which reminds me of one of my sayings, “rip everything apart down to the bone, then build it back up, only way to learn, only way to make things better.”

Which starts to tread into the problem of philosophy vs. practice.

But that’s irrelevant.

I think it might be a good idea to make the language case-sensitive. At the very least, for variables.

~TehYoyo

I know this will shock everyone, but case sensitivity – in programming languages, in filesystems, in general – is one of my BIGGEST pet peeves in computing. Shocking isn’t it? Me having a pet peeve?

General George S. Patton Jr
I know I’m a prima donna, {expletive omitted} I’m proud of it, I {expletive omitted} admit it! That’s what I can’t stand about Monty, the little strutting prat won’t admit it!

Jokes aside, it’s one of those things I never even encountered in programming languages my first… decade of doing it! Admittedly, most microcomputers and many mainframes prior to 1982 didn’t even have the OPTION for lower case (the Apple II didn’t… the VIC-20 didn’t, the Trash-80s didn’t… the Atari 400/800 didn’t… well, unless you count inverse video as lower case on some of those), but BASIC wasn’t case sensitive, machine language and most assemblers aren’t (mov ax,20 or MOV AX,20 – no difference), Pascal isn’t… (I’m writing this project in a non-case-sensitive language!)

It’s one of the things that alienated me from C and *nix from the start (alongside uselessly backwards editors like vi, uselessly cryptic CLIs like BASH, etc, etc… again, I’m a CP/M, OS/9, TRS-DOS, PRODOS, RSX-11 type of guy) – even when I was doing accounting apps for Xenix. It’s why I used Pascal and FoxPro under Xenix, Paradox on DOS, etc, etc… Clipper and dBase, which I spent a lot of time porting software from, weren’t case sensitive either. It’s another of those things only back-room *nix geeks ever thought was a good idea back in the day; one of those concepts that, to be honest, if not for Linux would (and probably should) have gone the way of the dodo at the same time as DEC, WANG and WYSE.

Honestly it’s one of those needlessly complicated little bits that IMHO are all about making programming HARDER than it needs to be. You know, the entire purpose of C being to perpetuate the idea that programming is difficult?

Well, when I encounter rAndoM WinBlowS CapitalisIng and Spaces %20 Every %20 freAkingPlace (and backwards \'s too) my teeth grind and I consider it a sort of laziness brought on by a popular operating system.

My husband goes even further: he’s against the whole concept of text-transform: uppercase and similar, because he believes this is CSS changing the data, and it shouldn’t be allowed to change the data. Because in the *nix world, T and t are two entirely different characters, mapped to two different places. Now, I’m more lenient than that, but to me it does matter, it IS more powerful, and it means people need to pay attention to the little things that matter (a PRO for any wannabe programmer, the ability to notice important little things) – like the fact that there is a difference between cases.

Then again I don’t personally tolerate having to work on files where someone added spaces and quotes and () and other silly things without using correct URI escapes, nor am I happy to see ^Ms everywhere in a document because someone’s system wants CR/LF everywhere. And again, I’m biased: I consider Windows to be an OS for play – things like media and games. I consider *nix machines to be used for things like work, text processing, running a server and developing. So clearly I’m biased, but if I had a kid and we couldn’t manage to shove C down her throat and moved to something simpler, it would still definitely be in the POSIX world, where little things matter and the shiny eye-candy of “oh well, I knew you meant picture.jpg when you said Picture.JPG” – the kind of relaxed standards that gave us HTML tag soup – would be forbidden.

Then again the forbidden is always the most attractive fruit. Kid would prolly be sneaking off to friends’ houses to try out that Windows thing, lawlz.

Good lord our kid would be messed up. But I guess that goes without saying.

I don’t find it more powerful, and it’s one of the things about *nix that pissed me off from the first day I dealt with Xenix some 30 years ago. T and t in grammar mean the same thing, with capitalization only used to indicate a name or the start of a sentence; it does NOT change the actual meaning of the word, it only helps to convey it. It’s one of the leading causes of mistakes in code and is why languages like Pascal are NOT case sensitive; as the old joke goes, the compiler won’t let you shoot yourself in the foot.

How is it more powerful? Seriously, what real advantage does it give you besides making things more confusing? NEEDLESSLY and pointlessly confusing at that; though needlessly pointlessly cryptic and confusing seems to be what C and *nix were designed for from day one. It’s another one of those things that seems intentionally crafted to make programming and computing more difficult than it needs to be – which is also C and *Nix’s bailiwick.

Did I mention I’d sooner hand assemble 8k of x86 machine language than deal with 50 lines of C code?

That’s funny, it pisses me off when documents LACK proper line feeds and carriage returns, BECAUSE THEY ARE TWO DIFFERENT OPERATIONS! Using just one character for both always seemed more broken to me, but then I used to work with terminals and printers that actually obeyed such things. LF should ONLY move you down a line, WITHOUT moving the cursor horizontally. CR should ONLY move the cursor to column 0, WITHOUT moving it down a line. GOD FORBID we use the ASCII characters for what they were designed for. :confused:
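Easy enough to see on anything that still honours the raw characters (a real terminal or printer)… though fair warning, most modern consoles quietly translate one or both for you, which is half my gripe in the first place. Tiny demo of the difference:

program CrLfDemo;

{ Demo of the two operations, ASSUMING the output device passes the
  raw control characters through untranslated:
    CR (#13) - back to column 0 on the SAME line, so the second Write
               overprints the start of the first;
    LF (#10) - down one line WITHOUT returning to column 0, giving the
               classic staircase effect. }

begin
  Write('ABCDEFGH', #13, '**');       { expect: **CDEFGH on one line }
  Write(#13, #10);                    { CR+LF together: a proper new line }
  Write('one', #10, 'two', #10, 'three');
  Write(#13, #10);
end.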

While I only agree with half of that – I find it very limited for development work thanks to the half-assed, needlessly convoluted editors (I STILL can’t find a text editor worth a flying purple fish for Linsux or other *nixes – gEdit is about the closest I can get to my needs), and useless for text due to the annoying kerning of FreeType (though that’s a non-issue with monospace fonts, it still makes word processing USELESS for me)… the only thing *nix excels at is being a server. Trying to use it as a desktop OS is shoving that square peg into the round hole.

Land’s sake, I can’t even get proper multiple displays to work in a useful fashion; at least not without crippling tools like Blender by killing OpenGL.

… and much of the uselessness comes from the POSIX legacy of people too lazy to drag the tools into THIS century, FreeType, and of course X11 itself with the horrible fragmentation of WMs. I cannot stand to use any *nix distro as a desktop for more than a day; I tried for a week once… couldn’t do it.

… and it’s not like the command line intimidates me, given I was using Xenix and genuine AT&T Unix some 20-30 years ago; it’s just really sad to see ZERO progress out of it… but then I couldn’t stand Unix on the PDP-11, which is why I ripped it out for a useful, non-crippled OS – RSX-11.

Which mirrors my experiences trying to use any *nix-based OS as a desktop system; it’s basically crippleware OOB… enjoy headphone jacks on laptops not killing the speakers, multiple displays refusing to let you set an order or recognize edges (or even enable all your displays), enjoy your video card performing like a model from two generations ago, enjoy networking hell every time you try to wake from sleep (to the point you’re basically forced to reboot if you want working WPA2), enjoy WMs that aren’t even as capable (apart from long filename support) as Windows 3.x… and of course the pointlessly and needlessly complex command line with the even more useless MAN pages.

As a desktop OS, *nix remains crippleware in its “pure” form! This of course is why the two REAL *nix success stories of the past decade – Android and OS X – both hide as many of the *nixisms as they can get away with from the user, and threw most of the GNU toolchain in the trash – the X11 implementations specifically.

… and that right there is the type of pointlessly stupid BS that only someone dipping into the *nix kool-aid could possibly defend. It was stupid in 1972 when most computers didn’t even HAVE the option for lower case; it’s unforgivably stupid today… and is another of those *nix legacies that, again, if it wasn’t for Linux would probably be dead and buried by now. Let’s be honest, without Linux most *nixisms would have gone the way of the dodo; hell, in the microcomputing world – the folks who won the computer revolution – things like that DID go the way of the dodo… but like a rapacious zombified screaming swamp sow it’s dragged its sorry backside back into the light.

You know the REAL computer revolution, right? The one that left the back room *nix geeks in the dust and reduced anyone that touched *nix into being irrelevant for the majority of business by the mid 90’s? … and now somehow it’s back.

The more things improve, the more people want to undo the improvements; if this is the ‘future’, I want nothing to do with it… *nix got down on its knees in front of the proverbial equine in the 70s, and hasn’t seen a major improvement in usability since… Case in point: see people actually using vi – with its uselessly cryptic command set, needlessly complicated editing, and a macro system so complex most of its users spend more time creating macros than they do writing text with it. Sure, it kicked some serious tail compared to edlin, but **** sake you might as well be using WordStar commands like it was still 1978! Enjoy your trip in the wayback machine with your boy Sherman.

English grammar and computer language, 2 different things. Otherwise the letter “t” would have one mapping to rule them all. It has at least 2.

Considering I’m a fairly stupid person, it says something when I state that case is not confusing for me. If it’s not confusing me, then it’s unlikely to be a confusing topic; on the other hand, people can’t seem to get it straight in their heads when to use a semi-colon in JavaScript, opening up a slew of bugs. You want to write a program, you need to strictly follow its syntax, and if the syntax is case-sensitive, then YOU need to adhere to that syntax.

Again, the differences between uppercase and lowercase are none of those things for me. And again, I’m supposedly a 92 IQ (what I measured when tested in school anyway).

More powerful: in at least two ways. I can separate named things by case, and regex can weed them back out again.

Now this one I could get behind, though in the unix world we left paper typewriters behind and it’s “new line”, period. Whether you start back at the left is completely dependent on your language (ltr or rtl).
But I could claim separating the down-one-line and back-to-beginning-of-line are

needlessly pointlessly cryptic

keeping in mind my IQ.

But yes, I can see the irony of it.

I liked gEdit well enough when I was slow and stupid in development, but when the time came that I needed a regex-based search-and-replace, gEdit didn’t have it (possibly has a plugin for it) and I left it behind with the other Notepad wannabes. Vim is powerful. What’d they say about its power level?? IT’S OVER 9000!

I manage. I can’t get around Windows to save my life. Partially because of the whole file system setup. Trees are kind of a simple concept to me. Whatever it is Windows uses is not. But also because I am simply not familiar with it.

Command line is as simple or as complicated as you make it. Man pages are just manuals. They tell you the options of a command and what arguments it expects. Not much different than reading how to use a new programming language.
Again, moron speaking here, and I get by with command-line. Nothing fancy, no in-line Perl scripts or funky bash stuff. Just useful things like ssh, ftp, sshfs, scp.

That they do. Grandma can point and click, anyone who needs a real tool can get to them.

zomg that’s awesome

Either because Windows couldn’t do it, or wanted you to pay through the nose for the kool-aid unix gave away for free. Lawlz. Or because everyone wanted more $$$ eye-candy$$$ and danced into Stevie Wonka’s Chinese iFactory.

Powerful things take time to learn. If it were easy, it would be Notepad. I have never created a macro, though I have made recordings (q*q command). Same for Emacs, created by the great toejam eater himself.

I am honoured to be compared to Professor Peabody.

Off Topic:

zomg looks like DreamWorks is making one! http://www.cartoonbrew.com/feature-film/downey-jr-voicing-mr-peabody-for-dreamworks.html I’m gonna cry because it has a good chance of suckage!

Nope - HS was BASIC/QBASIC (PC and Commodore 64/128) and then Pascal. College was COBOL, PL/1 (Pascal’s dirty rotten, evil illegitimate cousin) and C. Probably helped that I was in the college of business instead of science. Oh, I did do one assembly class - that was actually fun (for a limited period of time).

In some ways it’s a shame COBOL’s going the way of the dodo bird. With the exception of the odd column rules (which harkened back to punch-card days), it was the easiest for even the true numbskulls to learn.

Huh, missed this post…

Never heard of it.

Checking… pure OOP you say? So what killed Modula-3? Great… Objects are cool, they’re not the solution to every problem… Looking for example code… wow, that kinda looks like Prolog.

Speaking of dead end languages – anyone else ever use Prolog?

I never actually got to use COBOL to write programs, though I learned enough of it to port a decent amount of software to DiBOL… which was a bit nicer a language to go with the nicer RSX-11 OS. Unix with COBOL vs. RSX and DiBOL, I chose DiBOL every time.

It lacks bindings to the type of things (graphics and sound) that kids might actually want to DO with the language, and you’d need to learn so much just to get them to where they could use SDL/OpenGL/OpenAL for anything useful that it’s basically making an entire library set for it. At which point it’s only a hop, skip and a jump to making an entire language with it integrated in, instead of userland code… userland code where things like handling image copies and fancy blit operations drag it down to the performance of computers from twenty years ago.

Besides, I hate Python – NOT as much as I hate C, but close. I should like it, with the strict formatting rules and such… but it’s just ugly, half-assed whiskey tango foxtrot land for me. It also has a number of complex concepts you have to get past BEFORE you can get it to actually do anything – it’s LEGO Technic when I’m aiming to make Duplo. It’s a McFarlane Toys Spawn figure when I’m trying to make Playmobil.

I want to put into the language all the bits that made BASIC on machines like the BBC Micro, Apple II, C=64, Coco, Dragon, etc fun and interesting, while losing all the bits that made it useless for doing anything ‘commercial grade’. At the same time, I don’t want to dumb it down into being completely useless rubbish like Scratch, or too cute to actually learn anything useful from like SpaceChem.

Python doesn’t deliver that OOB. I’d probably end up writing so much code to even TRY to get Python to that point that my current project will have a faster completion time.

Current projection is late June if all goes well. Actually working up a website for it right now; hope to launch by Friday.

IQ tests are so flawed… I never felt that my 186 was justified – and I’ve met some 200+ types that I wouldn’t trust to think their way out of a paper bag… while some of the smartest people I know test well below 100. Not like 92 is drooling moron territory anyways; that’s well within the normal range.

Though admittedly, the people I know who tested out at 200+ aren’t… functional. As in they can think themselves into indecision, needing people to handle the simplest of tasks for them, like dressing or making meals. One guy I know locally reminds me of Monk; neurosis? In spades.

The laugh of it is, that terminal program you’re running BASH in to start up all your console programs – and all your console programs themselves – use escaped ANSI control codes to do what the EXISTING ASCII control characters were intended to do in terms of moving the cursor. Which of course is part of why they needed uselessly cryptic three-letter-or-less commands back at 150 baud: the terminal protocol used in the *nix world takes so many more characters to do the same job that there’s nothing left for data.
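Just to put rough numbers on that – the single-byte ASCII controls versus the common CSI escape forms the *nix terminal stack actually sends (the pairings are approximate, and this little toy exists purely to show the overhead):

program EscapeOverhead;

{ Byte-count comparison: one-byte ASCII control characters versus the
  rough ANSI/VT100 escape sequences for similar cursor movements.
  Pairings are approximate - this just illustrates the size difference. }

const
  ESC = #27;

begin
  WriteLn('carriage return: 1 byte (#13) vs ', Length(ESC + '[1G'), ' bytes (ESC [1G)');
  WriteLn('line feed      : 1 byte (#10) vs ', Length(ESC + '[1B'), ' bytes (ESC [1B)');
  WriteLn('backspace      : 1 byte (#8)  vs ', Length(ESC + '[1D'), ' bytes (ESC [1D)');
end.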

Over 9000? That’s impossible!

For me right now it’s notepad2 or nothing; Someone set us up the bomb.

With *nix they went as complicated as possible; but again, I learned CP/M, non-Microsoft DOSes and pre-Unix DEC OSes long before I ever encountered a ‘real’ Unix… and when I did, the ONLY thing that impressed me was the better security attributes in the filesystems – everything else felt like it was carefully crafted to make life difficult.

Hell, it made DROSS-DOS look good. You might not have heard of DROSS-DOS… it was a joke OS – “User Hostile Software” – where if you forgot to append -PLEASE after every command, it would randomly delete a file.

Then we’re reading different materials. I’ve never seen ANYTHING as useless in terms of instructional material as a *nix MAN page.

Actually, not true: I’ve bought particle-board furniture… with instructions that read like Jimmy James, Macho Business Donkey Wrestler. Not exactly shocking that this is the environment that produced the likes of the FSF, with their snake-oil 35k of contract-law legalese masquerading as “freedom”. …but Jimmy has fear? A thousand times no.

Oh great, I mentioned DROSS-DOS… now I’m tempted to make a BASBOL interpreter. Kitchen Table Software at its finest.

Oh yes, it was part of my course; I found it interesting how it was an ideal language for linguistics, artificial intelligence, tree structures, problem solving, etc. It was fun trying to understand the repeat-until-then-loop-back behaviour… which I found very hard to grasp :frowning: