[ Home ] [ wiz / dep / hob / lounge / jp / meta / games / music ] [ all ] [  Rules ] [  FAQ ] [  Search /  History ] [  Textboard ] [  Wiki ]

/hob/ - Hobbies

Video game related hobbies go on /games/


File: 1613143570126.jpg (515.77 KB, 1196x1178, 598:589, frog.jpg)

No.56871

55 posts and 9 image replies omitted.


is CLRS algorithms or Sedgewick algorithms a better read if I did a course on it years ago but didn’t learn much?




Has anyone here read through Algorithms by Cormen et al.? I bought this book thinking it would teach me how to write algorithms, but I'm in chapter 4 and so far all it talks about is the theory of analyzing and optimizing running time. I understand the importance of this, but it shouldn't be the first topic discussed in such a book. Also, should I bother doing the exercises from this book?


i know how to use curl pretty well now, for making requests and such. i still dont know jack shit about how the receiving end works

my understanding is that servers are just regular computers that have a program running that is listening to a certain port for correctly formatted requests coming from other computers. i dont understand how you can make a program that listens for these kinds of requests though. basically, i think i want to know specifically how server programs are listening for requests. in a windows environment also

i tried looking for info but all the knowledge is above me


Well, in Unix it is quite simple, and I am sure Windows is much the same, because they all use the same underlying sockets technology, afaik.
The server process runs as what we call a daemon, that is, a process detached from any terminal (so it's just the process with no front-end like a command line or GUI). It opens a socket, which is the standard interface for all network operations. A socket, just like pipes and other IPC mechanisms, is really an object (i.e. a data structure) in kernel space: a persistent, two-way communication channel, possibly reading and writing to a network device (itself a data structure in the kernel).
Back to the server software: in order to service several requests concurrently, whenever it receives a request it forks a child process to service that request, so the parent can keep listening for and servicing other requests. The functionality here is just as with any other process: the kernel has some chunk of data available for the program to read as input, the program processes it and returns some output. Sometimes the connection is persistent, sometimes it is readily dropped. I'm not sure how HTTP does this; it is in principle a TCP connection, which would have to be explicitly terminated by either end, but a server can also just serve a page and terminate. I guess implementations decide how long to keep a process listening for additional requests on the same connection, which is often the useful case (consider loading a page made of several files, then having the user navigate around the site).
As for the rest of the process, it all happens in the network logic of the kernel, which encapsulates the details so things run more or less as I just described. The server does not need to know how the kernel multiplexes the network device into ports, how TCP maintains data integrity, etc.


Finally getting the native c64 version of Turbo Macro Pro to cooperate with me. Using the 512k RAM expansion version cleared up issues where I couldn't get code to assemble if it included macros. I then struggled through figuring out how to include files; apparently they have to be exported to SEQ files before being included. Wrote some basic I/O routines like a print routine, a clear-screen routine, one that outputs a newline character, etc. Going to gradually make a connect 4 game. I even have 16-color text and various PETSCII symbols, so I can make a proper board with red and blue pieces if I so choose.


so the million dollar keyword is sockets. not understanding sockets seems like my problem as of now


If you can read C code you might want to read this one:


File: 1622037081927.png (2.35 KB, 267x195, 89:65, ClipboardImage.png)

using that as a guide i made something, though it's not in c. i think i understand it somewhat now. was able to start up tcp on both server/client, make a listening socket and wait for client to connect and then receive the message it sends

to get it working beyond my local network though, and to use my public ip, i think it will require port forwarding, and then i need to reserve my local ip to stop it changing since dhcp is enabled

this is very cool


Learning c64 programming makes me feel like I am insane or retarded (which I am). To be able to call a machine language subroutine from BASIC that I wrote in an assembler, I have to assemble to disk, reboot the system, go into a machine language monitor, load the routine into memory, exit to BASIC and then load and/or write the BASIC program that calls the routine. If you try to load more than program into memory directly from BASIC, programs clobber each other. They don't load neatly into the memory locations you would expect them to load into.


load more than one program into memory directly*


i get the fun/puzzle aspect in building low level stuff, but that just seems kinda annoying


I think there's actually a "load-to" command in the assembler that can be used to put a routine or data at a specific location without rebooting. I learn something new all the time. There are also multiple banks of memory inside the RAM expansion unit, but I don't know if they can be accessed from BASIC. One step at a time.


File: 1622126212429.jpg (2.14 MB, 1800x1800, 1:1, nitori_shinmyoumaru.jpg)

>is CLRS algorithms or Sedgewick algorithms a better read if I did a course on it years ago but didn’t learn much?
CLRS is more academic, Sedgewick is more informal. If you've already done a course on it then CLRS will be the better read. You can preview both on libgen.

>Learning c64 programming makes me feel like I am insane or retarded (which I am). To be able call a machine language subroutine from BASIC that I wrote in an assembler, I have to assemble to disk, reboot the system, go into a machine language monitor, load the routine into memory, exit to BASIC and then load and/or write the BASIC program that calls the routine. If you try to load more than program into memory directly from BASIC, programs clobber each other. They don't load neatly into the memory locations you would expect them to load into.
You're thinking too hard about this. Remember that the BASIC load function is itself just an assembly routine, built on top of KERNAL routines in the c64 ROM. The BASIC load function likely maintains its own pointer to where it thinks free memory is, like malloc/sbrk does, which is where the ambiguity is being introduced. You can read the load disassembly here https://www.pagetable.com/c64ref/c64disasm/

How can you break out of this? If the basic load function calls your own load routine then you can do what you want. Alternatively, you could use the KERNAL routines like BASIC does. If you can get bytes into memory you don't need to reboot the machine; they can be immediately executed. Now, because you're dealing with a machine with expanded memory there's an additional annoyance, but all that means is you must flip some bits in the memory controller to access the page from banked memory you want. As far as the CPU is concerned there's only one address space, limited by the physical width of the address bus.


are markov chains just a way to, for example when dealing with strings of characters, select the next char based on the previous char according to the pair frequency?


I'm realizing that the problem isn't the LOAD routine; it's the fact that BASIC frees memory the moment it thinks it is no longer needed, such as when you modify the program stored at Start Of BASIC. To get a ml program to run at all from BASIC, the bare minimum requirement is to assemble a header of bytes at *=$801 which translates to

10 SYS 4096 (or wherever the ml program actually starts)

At this point, you can actually start it with a RUN or a SYS call. The moment you try to modify the one-line BASIC program to something like

10 FOR I=0 TO 10
20 SYS 4096
40 END

the routine stored at 4096 disappears. I think the magic may lie in moving various pointers around and/or changing the header to contain a different line number.

None of this really matters unless I want to write BASIC programs that call machine language routines. The C64 Programmer's Reference Guide might explain it.


Hmm. Without the BASIC header, LOAD puts an assembly language program at $801 regardless of whether or not it belongs there. With the header, the actual assembly code gets put in the correct location. However, once you make any modifications to the BASIC program, the machine code gets displaced. BASIC thinks everything between $801 and the end of the machine language is one big BASIC program. Moving the top-of-memory pointer does nothing to prevent code from getting moved.

It does seem like writing a short assembly program to load the real code into memory is necessary, possibly in conjunction with using the top-of-memory pointer to protect the code from being eaten by BASIC.


Learned many things about BASIC today, such as how to create and delete files and write data to SEQ files. Made a short script that creates a table of pre-computed random bytes that can be included in an assembly program, and also a sprite editor. The table generator could be modified to generate tables of trig functions, for instance. The sprite editor works by editing DATA statements in a BASIC listing. A nested FOR loop can be incredibly slow in an interpreted language running on a CPU from the mid-seventies; I could maybe optimize by making the inner loop machine code. Going to make a screensaver with differently-colored circles bouncing around the screen randomly. Maybe I will try sprite multiplexing. Or try loading the sprite data from disk in machine language instead of just building it into the code.


I think I killed this thread but I will continue to blog. Have tried multiple C compilers for the c64 and they don't work; files always end up getting corrupted somehow, even with accurate disk emulation enabled. I'm using the c64 "maxi" btw, which is pretty neat and generally works as intended, but not with the C compilers. Forth, on the other hand, works pretty well. The binaries it builds are huge because it builds the compiler, forth kernel, assembler, text editor and various libraries into the binary by default; I will have to figure out how to trim things down. Durex forth is, according to its creator, "50 times faster than BASIC", which puts it on the same level as C (about half as fast as assembly). It's nice to have a "high-level" language that doesn't poke along (pun intended) like a horse and buggy.


It's not you who killed the thread. Wizchan is dead. Simple as.

Bracing for sage-wizzie to scold me with his usual grumpy snark.


File: 1622760993756.jpg (35.05 KB, 854x530, 427:265, gendo.jpg)

I've been reading your posts and will post eventually. I've been dabbling in assembly for a few months while studying comp sci. I have a c64 emulator set up but I haven't used it yet for anything other than trying out c64 software.


c64 programming is rewarding but also maddening. I got the C compiler to work by switching to PAL mode, but it only works with accurate disk emulation, which is painfully. To compile and link a program, you must insert the compiler disk, then the source disk, then the compiler disk again, then the object disk, then the compiler disk again, then the object disk, then the library disk, then the binary disk. I might be able to eliminate a few steps by using the RAM Expansion Unit and a single disk for source/object/binary, but I think it will fuck it up somehow. All this for a program that in the end will run at half the speed of an assembly program or slower, because the computer doesn't have enough memory to optimize code.

Forth is interesting but also kind of stupid. A function that evaluates the expression "9a^2 - ba" is written "over 9 * swap - *" and it is essentially a sequence of operations on a stack. I can't figure out how to make my programs launch automatically which means that they have to be invoked through the interpreter which means I can't strip the interpreter out of the binary which means that my program has a debug console built into it that can't be removed.



painfully slow*


Hmm, I think I will put the c64 in the closet and use the table it sits on to draw or something instead. Better use of space and time. Antiquated technology is one thing but shit emulation is something else entirely. Poor investment of money. I get what I deserve buying an emulator box.


Does anyone here work as a programmer and do things outside of work, or are you doing it solely as a hobby?


I went to college to learn programming but I do not work. I squander my limited abilities on pet projects that rarely get completed.


I used to like programming for its own sake, but now I see it as a tool to get a job done, and that job is to automate some task. It might be fun to model some scenario, but ultimately it's a means to an end.


Welp. Turns out you can load machine language routines into their correct memory locations with LOAD "<program name>",8,1 as opposed to LOAD "<program name>",8. If you omit the 1, the program always begins at start of BASIC and ends at the end of the range it was assembled into.


I finished a masters degree about 3 years ago then did nothing


You can't bypass google captcha unless you pay a team of pajeets to solve them by hand, as far as I'm aware. Even then, it's tricky.

On old style captchas that were just typing words, you could try to clean it up, then put it through OCR, or more reliably, just save the image, then have a hivemind of people solve them for 1-2 cents each. If two or more pajeets give the same answer, you can be 99% sure it's correct.

But google captchas are more sophisticated, they make you select certain images and also require you to act like a real user, so any sudden, robotic movements of the mouse, doing things too fast or too slow, or just not having a real browser signature, immediately flags your IP. Also, if you use known VPN proxies or TOR, it makes their captchas more difficult.


it's doable without image recognition


kinda, you have to block/clear cookies and cache every time you do it, maybe change user-agent and block trackers. your browser is easily fingerprintable, although privacy.resistFingerprinting on firefox would mitigate that to some extent.


Well, I'm no expert by any means, but you could take a look at this book automatetheboringstuff.com and after that learn how to control the selenium engine. Or you could learn JavaScript and write some scripts for violentmonkey that run in the browser, along with some extensions like Buster.


if what >>58437 said is true, just have addon buttons for clearing cookies, cache, and setting a new random user agent so you can click them. you can probably get away with just stuff like autohotkey or autoit to automate clicks and entering text.



Found this website I was looking for explaining display servers. I know what I'll be doing all day today.


Now I'm in a course to learn programming, I hope that's not off topic because it isn't a hobby in my case.
I learnt quite a bit 10 years ago before I dropped out of university.
I've been keeping up with the course pretty well but I haven't done last week's assignment which had a deadline yesterday.
I wish it was in a physical location instead of online "school". They normally have a location but I was extremely unlucky and happened to start when it was online.
Now it will require a lot of discipline which is exactly what went wrong in the past.


File: 1626095917020.gif (194.09 KB, 500x600, 5:6, preview.gif)

Lately I've been working on a flashcard program to learn Japanese after being disappointed with how big and slow anki was. It's a 500 line C program using SDL: https://github.com/wusho/flash

It's really made for loonix users but there's a precompiled version released for wangbows if any wiz wants to test it.


File: 1627222981564.png (10.85 KB, 600x700, 6:7, 2021-07-25.png)

I made a new wangbows release that includes the Core 2k deck: https://github.com/wusho/flash/releases
You can use it either by changing your config or by running `flash -d data/core2k.txt'


File: 1627519685366.jpg (1.42 MB, 1811x2245, 1811:2245, 1627510351697.jpg)

I have momentarily regained the clarity required to program. I probably have at most two weeks before I begin to deteriorate again.


When "Flowers for Algernon" is a documentary.


File: 1627842002943.jpg (131.62 KB, 1092x1039, 1092:1039, wizblock.jpg)

My first completed C64 game. Uses some cool 6502 features like binary-coded decimal arithmetic for keeping score. The blocks were originally black and white, but I found that I could easily give them color by switching to extended background color mode and inserting a few instructions here and there. I fixed an issue with the ball carving tunnels through the blocks by raycasting in four directions. With little modification, I could optimize the code to perform collision detection only when a ray intersects a section of the screen that could potentially contain a block.


Nice. Did you make your own bitmap font?


wow that’s pretty awesome wiz, I always wanted to do something like that, was it hard, how long did it take


I made the ball and paddle sprites and copied an edited version of the character ROM into RAM so that I could have graphics for the blocks (each block is just two text characters next to each other). You are limited to only 64 possible characters in extended color mode.

I spent weeks beforehand learning about graphics programming and input and just getting used to using an assembler. When I actually sat down to do it, the time taken was probably 4-5 days. It's not as hard as you might think if you already have prior CS knowledge.


File: 1627933387255.jpg (44.13 KB, 640x401, 640:401, pacman.jpg)

Spent some time painstakingly recreating the maze from Pacman using PETSCII characters.


It has taken an embarrassing amount of time to make pacman move halfway close to the way he's supposed to move. He stays centered in the passageways, doesn't clip through walls, doesn't eat holes through walls, and moves at between 1 and 5/4 of a pixel per frame (you can have fractional move speeds without floating point, fixed point or anything other than plain integers). Everything is programmed in aside from a high score screen, sound and the ghosts. There are people who can make a game like this in assembly in an afternoon, which makes me die inside a little.


I'm getting into coding, using freecodecamp to learn. I'm pretty new, and I know this might sound stupid, but I'm wondering what plagiarism means when it comes to coding. I know what it means for writing, but coding? Isn't it mostly problem solving? If you want to build a motor you have to use copper wire and magnets (or metal), whatever needs to be used to make the thing work; I suppose you are plagiarizing Edison or Tesla. I get that it's blatantly stealing when it could've been done a different way, but if you're using the same programming language, how do you not use the same stuff to make a similar or identical product? I don't know, am I being an idiot? Do I just not understand coding enough to tell the difference? What's going on? Can someone explain.


depends who you ask

in the old days there used to be books filled with coding solutions and algorithms, and having bought the book you were allowed to just type it in and use their solutions

most people do that still in the form of copy/pasting stack overflow or other documentation sources, they modify things to fit their application

i dont think anyone has a problem with using snippets. it's mainly using entire files with minor changes that irks people now. or using open source stuff without releasing your project's code. weird license things


I see, thanks.


In a university setting it depends on the situation. If you're doing a project to implement an algorithm from scratch and you just copy paste code then it defeats the point of the exercise. In real world situations it depends on the licensing of the product you are copying; it can be just like writing where you are in some absurd situation such that you have to paraphrase. You are supposed to cite sources just like writing. But
