Meaning is dependent not only on the individual but also on the context, situation, and interpretive community.

Sept. 30, 2020, 9:53 p.m.

Professional Presentation Slides

By Maurice Ticas



Using Beamer frees your mind from worrying too much about the layout and document processing of your presentation slides. If you want to easily create a structured TeX document that organizes your thoughts for public sharing, then this LaTeX document class is well worth a try.



Oct. 9, 2020, 3:35 p.m. - Maurice Ticas

Michelle Krummel has a great LaTeX Beamer tutorial. She introduces us to a very nice Beamer theme named metropolis. To use it, add \usetheme{metropolis} to the LaTeX preamble.

Jan. 11, 2021, 1:57 p.m. - MITTIGA642

Thank you!!


Sept. 3, 2020, 1:49 p.m.

Data Streams in Node

By Maurice Ticas




Web Development

How do you efficiently copy a large amount of image files to a Node server?

I've been working on a photo app for over a month now; I've named it fotoshare. I want the app to be highly concurrent while copying and renaming many image files. I'm starting with a directory of 1,385 media files with a total size of 12 GB. With 2020 being a lost year, I haven't taken many photos and have missed capturing many moments this year. Nevertheless, my media directory has grown to its current size over the past three years and will only grow larger as time passes. I've been thinking about how best to import and manage my photo assets in Node.

There are several ways to read files in Node. There is the way of reading the whole file into memory at once. There is the low-level way with read() and write(), which both use file descriptors. And there is the way of streams. Streams let us take incremental chunks of data into memory, sequentially process each chunk if needed, and write it out. We never need the entire file in memory while we do the processing.

There are a few ways to write stream-handling code in Node. You can pipe a readable stream to a writable stream using the pipe() method. The piped approach takes care of pressure that builds up in the writable stream's internal buffer. David Flanagan's JavaScript: The Definitive Guide defines backpressure in this context as a "message from the [writable] stream that you have written data more quickly than it can be handled". A benefit of piping streams is that we don't have to handle this backpressure in our own code; the pipe() implementation takes care of it for us.

I started writing the code for fotoshare on Node version 10, but Node version 12 or higher is needed to handle streams with the async for-await construct. If you need to change Node versions, use the Node Version Manager (nvm) to switch between them.

This async-for-await approach allows you to write code that looks synchronous. It is a promise-based approach that uses readable streams as asynchronous iterators. The benefit of async for-await is readability, and backpressure is handled naturally: the stream is not read faster than your loop body processes each chunk. This approach is the better fit when you need to process the chunks of a readable stream in some specific way.

As is the case with many Node APIs, using streams is heavily event-based. This is especially evident in the two modes for handling readable streams: a readable stream is either in paused mode or flowing mode, and each mode has its own API for stream handling.

The use of streams enables high input-output throughput with modest memory use. Samer Buna gives an excellent concrete example of the efficiency achieved when using streams to write a server that serves large static files. You won't be disappointed by his freeCodeCamp contribution titled Node.js Streams: Everything you need to know.


July 10, 2020, 12:08 p.m.

Web Development

By Maurice Ticas



Fast forward to today and we have a very large JavaScript community of web developers. I'll describe my journey into this web development community and share what I find to be the good tools for getting the job done.

