
Commit ad84cf6

add changes 〰️
1 parent f590f0c commit ad84cf6

3 files changed: 5 additions & 4 deletions


chapter11/chapter11.md

Lines changed: 1 addition & 3 deletions
@@ -651,12 +651,10 @@ In this way, we have the following trafficking:
Caching with Varnish
====================

-The last piece of the production deployment puzzle is setting up caching using [Varnish Cache](https://www.varnish-cache.org/) (https://www.varnish-cache.org). This step is optional for Node.js deploys, but, like an Nginx setup, it’s also recommended, especially for systems that expect to handle large loads with the minimum resources consumed.
+The last piece of the production deployment puzzle is setting up caching using [Varnish Cache](https://www.varnish-cache.org) (https://www.varnish-cache.org). This step is optional for Node.js deploys, but, like an Nginx setup, it’s also recommended, especially for systems that expect to handle large loads with the minimum resources consumed.

The idea is that Varnish allows us to cache requests and serve them later from the cache without hitting Nginx and/or Node.js servers. This avoids the overhead of processing the same requests over and over again. In other words, the more identical requests the server has coming, the better Varnish’s optimization.

-Here's a nice [Varnish Cache video](http://youtu.be/x7t2Sp174eI) (http://youtu.be/x7t2Sp174eI) that does a good job at summarizing the tool in just less than three minutes.
-
Let’s use `yum` again, this time to install Varnish dependencies on CentOS:

$ yum install -y gcc make automake autoconf libtool ncurses-devel libxslt groff pcre-devel pkgconfig libedit libedit-devel
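
The hunk above explains the idea: Varnish answers repeated, identical requests from its cache so they never reach Nginx or Node.js. Whether a response is cacheable is usually driven by the Cache-Control header the backend sends, which Varnish by default uses to set the cached object's TTL. As a rough illustration only (not the book's code; the route, port, and 60-second TTL are made-up values), an Express.js handler could opt a response into caching like this:

// Hypothetical route: mark the response as cacheable so a fronting HTTP cache
// such as Varnish can answer repeat GET requests without hitting Node.js.
const express = require('express');
const app = express();

app.get('/products', function (req, res) {
  // Varnish derives the object TTL from Cache-Control (s-maxage, then max-age)
  // by default, so identical requests within 60 seconds can be served from cache.
  res.set('Cache-Control', 'public, max-age=60');
  res.json({ products: [] }); // placeholder payload
});

app.listen(3000);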

chapter8/chapter8.md

Lines changed: 1 addition & 1 deletion
@@ -540,7 +540,7 @@ You might wonder: Why spend time on TDD in the chapter about REST APIs? The answ
However, this is not the whole story. TDD is great when it comes to refactoring. The next section is spent changing from Express.js to Hapi. And after we're done, we can rest assured, by running the same tests, that the functionality isn't broken or changed.

-# Refactoring: Hapi RESP API Server
+# Refactoring: Hapi REST API Server

[Hapi](http://spumko.github.io) (<http://spumko.github.io>) (npm (<https://www.npmjs.org/package/hapi>) and GitHub (<https://github.com/hapijs/hapi>)) is an enterprise-grade framework. It's more complex and feature-rich than Express.js, and it's easier to develop with in large teams (<http://hueniverse.com/2012/12/hapi-a-prologue>). Hapi was started by (and used at) Walmart, which is a huge e-commerce website. So Hapi has been battle-tested at a YUGE scale.
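The hunk above makes the point that the same black-box tests can be re-run after swapping Express.js for Hapi to confirm behavior hasn't changed. As a rough sketch only (assuming a Mocha-style runner and the superagent HTTP client; the URL and route are placeholders, not the chapter's actual suite), such a framework-agnostic test might look like:

const assert = require('assert');
const superagent = require('superagent');

// The test speaks plain HTTP, so it passes unchanged whether the server behind
// http://localhost:3000 is implemented with Express.js or with Hapi.
describe('GET /collections/posts', function () {
  it('responds with a JSON array', function (done) {
    superagent
      .get('http://localhost:3000/collections/posts') // placeholder endpoint
      .end(function (error, response) {
        assert.strictEqual(error, null);
        assert.ok(Array.isArray(response.body));
        done();
      });
  });
});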

chapter9/chapter9.md

Lines changed: 3 additions & 0 deletions
@@ -100,6 +100,9 @@ To make sure you don't miss anything, here's the full source code of `ch
ws.onopen = function(event) {
  ws.send('front-end message: ABC');
};
+ws.onerror = function(event) {
+  console.log('server error message: ', event.data);
+};
ws.onmessage = function(event) {
  console.log('server message: ', event.data);
};
