Compare commits

...

349 Commits

Author SHA1 Message Date
Gareth Jones
a136404021 bumping the version 2013-09-24 07:33:50 +10:00
Gareth Jones
d1c6ad6f39 extra debug for failing module load 2013-09-24 07:33:33 +10:00
Gareth Jones
aad2bb1c2e improved test coverage 2013-09-16 07:59:57 +10:00
Gareth Jones
5c13469bf6 added section on peer dependencies 2013-09-15 14:46:47 +10:00
Gareth Jones
491c2709e7 changed the way appenders are loaded, so that they don't need to include log4js as a direct dependency 2013-09-15 14:37:01 +10:00
Gareth Jones
245b2e3b1a improved the readme 2013-09-13 08:17:22 +10:00
Gareth Jones
6dabcf5ee5 added to-dos to changes list 2013-09-12 07:23:11 +10:00
Gareth Jones
3142497e98 bumped streamroller version 2013-08-29 22:36:05 +10:00
Gareth Jones
09d4300875 removed async, semver, readable-stream deps 2013-08-29 22:23:32 +10:00
Gareth Jones
80f8f5a795 added streams, date-format changes 2013-08-29 22:18:34 +10:00
Gareth Jones
72c4fb48db extracted streams, date-format into separate modules 2013-08-29 22:15:50 +10:00
Gareth Jones
5e144e4004 fixing some lint issues 2013-08-29 16:56:40 +10:00
Gareth Jones
46ad57b4e0 all tests converted to mocha 2013-08-29 08:49:42 +10:00
Gareth Jones
fe1f1228ca removed underscore from dev dependencies 2013-08-29 08:42:22 +10:00
Gareth Jones
d43d49d83d converted date file appender tests to mocha 2013-08-27 13:44:33 +10:00
Gareth Jones
3312724d7d migrated file appender tests to mocha 2013-08-26 22:49:12 +10:00
Gareth Jones
045b0dda2b renamed categoryName -> category 2013-08-26 22:48:50 +10:00
Gareth Jones
b6dc0b9557 migrated layout tests to mocha 2013-08-26 21:49:59 +10:00
Gareth Jones
7c0cfbdcfd converted to mocha tests 2013-08-26 16:51:29 +10:00
Gareth Jones
25e9835521 forgot to remove old vows tests 2013-08-26 15:39:44 +10:00
Gareth Jones
be1a9ca411 migrated logLevelFilter tests to mocha, changed filter api slightly 2013-08-26 15:38:40 +10:00
Gareth Jones
b2569c6d9d removed log-abspath 2013-08-26 07:58:45 +10:00
Gareth Jones
04d0113224 updated changelog 2013-08-25 12:05:02 +10:00
Gareth Jones
ac4fd2a7fc migrated tests to mocha 2013-08-25 12:04:49 +10:00
Gareth Jones
50074842ad replaced my debug lib with standard one 2013-08-25 11:55:26 +10:00
Gareth Jones
5a2771cfed moved nolog test to log4js-connect 2013-08-24 20:51:25 +10:00
Gareth Jones
3b4a30587a removed unneeded tests 2013-08-24 20:49:08 +10:00
Gareth Jones
50a8164b4b keeping track of changes in the new version 2013-08-24 20:46:36 +10:00
Gareth Jones
eabcaf8aef moved cluster support into core, removed clustered appender, multiprocess appender 2013-08-24 20:46:10 +10:00
Gareth Jones
d8cf8cb2dc removed category filter 2013-08-23 08:43:19 +10:00
Gareth Jones
9afbbb580e proven that category filter no longer needed 2013-08-23 08:42:51 +10:00
Gareth Jones
6c09a6fb71 fixed trailing comma 2013-08-23 08:42:26 +10:00
Gareth Jones
5f68db41b4 merge from master 2013-08-22 15:51:08 +10:00
Gareth Jones
731e217505 removed dequeue from dependencies 2013-08-22 15:46:42 +10:00
Gareth Jones
3018a49bde 0.6.8 2013-08-22 11:52:25 +10:00
Gareth Jones
4e6e51e9fa test config file 2013-08-22 11:47:06 +10:00
Gareth Jones
7f38837f9b removed duplicated tests 2013-08-22 11:46:55 +10:00
Gareth Jones
b70e2e6220 setGlobalLogLevel removed, so no need for this test 2013-08-22 08:48:46 +10:00
Gareth Jones
834e084e5a setLevel removed, so no need for this test 2013-08-22 08:47:29 +10:00
Gareth Jones
270ba0fcc6 added initial configuration 2013-08-22 08:44:36 +10:00
Gareth Jones
bdb56e4256 moved smtp appender to log4js-smtp 2013-08-22 08:19:51 +10:00
Gareth Jones
b12a39ac79 moved hookio appender to log4js-hookio 2013-08-22 08:14:44 +10:00
Gareth Jones
c3d3a8c363 moved gelf appender to log4js-gelf 2013-08-22 08:11:48 +10:00
Gareth Jones
7bad76d8ec no longer needed 2013-08-22 08:03:22 +10:00
Gareth Jones
a0a5a3aa99 moved connect-logger to log4js-connect 2013-08-22 08:01:57 +10:00
Gareth Jones
a5bb94a048 Merge pull request #152 from fb55/patch-1
Browserify support
2013-08-20 15:52:32 -07:00
Gareth Jones
631ca75e57 no longer needed - log4js now throws exceptions instead of accepting invalid config 2013-08-21 08:13:50 +10:00
Gareth Jones
ab1c81a61c tests covered by test/log4js-test.js 2013-08-21 08:07:47 +10:00
Gareth Jones
a0d0373480 convert level string to Level 2013-08-21 08:05:28 +10:00
Gareth Jones
eb875b6d98 mocha tests for new log4js 2013-08-21 08:04:26 +10:00
Gareth Jones
a8679aced1 simplified levels a bit, converted tests to mocha 2013-08-21 08:02:37 +10:00
Felix Böhm
7a1a895e46 browserify support 2013-08-20 18:48:27 +02:00
Gareth Jones
48dc22eb63 Merge pull request #150 from wood1986/master
layouts supports hostname and ISO8601_WITH_TZ_OFFSET_FORMAT
2013-08-20 03:50:36 -07:00
wood1986
7888381991 Update layouts.js 2013-08-18 01:43:48 +08:00
wood1986
cd286fa25f Update layouts-test.js 2013-08-18 01:39:37 +08:00
wood1986
6df4753822 Update layouts.js 2013-08-18 01:36:07 +08:00
wood1986
613474eb44 Update layouts-test.js 2013-08-15 22:45:56 +08:00
wood1986
112246dd55 Update layouts-test.js 2013-08-15 22:39:59 +08:00
wood1986
069ed31759 Update layouts-test.js 2013-08-15 22:37:01 +08:00
wood1986
9e72189574 Update date_format-test.js 2013-08-15 22:30:57 +08:00
wood1986
5a167d853a Update date_format-test.js 2013-08-15 22:29:52 +08:00
wood1986
5755faa7bb Update layouts-test.js 2013-08-15 22:29:36 +08:00
wood1986
1ed026a8d9 Update fileAppender-test.js 2013-08-14 17:35:47 +08:00
wood1986
2d177d517b Update date_format.js 2013-08-13 23:04:52 +08:00
wood1986
21aebbde33 Update layouts.js 2013-08-13 23:04:11 +08:00
Gareth Jones
49892f35d3 Merge pull request #149 from mkielar/master
Clustered appender for log4js.
2013-08-08 18:04:18 -07:00
Marcin Kielar
61beac28d3 Clustered appender for log4js.
+ lib/appenders/clustered.js
+ test/clusteredAppender-test.js

Instead of using sockets (like multiprocess) or the dead and unmaintained hook.io, the Clustered appender
uses the process.send(message) / worker.on('message', callback) mechanism to transport data
between worker processes and the master logger.

The master logger takes an "appenders" array of actual appenders that are triggered when worker appenders send data.
This guarantees sequential writes to the appenders, so log messages from different workers are not interleaved within single log lines.
2013-08-09 00:04:25 +02:00
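A minimal sketch of the IPC transport described in this commit message, assuming a hypothetical 'topic' field to tag log messages (this is not the actual lib/appenders/clustered.js code):

```javascript
var cluster = require('cluster');

if (cluster.isMaster) {
  var worker = cluster.fork();
  // master side: receive serialized log events from workers and write them
  // one at a time, so lines from different workers never interleave
  worker.on('message', function (message) {
    if (message && message.topic === 'log4js') {   // 'topic' is a made-up tag for this sketch
      process.stdout.write(message.data + '\n');   // stands in for the real "appenders" array
    }
  });
} else {
  // worker side: instead of appending directly, forward the event to the master
  process.send({ topic: 'log4js', data: '[INFO] hello from worker ' + process.pid });
}
```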
Gareth Jones
c60d629608 added mocha, simplified logger by removing levels and making immutable 2013-08-08 08:56:09 +10:00
Gareth Jones
8ad1cd67e2 formatting fixes, unnecessary code removed 2013-08-05 11:40:59 +10:00
Gareth Jones
c67ab855bb Merge branch 'master' of https://github.com/nomiddlename/log4js-node 2013-08-05 11:33:23 +10:00
Gareth Jones
4905761f60 Merge pull request #119 from UniversityofWarwick/category-filter
Category excluding filter.
2013-08-04 18:25:32 -07:00
Gareth Jones
9897dcbc93 trying out weak references, don't think they're going to help 2013-08-05 11:19:53 +10:00
Gareth Jones
9c510f7705 added weak references dep 2013-08-05 07:51:21 +10:00
Gareth Jones
5bd7ce3ab9 working, except for tests which expect log levels to persist across getLogger calls 2013-08-02 15:12:04 +10:00
Gareth Jones
1e17f88ded 0.6.7 2013-08-02 11:38:34 +10:00
Gareth Jones
3b55aefe6f changed logger to not use events. everything is broken 2013-08-02 11:36:05 +10:00
Gareth Jones
d25e1abd48 Merge pull request #142 from crisply/master
Allows use of Console Appender when using with node-webkit
2013-07-14 18:32:26 -07:00
Lex
dde2e69948 Getting console appender to work with node-webkit 2013-07-10 05:07:28 -07:00
Gareth Jones
351a912a86 simplified the reload config code a little, moved the tests into their own file, improved coverage 2013-07-09 09:24:11 +10:00
Gareth Jones
c5fd75dac3 removed check on undefined configState.filename - should not happen, and is covered by the statSync anyway 2013-07-09 08:01:41 +10:00
Gareth Jones
4dd5989d27 Merge branch 'master' of https://github.com/nomiddlename/log4js-node
Conflicts:
	test/gelfAppender-test.js
2013-07-08 15:24:29 +10:00
Gareth Jones
46721465a1 Merge pull request #140 from karlvlam/master
Add custom field support to GELF appender
2013-07-07 16:17:23 -07:00
Gareth Jones
76ff7aa5fa improved coverage of date format 2013-07-08 08:51:42 +10:00
Gareth Jones
be5fa838be improved coverage of hookio appender 2013-07-08 08:46:11 +10:00
Gareth Jones
a86bed975c improved coverage of lib/log4js.js 2013-07-08 08:18:48 +10:00
Karl Lam
baaebef2ed GELF appender - test case covers custom fields, remove unused
console.log
2013-07-05 15:28:10 +08:00
Karl Lam
837d007de3 GELF appender can add customFields to config for every message 2013-07-05 11:23:59 +08:00
Karl Lam
be754f0c0e GELF appender can add custom fields 2013-07-05 10:54:31 +08:00
Gareth Jones
946b216a79 improved coverage of rolling file stream 2013-07-05 08:36:42 +10:00
Gareth Jones
508dbdadf8 improved coverage of gelf appender 2013-07-05 08:04:16 +10:00
Gareth Jones
2e7f6e5a66 improved coverage of logger 2013-07-01 08:24:29 +10:00
Gareth Jones
cbadb5fa19 improved coverage of multiprocess appender 2013-07-01 08:24:06 +10:00
Gareth Jones
c258470cda improved coverage of file appenders 2013-06-28 08:44:54 +10:00
Gareth Jones
2b070e5470 Fixed a problem when tests run in node 0.8 2013-06-28 07:55:25 +10:00
Gareth Jones
4cd546e8b3 improved coverage of baserollingfilestream 2013-06-27 08:46:18 +10:00
Gareth Jones
0e5da1d361 moved debug fn out to own module, added tests 2013-06-24 08:51:10 +10:00
Gareth Jones
fc7f686f65 improved coverage for console appender 2013-06-18 08:47:32 +10:00
Gareth Jones
4a8f0580de improved coverage for connect-logger 2013-06-18 08:47:18 +10:00
Gareth Jones
f50fab2b86 improved coverage for connect logger 2013-06-17 16:01:22 +10:00
Gareth Jones
f1c0767ca3 improved coverage 2013-06-17 16:01:03 +10:00
Gareth Jones
652888944b improved coverage for date_format 2013-06-17 16:00:42 +10:00
Gareth Jones
efc4e36317 improved coverage for layouts 2013-06-14 08:13:16 +10:00
Gareth Jones
d2f30b473f added test to improve levels coverage 2013-06-14 07:28:55 +10:00
Gareth Jones
fa179ecba2 added a delay to dateFile test, to let the filesystem catch up 2013-06-06 08:00:34 +10:00
Gareth Jones
dd25d30228 rolled back my clever map+join, because it broke the tests 2013-06-06 07:53:22 +10:00
Gareth Jones
11fe5bde5f increased test coverage for smtp appender 2013-06-05 18:30:11 +10:00
Gareth Jones
41ddf5eea7 merged util.format branch (fixes a lint error and simplifies the code) 2013-06-05 08:52:07 +10:00
Gareth Jones
81fa9c3568 removed unnecessary argument to createNoLogCondition 2013-06-05 08:38:39 +10:00
Gareth Jones
7ca517b5ed simplified createNoLogCondition 2013-06-05 08:37:27 +10:00
Gareth Jones
6368de1094 refactored pattern layout 2013-06-05 08:02:10 +10:00
Gareth Jones
94dbd22c71 reduced complex function to smaller ones 2013-06-04 08:37:36 +10:00
Gareth Jones
0a2a6c0769 don't create functions in a loop 2013-06-04 08:32:35 +10:00
Gareth Jones
5d6f00eda4 fixed all lint errors except ones which require refactoring of code 2013-06-04 08:17:36 +10:00
Gareth Jones
f998d7e81a more linting 2013-05-30 08:45:15 +10:00
Gareth Jones
46ae1a586d more linting 2013-05-30 08:26:26 +10:00
Gareth Jones
516320c79a more linting 2013-05-30 08:26:03 +10:00
Gareth Jones
40ec9e98e4 more linting 2013-05-30 08:00:04 +10:00
Gareth Jones
cc2e94cf11 more linting 2013-05-30 07:58:09 +10:00
Gareth Jones
2de838bc76 more linting 2013-05-30 07:56:28 +10:00
Gareth Jones
87dc7cf5aa more linting 2013-05-30 07:54:42 +10:00
Gareth Jones
913c748ee0 more linting 2013-05-29 08:42:09 +10:00
Gareth Jones
def0e8e371 more linting 2013-05-29 08:35:40 +10:00
Gareth Jones
20f80ff775 more linting 2013-05-29 08:29:30 +10:00
Gareth Jones
f24db59523 more linting 2013-05-29 08:28:35 +10:00
Gareth Jones
07869b915f more linting 2013-05-27 08:17:32 +10:00
Gareth Jones
2cd27e4293 more linting 2013-05-27 08:15:57 +10:00
Gareth Jones
3d11cbc0ad more linting 2013-05-27 08:14:51 +10:00
Gareth Jones
e5dba219d1 more linting 2013-05-27 08:11:24 +10:00
Gareth Jones
9853e13429 more linting 2013-05-27 08:01:00 +10:00
Gareth Jones
4fd138f87d more linting 2013-05-27 07:48:29 +10:00
Gareth Jones
1ad4977aec more linting 2013-05-27 07:44:59 +10:00
Gareth Jones
7cb7e6df72 more linting 2013-05-27 07:41:16 +10:00
Gareth Jones
2192a094b6 more linting 2013-05-26 17:21:39 +10:00
Gareth Jones
6a9441d261 more linting 2013-05-26 17:15:10 +10:00
Gareth Jones
50b676dec5 more linting 2013-05-26 16:51:46 +10:00
Gareth Jones
8b3c036245 more linting 2013-05-26 16:41:31 +10:00
Gareth Jones
b356dec318 Getting my lint on (via bob) 2013-05-25 14:00:06 +10:00
Gareth Jones
8383dfc4f4 0.6.6 2013-05-25 13:10:46 +10:00
Gareth Jones
4e8fb26099 Missed out the smtp test 2013-05-25 13:08:43 +10:00
Gareth Jones
8492519e3b Fixing issue #137 2013-05-25 13:04:48 +10:00
Gareth Jones
fdc9d253c9 0.6.5 2013-05-16 16:57:25 +10:00
Gareth Jones
18e21ca473 Merge branch 'master' of https://github.com/nomiddlename/log4js-node 2013-05-16 16:55:47 +10:00
Gareth Jones
ab8c7ed89d Merge pull request #136 from issacg/dontalwaysrename-bug
Dontalwaysrename bug
2013-05-15 23:52:57 -07:00
Gareth Jones
aa4f7c071b Merge pull request #135 from jmav/master
auto level detection from @jmav
2013-05-15 23:52:27 -07:00
Issac Goldstand
dc632f4705 Fixes bug introduced in github issue #132 where file rolling needs to be handled differently for alwaysIncludePattern streams 2013-05-11 23:01:28 +03:00
Jure Mav
ac6284add1 Added automatic level detection to connect-logger, depends on http status response.
Update of connect logger example code, compatible with express 3.x
2013-05-11 16:17:23 +02:00
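A short usage sketch of the 'auto' level this commit adds, matching the connect-logger example added elsewhere in this compare (3xx responses log at WARN, 4xx/5xx at ERROR, everything else at INFO):

```javascript
var log4js = require('log4js');
var express = require('express');
var app = express();
var logger = log4js.getLogger('http');

// 'auto' picks the log level from the HTTP status code of each response:
// 3xx -> WARN, 4xx and 5xx -> ERROR, otherwise INFO
app.use(log4js.connectLogger(logger, { level: 'auto', format: ':method :url :status' }));

app.get('/', function (req, res) { res.send('hello world'); });
app.listen(5000);
```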
Issac Goldstand
2da01cc611 Fixes bug introduced in github issue #132 where renaming a file to itself can cause an unhandled error 2013-05-09 13:09:59 +03:00
Gareth Jones
ad8229145e Merge pull request #133 from issacg/baseFileRollingStream-bug
Fixes bug in detecting empty options (see issue #132 on github)
2013-05-08 02:24:02 -07:00
Issac Goldstand
8c12c948d9 Fixes bug in detecting empty options (see issue #132 on github) 2013-05-08 12:05:32 +03:00
Gareth Jones
af6ae7af98 new version for alwaysIncludePattern 2013-05-05 14:01:40 +10:00
Gareth Jones
936ad4da8e fixed tests broken by alwaysIncludePattern 2013-05-05 13:44:01 +10:00
Gareth Jones
097ae3d7f1 Merge branch 'alwaysIncludePattern' of https://github.com/issacg/log4js-node into isaacg-alwaysIncludePattern 2013-05-04 16:10:02 +10:00
Issac Goldstand
04de4ed8d3 fix OS-specific endline mucking test results (:-O not everyone uses linux?!?!) 2013-05-03 11:14:28 +03:00
Issac Goldstand
29b02921b6 add option alwaysIncludePattern to dateTime appender to always use the filename with the pattern included when logging 2013-05-02 14:56:33 +03:00
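A hedged configuration sketch of the option added here, based on the fields visible in lib/appenders/dateFile.js further down (filename, pattern, alwaysIncludePattern); the 'dateFile' type name and the appenders-array config shape are assumptions:

```javascript
var log4js = require('log4js');

log4js.configure({
  appenders: [
    {
      type: 'dateFile',                // assumed type name for the date rolling file appender
      filename: 'app.log',
      pattern: '.yyyy-MM-dd',
      // write straight to e.g. app.log.2013-05-02 from the start,
      // instead of renaming app.log when the date rolls over
      alwaysIncludePattern: true
    }
  ]
});

log4js.getLogger().info('goes to the dated file');
```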
Gareth Jones
48ed5d1222 Removed the warning about node 0.10 2013-04-11 22:34:49 +10:00
Gareth Jones
7844b0d2e4 0.6.3 2013-04-11 22:29:13 +10:00
Gareth Jones
8b49ba9f3d added node 0.8 to travis config and package.json 2013-04-11 21:49:08 +10:00
Gareth Jones
ed7462885f backporting new streams to node 0.8 for issue #129 2013-04-11 21:45:16 +10:00
Gareth Jones
36c5175a55 0.6.2 2013-04-02 12:02:47 +11:00
Gareth Jones
22160f90b3 fixed the multiprocess tests 2013-04-02 11:59:45 +11:00
Gareth Jones
73437ecb40 Merge branch 'master' of https://github.com/dsn/log4js-node into dsn-master 2013-04-02 11:34:25 +11:00
Gareth Jones
107e33c0d1 merged in change from @vojtajina for pull request #128 2013-04-02 10:18:25 +11:00
Gareth Jones
6352632fb2 fix version of node supported 2013-04-02 10:02:48 +11:00
Gareth Jones
0544342e9f Merge pull request #128 from Dignifiedquire/master-engine
Fix node engine in package.json
2013-04-01 15:42:41 -07:00
Friedel Ziegelmayer
1d1153d32f Fix node engine in package.json 2013-04-01 23:00:26 +02:00
Gary Steven
e58cf201ca Updated for Node 0.10.x
net.createServer no longer emits 'connect' event
2013-03-30 03:23:58 -07:00
Gareth Jones
83271e47fc Merge pull request #125 from jimschubert/master
Allow for somewhat standard debugging calls
2013-03-24 19:35:24 -07:00
Jim Schubert
f3271a3997 Add standard debug conditional function
2013-03-23 18:50:13 -07:00
Gareth Jones
4b7cf589a2 Fixing the wiki links (issue #124) 2013-03-20 19:47:32 +11:00
Gareth Jones
c8f401c47d fixed travis node version format 2013-03-20 14:58:56 +11:00
Gareth Jones
ecbf41bc83 updated readme with node 0.10 info 2013-03-20 09:16:42 +11:00
Gareth Jones
65e490cbd2 Fixes for version v0.10 streams, breaks log4js for older versions of node 2013-03-20 09:14:27 +11:00
Nick Howes
eb21e10208 Category excluding filter.
This filtering appender allows you to choose some category
names that won't be logged to the delegated appender. This
is useful if you have e.g. a category that you use to log
web requests to one file, but want to keep those entries
out of the main log file without having to explicitly list
all the other categories that you _do_ want to include.

Has one option, "exclude", which is a category name or
array of category names. The child appender is set in
"appender", modelled on the logLevelFilter.
2013-02-26 13:27:07 +00:00
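A configuration sketch based on the description above; the 'categoryFilter' type name is an assumption, modelled on the logLevelFilter that the commit mentions:

```javascript
var log4js = require('log4js');

log4js.configure({
  appenders: [
    // web requests get their own file under the "http" category
    { type: 'file', filename: 'access.log', category: 'http' },
    // everything except "http" goes to the main log via the excluding filter
    {
      type: 'categoryFilter',           // assumed type name
      exclude: 'http',                  // a single category name, or an array of names
      appender: { type: 'file', filename: 'main.log' }
    }
  ]
});
```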
Gareth Jones
f272e3fd0a Merge branch 'master' into util.format 2013-02-25 16:43:03 +11:00
Gareth Jones
5e242c9dc9 bumped version 2013-02-25 16:33:48 +11:00
Gareth Jones
50eefcc701 Merge pull request #116 from imkira/master
Pass options from multiprocess appender to inner appender
2013-02-24 21:30:42 -08:00
Mário Freitas
8e53c6213e fix: pass options from multiprocess appender to inner appender 2013-02-21 00:06:59 +09:00
Gareth Jones
a15a628311 Merge pull request #115 from NicolasPelletier/master
Speed up file logging for high rate of logging.
2013-02-14 16:32:03 -08:00
Nicolas Pelletier
b75e3660f4 Speed up file logging for high rate of logging.
During an evaluation of multiple loggers, I saw a slowdown when trying to
quickly log more than 100,000 messages to a file:
```javascript
    counter = 150000;
    while (counter) {
        logger.info('Message[' + counter + ']');
        counter -= 1;
    }
```

My detailed test can be found here:
 - https://gist.github.com/NicolasPelletier/4773843

The test demonstrates that writing 150,000 lines straight to a FileStream
takes about 22 seconds until the file content stabilizes. When calling
logger.debug() 150,000 times, the file only stabilizes to its final content
after 229s (almost 4 minutes!).

After investigation, it turns out that the problem is using an Array() to
accumulate the data. Pushing data into the Array with Array.push() is
quick, but the code flushing the buffer uses Array.shift(), which forces
re-indexing of all 149,999 elements remaining in the Array. This makes the
drain roughly quadratic in the number of buffered messages, so it gets
progressively slower as the buffer grows.

The solution is to use something other than an Array to accumulate the
messages. The fix was made using a package called Dequeue
(https://github.com/lleo/node-dequeue). Replacing the Array with a Dequeue
object brought the logging of 150,000 messages back down to 31s, about
seven times faster than the previous 229s.

There is a caveat: each call is slightly slower due to the need to create
an object to put in the double-ended queue inside the Dequeue object.
According to a quick test, it takes about 4% more time per call to
logger.debug().
2013-02-13 09:35:02 -05:00
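An illustrative sketch of the change described above, not the actual file appender code: the buffer of pending writes drains via shift(), and Dequeue's shift() unlinks the head node in constant time instead of re-indexing an array:

```javascript
var Dequeue = require('dequeue');   // https://github.com/lleo/node-dequeue

// accumulate log lines while the underlying stream is busy
var buffer = new Dequeue();
buffer.push('a log line');
buffer.push('another log line');

// draining N queued messages with Dequeue#shift() is O(N) overall;
// the same loop over an Array would be roughly O(N^2), because every
// Array#shift() re-indexes the remaining elements
while (buffer.length > 0) {
  process.stdout.write(buffer.shift() + '\n');
}
```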
Gareth Jones
22da6226e5 Merge pull request #113 from bitcloud/patternLayout_tokens
add your own tokens to the patternLayout
2013-02-11 13:45:18 -08:00
Gareth Jones
c9a890b37b added some test output files to gitignore 2013-02-12 07:23:18 +11:00
Jan Schmidle
a3bdac8e14 updated require in example to match other examples 2013-02-08 16:22:29 +01:00
Jan Schmidle
af428c5669 added example on pattern tokens usage 2013-02-08 16:18:27 +01:00
Jan Schmidle
5c75ba9468 fixed small issue that could occur with wrong evaluated parameters 2013-02-08 16:17:24 +01:00
Jan Schmidle
bec0d05847 added some documentation to the function header 2013-02-08 16:15:51 +01:00
Jan Schmidle
e4bf405f20 add your own tokens to the patternLayout 2013-02-08 14:54:18 +01:00
Gareth Jones
95568f352b Merge pull request #110 from Dignifiedquire/fix-2
Move examples into their own directory.
2013-01-20 16:15:53 -08:00
Gareth Jones
6da6f3c90e Merge pull request #109 from Dignifiedquire/fix-1
Misc code highlighting fixes in readme.md
2013-01-20 14:16:04 -08:00
Friedel Ziegelmayer
7f57d14e70 Move examples into their own directory. 2013-01-19 22:14:14 +01:00
Friedel Ziegelmayer
f478793da3 Misc code highlighting fixes in readme.md 2013-01-19 22:09:31 +01:00
Gareth Jones
0dbc4921a3 Changed layouts to use util.format instead of my own implementation 2013-01-11 15:35:00 +11:00
Gareth Jones
ec2f8fec3b Merge pull request #105 from ulikoehler/readme-syntax-highlighting
Added syntax highlighting to JS code in README.md
2013-01-06 13:33:52 -08:00
Uli Köhler
0167c84ea5 Added syntax highlighting to JS code in README.md 2013-01-06 01:09:55 +01:00
Gareth Jones
3e1a27e522 New version, with colours in pattern layout 2012-12-03 09:59:36 +11:00
Gareth Jones
8b42e46071 Merge pull request #101 from Dignifiedquire/feature-color-pattern
[feature] Add patternColoured Layout.
2012-12-02 14:51:27 -08:00
Friedel Ziegelmayer
4a7a90ed53 [feature] Add color option to pattern layout.
Based on #90 this implements the possibillity to add the color codes
according to the log level via %[ and %].
2012-12-02 23:41:59 +01:00
Gareth Jones
a9307fd6da fix for issue #100, multiprocess appender and logLevelFilter don't play nicely 2012-11-09 16:02:16 +11:00
Gareth Jones
4739c65c68 Version 0.5.4 2012-10-16 11:54:21 +11:00
Gareth Jones
892181f88f Merge pull request #98 from danbell/master
Check environment variable LOG4JS_CONFIG for configuration file location.
2012-10-15 17:52:08 -07:00
Daniel Bell
bdfa7f9a9b Delete LOG4JS_CONFIG environment variable after test has finished. 2012-10-16 10:55:30 +11:00
Daniel Bell
ad63b801f7 Check environment variable LOG4JS_CONFIG for configuration file location. 2012-10-16 08:36:26 +11:00
Gareth Jones
2bfad6362a Version 0.5.3 2012-09-26 09:49:58 +10:00
Gareth Jones
2b889fe776 Working date rolling file appender. 2012-09-25 08:16:59 +10:00
Gareth Jones
9ac61e37f4 Refactored where the exit handler gets added 2012-09-25 07:43:37 +10:00
Gareth Jones
185f343e68 Working date rolling file stream 2012-09-18 08:46:39 +10:00
Gareth Jones
be1272cd7c moved streams code around, added stub for DateRollingFileStream 2012-09-05 10:58:28 +10:00
Gareth Jones
cbc1dd32f9 fixed up some dodgy tabbing 2012-09-05 08:00:31 +10:00
Gareth Jones
a6fb26efb1 Removed mentions of pollInterval (issue #93) 2012-09-04 13:48:35 +10:00
Gareth Jones
012b0d5ed7 version 0.5.2 2012-08-14 10:47:25 +10:00
Gareth Jones
de72005e7e Fixed layout stack trace test 2012-08-14 09:44:43 +10:00
Gareth Jones
c6a0e58409 Merge pull request #89 from ixti/master
Fix possible memleak with `exit` event handlers
2012-08-13 16:32:08 -07:00
Aleksey V Zapparov
f832a2ba79 Do not assign multiple exit handlers for FA 2012-08-09 15:21:30 +02:00
Aleksey V Zapparov
3f10b68c30 Add test for amount of exit listeners in FA 2012-08-09 15:15:28 +02:00
Gareth Jones
54c311842c Merge pull request #86 from osher/patch-3
Update lib/layouts.js
2012-08-01 16:21:01 -07:00
osher
f948b5f5cd Add unit tests - layouts-test.js 2012-08-01 10:11:37 +03:00
osher
54e420eb58 Update lib/layouts.js
Errors sometimes carry additional attributes as part of the passed error data.
One utility that takes advantage of this, for example, is 'errs', which is used by 'nano', the CouchDB driver.

When only the stack is printed, all the additional information attached to the error object never reaches the log and is lost.

Consider the following code:

```
//the oups throwing utility
//(extend() is assumed to come from a utility library; it merges the given properties onto e)
function oups(){
  var e = new Error();
  extend(e,
    { message    : "Oups error"
    , description: "huston, we got a problem"
    , status     : "MESS"
    , errorCode  : 991
    , arr        : [1,2,3,4,{}]
    , data       :
      { c: {}
      , d: { e: {} }
      }
    });
  throw e;
}

var log = require('log4js').getLogger();

try{
  oups()
} catch( e ) {
   log.error("error on oups", e );
}

```


output before the fix

```
error on oups Error: Oups error
    at repl:1:11
    at REPLServer.eval (repl.js:80:21)
    at Interface.<anonymous> (repl.js:182:12)
    at Interface.emit (events.js:67:17)
    at Interface._onLine (readline.js:162:10)
    at Interface._line (readline.js:426:8)
    at Interface._ttyWrite (readline.js:603:14)
    at ReadStream.<anonymous> (readline.js:82:12)
    at ReadStream.emit (events.js:88:20)
```


output after the fix would be

```
error on oups { [Error: Oups error]
  name: 'Error',
  description: 'huston, we got a problem',
  status: 'MESS',
  errorCode: 991,
  arr: [ 1, 2, 3, 4, {} ],
  data: { c: {}, d: { e: {} } } }
Error: Oups error
    at repl:1:11
    at REPLServer.eval (repl.js:80:21)
    at Interface.<anonymous> (repl.js:182:12)
    at Interface.emit (events.js:67:17)
    at Interface._onLine (readline.js:162:10)
    at Interface._line (readline.js:426:8)
    at Interface._ttyWrite (readline.js:603:14)
    at ReadStream.<anonymous> (readline.js:82:12)
    at ReadStream.emit (events.js:88:20)
```
2012-07-31 14:32:03 +03:00
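A small sketch of the behaviour this commit describes, assuming util.inspect as the mechanism for surfacing the extra attributes (the actual lib/layouts.js change may differ):

```javascript
var util = require('util');

var e = new Error('Oups error');
e.description = 'huston, we got a problem';
e.errorCode = 991;

// printing only the stack drops the extra attributes
console.log(e.stack);

// inspecting the error object keeps them, roughly matching the shape
// of the "after the fix" output quoted above
console.log(util.inspect(e) + '\n' + e.stack);
```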
Gareth Jones
40ba24a55d Renamed tests so that vows will pick them up automatically 2012-07-31 14:52:36 +10:00
Gareth Jones
e3a20a1746 bumped npm version 2012-07-04 09:28:56 +10:00
Gareth Jones
7a02f39921 Fallback to \n if os.EOL is not defined 2012-07-04 09:25:08 +10:00
Gareth Jones
b6ba3bce00 Merge branch 'master' of https://github.com/nomiddlename/log4js-node 2012-07-04 09:11:07 +10:00
Gareth Jones
638ce187bb use os.EOL instead of \n 2012-07-04 08:53:09 +10:00
Gareth Jones
3cbae96a97 Changed multiprocess appender to use a single socket per client 2012-07-04 08:45:20 +10:00
Gareth Jones
a33e48cb07 Changed multiprocess appender to use a single socket per client 2012-07-04 08:44:50 +10:00
Gareth Jones
df491c0b14 Changed multiprocess appender to use a single socket per client 2012-07-04 08:44:16 +10:00
Gareth Jones
6ff1a2499f removed 0.7 added 0.8 2012-07-04 08:33:06 +10:00
Gareth Jones
ce2d7df8df Merge pull request #78 from druciak/smtp
SMTP appender migrated to nodemailer 0.3.x
2012-06-28 18:09:04 -07:00
Gareth Jones
1b12265800 Merge branch 'master' of https://github.com/nomiddlename/log4js-node 2012-06-29 10:53:38 +10:00
Gareth Jones
32e9045334 added explanation of console appender 2012-06-29 09:38:23 +10:00
Gareth Jones
1aed671137 added fromreadme.js example, updated README 2012-06-29 09:37:41 +10:00
Gareth Jones
68b47dd51c expanded example to include loading appender programmatically 2012-06-29 09:19:20 +10:00
Gareth Jones
8f9b4444f6 made sure example works with categories 2012-06-29 09:05:18 +10:00
Gareth Jones
e49f7107fb example now works 2012-06-29 09:01:42 +10:00
druciak
077302c772 SMTP appender migrated to nodemailer 0.3.x 2012-06-27 18:00:32 +02:00
Gareth Jones
6f0dfa0c5f Added note about console.log replacement. 2012-06-04 09:18:58 +10:00
Gareth Jones
82a6bee331 Fixed the wiki links. 2012-06-01 18:15:55 +10:00
Gareth Jones
ad7e844d68 bumped npm version 2012-06-01 18:13:00 +10:00
Gareth Jones
bef2075c60 moved some docs to the wiki 2012-06-01 18:12:30 +10:00
Gareth Jones
a046523804 Moved Logger into separate file, added support for loading appenders outside log4js, removed 'name' from appender requirements 2012-06-01 11:11:07 +10:00
Gareth Jones
0ed1a137d6 moved Logger class out of main module 2012-05-31 08:16:22 +10:00
Gareth Jones
33a92b5dd6 Removed some exports that are no longer needed 2012-05-31 08:07:45 +10:00
Gareth Jones
0901794b35 Moved abspath option checking into file appender, log4js options now passed to appenders 2012-05-31 07:50:01 +10:00
Gareth Jones
05d5265554 updated hook.io version, was breaking travis build 2012-05-29 16:59:26 +10:00
Gareth Jones
9a29d6222e changed minimum node version to 0.6 2012-05-29 16:52:50 +10:00
Gareth Jones
38a89dcf3d manually merged TooTallNate's pull request #62 2012-05-29 16:49:12 +10:00
Gareth Jones
754ac2c5ac changed config loading to be more predictable 2012-05-29 15:50:35 +10:00
Gareth Jones
ccc4976206 updated node versions for travis 2012-05-09 16:52:02 +10:00
Gareth Jones
6e7348f8d8 all tests pass 2012-05-09 16:48:52 +10:00
Gareth Jones
61078e88ef fixed the nolog tests 2012-05-09 16:40:27 +10:00
Gareth Jones
613a077a61 fixed test-configureNoLevels 2012-05-09 16:31:01 +10:00
Gareth Jones
68d1c8fa07 Merge pull request #69 from NetDevLtd/feature/setLevelAsymmetry
setLevel vs isLevelEnabled asymmetry
2012-05-08 16:38:55 -07:00
Gareth Jones
216937637d Merge pull request #70 from NetDevLtd/feature/configureNoLevels
log4js.configure({}) resets all loggers' levels to TRACE
2012-05-08 16:37:23 -07:00
Mike Bardzinski
ff5b8d2939 Added vows test for the log4js.configure inconsistency, when no 'levels' property is passed in the configuration 2012-05-08 19:19:33 +01:00
Mike Bardzinski
6a20efb965 Added vows tests for the setLevel asymmetry fix 2012-05-08 12:23:30 +01:00
Mike Bardzinski
872bc791c7 Fixes the log4js.configure({}) issue which zapped all loggers' levels to TRACE, even if they were previously set to something else 2012-05-02 16:10:20 +01:00
Mike Bardzinski
2c7b56853b Changed toLevel to accept a Log4js.Level (or in fact any object), and try to convert it to a Log4js.Level. Fixes the setLevel asymmetry, where you cannot setLevel(log4js.level.foo) 2012-05-02 15:41:32 +01:00
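A brief sketch of the asymmetry being fixed, assuming the pre-0.7 API where loggers still expose setLevel; log4js.levels.INFO is the Level-object form:

```javascript
var log4js = require('log4js');
var logger = log4js.getLogger('cheese');

logger.setLevel('INFO');              // the string form always worked
logger.setLevel(log4js.levels.INFO);  // the Level-object form is what the toLevel change enables
```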
Gareth Jones
c8157cef5c fixed file appender tests 2012-03-22 09:34:41 +11:00
Gareth Jones
352653dcbe increased the wait for file open, think it is what's breaking travis build 2012-03-20 13:55:38 +11:00
Gareth Jones
cff6928761 bumped npm version 2012-03-20 09:39:56 +11:00
Gareth Jones
1fb8962b83 turned off debug in streams (issue #63) 2012-03-20 09:39:15 +11:00
Gareth Jones
d276bbc2f8 Bumped version number, added travis status to readme 2012-02-22 14:37:45 +11:00
Gareth Jones
e78f4e33ce Fixed issue #51, added tests to cover levels 2012-02-22 14:14:46 +11:00
Gareth Jones
53367785b4 got rid of the __preLog4js stuff from the console.log replacement 2012-02-22 08:53:28 +11:00
Gareth Jones
cff20b99e3 added more gelf tests 2012-02-13 08:54:35 +11:00
Gareth Jones
0a422e5749 fixed up gelf tests 2012-02-10 18:14:50 +11:00
Gareth Jones
37b94cf195 Merge pull request #59 from shripadk/master
Allow passing cwd (__dirname) as an option.
2012-02-09 20:27:29 -08:00
Shripad K
0c04c6807c More fixes + Test for "cwd" option 2012-02-08 10:25:14 +05:30
Shripad K
b4ca201a91 feature: allow passing cwd as an option 2012-02-07 12:41:10 +05:30
Gareth Jones
2ab6f5fa24 Merge pull request #56 from arifamirani/master
Fixed tests for gelf appender
2012-01-15 14:36:55 -08:00
Arif Amirani
9bad070b8a Changed tests to not use live udp server as it fails on CI 2012-01-13 13:00:53 +05:30
Gareth Jones
5aaa9fcd50 Merge pull request #54 from arifamirani/master
Add support for GELF logging using UDP
2012-01-12 15:00:11 -08:00
Arif Amirani
b7e77b11ad Fixed some spacing to make README more legible 2012-01-12 15:02:19 +05:30
Arif Amirani
615b534b56 Added README for gelf appender 2012-01-12 15:00:34 +05:30
Arif Amirani
788de0cac3 Added basic tests for gelf appender 2012-01-12 14:52:55 +05:30
Gareth Jones
4d484ad752 Merge pull request #53 from vincentcr/master
make restoreConsole work
2012-01-11 14:44:16 -08:00
Arif Amirani
449893fd24 Added missing dependency on compress-buffer 2012-01-11 16:13:42 +05:30
Arif Amirani
5bdeaf68d7 Adding gelf as an appender 2012-01-11 16:12:24 +05:30
Vincent Côté-Roy
a5b09b3ead fix restoreConsole by making console appender not depend on _preLog4js_log 2012-01-05 08:47:15 -05:00
Daniel Bell
05c4c59c20 Refactored streams to make it easier to write other rolling based file appenders. 2011-12-22 14:36:30 +11:00
Gareth Jones
b4a5227fc0 Merge pull request #49 from Pita/patch-1
Fixed a BUG that prevents connectlogger from working if loglevel is WARN
2011-12-19 15:10:29 -08:00
Gareth Jones
b152618dbc made the file tests more robust 2011-12-20 09:59:02 +11:00
Gareth Jones
a999d8fc00 Fixed the file appender tests 2011-12-20 08:49:21 +11:00
Gareth Jones
78de73a274 Working version of fully-async log rolling file appender - tests need fixing though 2011-12-19 16:58:21 +11:00
Peter 'Pita' Martischka
4cf1d1cfa4 Fixed a BUG that prevents connectlogger from working if loglevel is WARN 2011-12-07 15:28:35 +01:00
Gareth Jones
e5d0b3348f bumped version 2011-11-24 08:40:12 +11:00
Gareth Jones
f10a6e164e windows throws an EEXIST error when renaming, need to handle it 2011-11-24 08:37:05 +11:00
Gareth Jones
cea3dc97d1 Changes to handle drain events not fired on write in linux & windows - should fix issue #44 2011-11-24 08:20:33 +11:00
Gareth Jones
a3a0c55322 version 0.4.0 2011-11-21 16:17:46 +11:00
Gareth Jones
51d48165fd Added travis-ci.org config 2011-11-21 15:07:35 +11:00
Gareth Jones
7d50c45801 Rewrote file appender, fixing issue #16 and issue #31 2011-11-21 15:03:51 +11:00
Gareth Jones
40c5f5ee70 added methods and config to turn off console.log replacement (issue #34) 2011-11-18 08:44:04 +11:00
Gareth Jones
1d769fdf33 added build and node_modules 2011-11-16 08:40:26 +11:00
Gareth Jones
bc665b875e vows seems to have removed assert.length, replaced with assert.equal 2011-11-16 08:39:07 +11:00
Gareth Jones
154c0dc299 changed web->url in bugs (issue #41) 2011-11-16 08:21:44 +11:00
Gareth Jones
050fae5230 replaced 'sys' with 'util' (issue #42) 2011-11-16 08:10:20 +11:00
Gareth Jones
342286e062 Merge pull request #40 from druciak/smtp
SMTP appender
2011-11-10 14:06:27 -08:00
druciak
537f1058b9 Add SMTP appender 2011-11-08 08:56:21 +01:00
Gareth Jones
283a403a11 Merge pull request #37 from dbrain/master
Multiprocess (tcp) appender
2011-11-02 16:03:53 -07:00
Danny Brain
ae8aaa5376 Add a short description on using multiprocess logger 2011-11-03 09:16:38 +11:00
Danny Brain
a95117c0d3 Add tests for multiprocess file appender 2011-11-03 09:10:02 +11:00
Danny Brain
097390bc89 Add multiprocess appender, pending tests 2011-11-02 15:49:46 +11:00
Gareth Jones
0a0119300b Merge pull request #32 from dbrain/master
hook.io appender should accept all configuration
2011-10-30 14:47:29 -07:00
Gareth Jones
fde66f92f5 Merge branch 'master' of https://github.com/csausdev/log4js-node 2011-10-31 08:42:58 +11:00
muddydixon
516659f733 add test code for no log 2011-10-29 11:57:28 +09:00
muddydixon
5aabebbdb7 change check target from req.url to req.originalUrl 2011-10-29 11:55:46 +09:00
Danny Brain
8b376eb46e Buffer the logging until the hook is ready, will prevent lost logs 2011-10-28 10:50:28 +11:00
Danny Brain
ced570413c Pass in all appender parameters to the Hook constructor so a port can be specified 2011-10-28 10:07:48 +11:00
Gareth Jones
b2827076da Merge pull request #30 from dbrain/master
hook.io appender
2011-10-27 15:22:35 -07:00
Danny Brain
07e920cc1b Quick check to make sure the actualAppender gets the right configuration 2011-10-27 16:43:55 +11:00
Danny Brain
89f3659825 Fix the logLevelFilter with lazy loading 2011-10-27 16:37:11 +11:00
Danny Brain
23a2758a6d Lazy load any new style appenders 2011-10-27 16:25:38 +11:00
Danny Brain
25aa075fad Basic (ugly) test 2011-10-27 16:03:06 +11:00
Danny Brain
d099a9fc3f Update readme to describe hook.io usage 2011-10-27 13:16:42 +11:00
Danny Brain
7bc460e8e0 Update readme to describe hook.io usage 2011-10-27 13:14:29 +11:00
Danny Brain
681decf51f Update readme to describe hook.io usage 2011-10-27 13:14:10 +11:00
Danny Brain
b93691b82a Update readme to describe hook.io usage 2011-10-27 13:13:22 +11:00
Danny Brain
f82ecf8f2a Update readme to describe hook.io usage 2011-10-27 13:12:36 +11:00
Danny Brain
3b77a42706 Added a hookio appender, this allows you to run a 'master' log4js instance and 'worker' so only one process writes to file 2011-10-27 12:38:13 +11:00
muddydixon
b5bc9c8322 mod if nolog 2011-10-25 14:28:46 +09:00
muddydixon
c7d3ac4fe1 add nolog operation 2011-10-25 14:09:41 +09:00
Daniel Bell
0aca64623e Merged changes from danbell/master. 2011-10-05 15:03:08 +11:00
Daniel Bell
ff68e46858 Merged changes 2011-10-05 12:27:33 +11:00
Daniel Bell
f9768eb56e Issue #21: fixed reloading of config when config has not changed. 2011-10-05 12:22:31 +11:00
Gareth Jones
75e5584060 Merge pull request #24 from cliffano/master
Add sandboxed-module to devDependencies
2011-09-14 18:15:28 -07:00
Cliffano Subagio
b78fd77015 Add sandboxed-module to dev dependencies. 2011-09-15 11:03:54 +10:00
Gareth Jones
2a06048114 added ignore files 2011-09-15 08:28:12 +10:00
Gareth Jones
9a34d9edfd fixed missing space between log data elements 2011-09-15 08:18:24 +10:00
Gareth Jones
12e71bda4e fixed to work with node 0.5.x 2011-09-15 08:13:04 +10:00
Gareth Jones
53a481d4da Added filtering to appender loader - was choking on .svn files 2011-08-11 16:27:37 +10:00
Gareth Jones
8d7b5513fb bumped version number 2011-07-27 21:22:13 +10:00
Gareth Jones
d13b2fb3b4 turned off config file reloading by default 2011-07-27 21:21:43 +10:00
Gareth Jones
4f7d73bc97 bumped version number 2011-07-27 10:37:30 +10:00
Gareth Jones
163db0e5fd fixed the behaviour of maxlogsize + 0 backups 2011-07-26 18:40:41 +10:00
Gareth Jones
71f9eef6fe Merge pull request #20 from danbell/master
Added ability to reload configuration file periodically.
2011-07-25 18:16:36 -07:00
Daniel Bell
623bc1859f Merged Gareth's latest changes in 2011-07-26 11:11:27 +10:00
Gareth Jones
b72182c0cf bumped version number 2011-07-26 09:10:02 +10:00
Gareth Jones
ef9fe3a4b1 All tests pass, moved appenders into separate files, so that extra ones can be added easily 2011-07-26 08:52:40 +10:00
Daniel Bell
3b241095cb Fixed indentation on markdown file. 2011-07-25 13:16:56 +10:00
Gareth Jones
545681287f working fileappender, with tests, broken everything else 2011-07-24 21:58:02 +10:00
Gareth Jones
80474c6881 got log rolling working, need to fix all the tests 2011-07-22 18:25:55 +10:00
Gareth Jones
7aa076c278 removed the annoying extra new line 2011-07-22 18:25:26 +10:00
Daniel Bell
e6b69ff7f2 Added more documentation on new functionality. 2011-07-22 15:59:17 +10:00
Daniel Bell
69e64932b1 Added functionality to reload configuration file periodically. 2011-07-22 14:43:33 +10:00
Gareth Jones
4b32456db7 fixed a bug where if the first log arg was not a string it wouldn't get logged 2011-07-22 12:28:02 +10:00
Gareth Jones
ec21ec63f0 bumped version number 2011-07-21 20:44:04 +10:00
Gareth Jones
a9a698cf09 fixed log rolling problem 2011-07-21 20:42:14 +10:00
Gareth Jones
925c280c68 check for existence of destroySoon (does not exist in node v0.2.x) 2011-07-21 19:09:22 +10:00
Gareth Jones
d0b4563ba0 fixed small bug checking for stack on undefined object 2011-07-20 19:39:54 +10:00
Gareth Jones
aac8ca0eb0 updated npm version number 2011-07-19 09:44:47 +10:00
Gareth Jones
0968c6709f fixed connect-logger 2011-07-19 09:08:15 +10:00
50 changed files with 3600 additions and 1881 deletions

12
.bob.json Normal file

@@ -0,0 +1,12 @@
{
"build": "clean lint test coverage",
"lint": {
"type": "jshint"
},
"coverage": {
"type": "mocha-istanbul"
},
"test": {
"type": "mocha"
}
}

6
.gitignore vendored Normal file

@@ -0,0 +1,6 @@
*.log
*.log??
build
node_modules
.bob/
test/streams/test-rolling-file-stream*

18
.jshintrc Normal file

@@ -0,0 +1,18 @@
{
"node": true,
"laxcomma": true,
"indent": 2,
"globalstrict": true,
"maxparams": 5,
"maxdepth": 3,
"maxstatements": 20,
"maxcomplexity": 5,
"maxlen": 100,
"globals": {
"describe": true,
"it": true,
"before": true,
"beforeEach": true,
"after": true
}
}

2
.npmignore Normal file

@@ -0,0 +1,2 @@
*.log
*.log??

5
.travis.yml Normal file

@@ -0,0 +1,5 @@
language: node_js
node_js:
- "0.10"
- "0.8"

38
0.7-changes Normal file

@@ -0,0 +1,38 @@
changes
=======
LogEvent.categoryName -> LogEvent.category
Logger is immutable (no setLevel any more)
Log levels defined in configure call, nowhere else
References to Loggers not retained
Clustered appender, multiprocess appender removed - core handles clusters now
Default category needs to be defined, with appender
connect logger, gelf, smtp, hookio appenders removed from core.
reload configuration removed from core - use 'watchr' or something instead
appenders now only need to provide configure function
log4js.configure now only takes single argument (no options)
tests use mocha not vows
replaced my debug lib with tjholowaychuk's debug (more of a standard)
options.cwd removed - filenames should always be specified in full, not relative
loglevelfilter changed to accept a list of log levels it allows
appenders that wrap other appenders must reference them by name
extracted streams to streamroller
extracted date_format.js to date-format
console.log replacement has been removed.
to-do
=====
documentation pages (gh-pages)
* configuration
* file appenders
* layouts
* optional components
* writing your own appender (use couchdb as example)
readme
* getting started
* typical config - file with max size, file with date rolling
* optional components
fix and publish the optional components
* connect
* smtp
* gelf
* hookio ?

159
README.md

@@ -1,93 +1,130 @@
# log4js-node
# log4js-node [![Build Status](https://secure.travis-ci.org/nomiddlename/log4js-node.png?branch=master)](http://travis-ci.org/nomiddlename/log4js-node)
This is a conversion of the [log4js](http://log4js.berlios.de/index.html)
framework to work with [node](http://nodejs.org). I've mainly stripped out the browser-specific code
and tidied up some of the javascript. It includes a basic file logger, with log rolling based on file size, and also replaces node's console.log functions.
NOTE: in v0.2.x require('log4js') returned a function, and you needed to call that function in your code before you could use it. This was to make testing easier. v0.3.x make use of [felixge's sandbox-module](https://github.com/felixge/node-sandboxed-module), so we don't need to return a function.
This was a conversion of the [log4js](http://log4js.berlios.de/index.html)
framework to work with [node](http://nodejs.org). It's changed a lot since then, but there are still plenty of the original parts involved.
Out of the box it supports the following features:
* coloured console logging
* file appender, with log rolling based on file size or date
* multi-process logging (works fine with node's clusters)
* configurable log message layout/patterns
* different log levels for different log categories (make some parts of your app log as DEBUG, others only ERRORS, etc.)
NOTE: There have been a lot of changes in version 0.7.x; if you're upgrading from an older version, you should read [0.7-changes](http://github.com/nomiddlename/log4js-node/0.7-changes)
## installation
npm install log4js
npm install log4js
## tests
Tests now use [vows](http://vowsjs.org), run with `vows test/*.js`.
## usage
Minimalist version:
var log4js = require('log4js');
var logger = log4js.getLogger();
logger.debug("Some debug messages");
var log4js = require('log4js');
var logger = log4js.getLogger();
logger.debug("Some debug messages");
By default, log4js outputs to stdout with the coloured layout (thanks to [masylum](http://github.com/masylum)), so for the above you would see:
[2010-01-17 11:43:37.987] [DEBUG] [default] - Some debug messages
[2010-01-17 11:43:37.987] [DEBUG] default - Some debug messages
See example.js:
See the examples directory for lots of sample setup and usage code.
var log4js = require('log4js'); //note the need to call the function
log4js.addAppender(log4js.consoleAppender());
log4js.addAppender(log4js.fileAppender('logs/cheese.log'), 'cheese');
var logger = log4js.getLogger('cheese');
logger.setLevel('ERROR');
logger.trace('Entering cheese testing');
logger.debug('Got cheese.');
logger.info('Cheese is Gouda.');
logger.warn('Cheese is quite smelly.');
logger.error('Cheese is too ripe!');
logger.fatal('Cheese was breeding ground for listeria.');
## API
Log4js exposes two public functions: `configure` and `getLogger`. If
you're writing your own appender, your code will get access to some
internal APIs, see
[writing-appenders](http://github.com/nomiddlename/log4js-node/writing-appenders.md).
### log4js.configure(config)
Configure takes a single argument. If that argument is a string, it is
considered the path to a JSON file containing the configuration
object. If the argument is an object, it must have the following
fields:
* `appenders` (Object) - this should be a map of named appenders to
their configuration. At least one appender must be defined.
* `categories` (Object) - this should be a map of logger categories to
their levels and configuration. The "default" logger category must
be defined, as this is used to route all log events that do not have
an explicit category defined in the config. Category objects have
two fields:
* `level` - (String) the log level for that category: "trace",
"debug", "info", "warn", "error", "fatal", "off"
* `appenders` - (Array) the list of appender names to which log
events for this category should be sent
The default configuration for log4js, the one used if `configure` is
not called, looks like this:
{
"appenders": {
"console": { "type": "console" }
},
"categories": {
"default": { level: "TRACE", appenders: [ "console" ] }
}
}
Use of the default configuration can be overridden by setting the
`LOG4JS_CONFIG` environment variable to the location of a JSON
configuration file. log4js will use this file in preference to the
defaults, if `configure` is not called. An example file can be found
in `test/log4js.json`. An example config file with log rolling is in
`test/with-log-rolling.json`.
### log4js.getLogger([category])
* `category` (String), optional. Category to use for log events
generated by the Logger.
Output:
Returns a Logger instance. Unlike in previous versions, log4js
does not hold a reference to Loggers so feel free to use as many as
you like.
[2010-01-17 11:43:37.987] [ERROR] cheese - Cheese is too ripe!
[2010-01-17 11:43:37.990] [FATAL] cheese - Cheese was breeding ground for listeria.
### Logger
## configuration
Loggers provide the following functions:
You can either configure the appenders and log levels manually (as above), or provide a
configuration file (`log4js.configure('path/to/file.json')`) explicitly, or just let log4js look for a file called `log4js.json` (it looks in the current directory first, then the require paths, and finally looks for the default config included in the same directory as the `log4js.js` file).
An example file can be found in `test/log4js.json`. An example config file with log rolling is in `test/with-log-rolling.json`
You can also pass an object to the configure function, which has the same properties as the json versions.
* `trace`
* `debug`
* `info`
* `warn`
* `error`
* `fatal`
## connect/express logger
All can take a variable list of arguments which are used to construct
a log event. They work the same way as console.log, so you can pass a
format string with placeholders. e.g.
A connect/express logger has been added to log4js, by [danbell](https://github.com/danbell). This allows connect/express servers to log using log4js. See example-connect-logger.js.
var log4js = require('./lib/log4js');
log4js.addAppender(log4js.consoleAppender());
log4js.addAppender(log4js.fileAppender('cheese.log'), 'cheese');
var logger = log4js.getLogger('cheese');
logger.debug("number of widgets is %d", widgets);
logger.setLevel('INFO');
var app = require('express').createServer();
app.configure(function() {
app.use(log4js.connectLogger(logger, { level: log4js.levels.INFO }));
});
app.get('/', function(req,res) {
res.send('hello world');
});
app.listen(5000);
The options object that is passed to log4js.connectLogger supports a format string the same as the connect/express logger. For example:
## Appenders
app.configure(function() {
app.use(log4js.connectLogger(logger, { level: log4js.levels.INFO, format: ':method :url' }));
});
Log4js comes with file appenders included, which can be configured to
roll over based on a time or a file size. Other appenders are
available as separate modules:
## author (of this node version)
* [log4js-gelf](http://github.com/nomiddlename/log4js-gelf)
* [log4js-smtp](http://github.com/nomiddlename/log4js-smtp)
* [log4js-hookio](http://github.com/nomiddlename/log4js-hookio)
Gareth Jones (csausdev - gareth.jones@sensis.com.au)
There's also
[log4js-connect](http://github.com/nomiddlename/log4s-connect), for
logging http access in connect-based servers, like express.
## Documentation
See the [wiki](https://github.com/nomiddlename/log4js-node/wiki). Improve the [wiki](https://github.com/nomiddlename/log4js-node/wiki), please.
## Contributing
Contributions welcome, but take a look at the [rules](https://github.com/nomiddlename/log4js-node/wiki/Contributing) first.
## License
The original log4js was distributed under the Apache 2.0 License, and so is this. I've tried to
keep the original copyright and author credits in place, except in sections that I have rewritten
extensively.
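A hedged example of the 0.7-style configuration described in the API section above (a named map of appenders plus a categories map with a mandatory "default"); the appender names and the file appender's filename field are illustrative assumptions:

```javascript
var log4js = require('log4js');

log4js.configure({
  appenders: {
    console: { type: 'console' },
    app:     { type: 'file', filename: 'application.log' }   // filename field assumed
  },
  categories: {
    // "default" is required; it routes events whose category has no explicit config
    default: { level: 'trace', appenders: [ 'console' ] },
    cheese:  { level: 'error', appenders: [ 'console', 'app' ] }
  }
});

var logger = log4js.getLogger('cheese');
logger.error('Cheese is too ripe!');   // goes to console and application.log
logger.debug('Got cheese.');           // below ERROR for this category, so dropped
```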

View File

@@ -1,14 +0,0 @@
var log4js = require('./lib/log4js');
log4js.addAppender(log4js.fileAppender('cheese.log'), 'cheese');
var logger = log4js.getLogger('cheese');
logger.setLevel('INFO');
var app = require('express').createServer();
app.configure(function() {
app.use(log4js.connectLogger(logger, { level: log4js.levels.INFO }));
});
app.get('/', function(req,res) {
res.send('hello world');
});
app.listen(5000);

View File

@@ -1,22 +0,0 @@
var log4js = require('./lib/log4js');
//log the cheese logger messages to a file, and the console ones as well.
log4js.addAppender(log4js.fileAppender('cheese.log'), 'cheese', 'console');
var logger = log4js.getLogger('cheese');
//only errors and above get logged.
logger.setLevel('ERROR');
//console logging methds have been replaced with log4js ones.
console.error("AAArgh! Something went wrong", { some: "otherObject", useful_for: "debug purposes" });
//these will not appear (logging level beneath error)
logger.trace('Entering cheese testing');
logger.debug('Got cheese.');
logger.info('Cheese is Gouda.');
logger.warn('Cheese is quite smelly.');
//these end up on the console and in cheese.log
logger.error('Cheese %s is too ripe!', "gouda");
logger.fatal('Cheese was breeding ground for listeria.');

View File

@@ -0,0 +1,46 @@
//The connect/express logger was added to log4js by danbell. This allows connect/express servers to log using log4js.
//https://github.com/nomiddlename/log4js-node/wiki/Connect-Logger
// load modules
var log4js = require('log4js');
var express = require("express");
var app = express();
//config
log4js.configure({
appenders: [
{ type: 'console' },
{ type: 'file', filename: 'logs/log4jsconnect.log', category: 'log4jslog' }
]
});
//define logger
var logger = log4js.getLogger('log4jslog');
// set the level at which messages get logged, e.g. only ERROR and above
// logger.setLevel('ERROR');
//express app
app.configure(function() {
app.use(express.favicon(''));
// app.use(log4js.connectLogger(logger, { level: log4js.levels.INFO }));
// app.use(log4js.connectLogger(logger, { level: 'auto', format: ':method :url :status' }));
//### AUTO LEVEL DETECTION
//http responses 3xx, level = WARN
//http responses 4xx & 5xx, level = ERROR
//else.level = INFO
app.use(log4js.connectLogger(logger, { level: 'auto' }));
});
//route
app.get('/', function(req,res) {
res.send('hello world');
});
//start app
app.listen(5000);
console.log('server running at localhost:5000');
console.log('Simulation of normal response: goto localhost:5000');
console.log('Simulation of error response: goto localhost:5000/xxx');

View File

@@ -0,0 +1,45 @@
var log4js = require('./lib/log4js')
, cluster = require('cluster')
, numCPUs = require('os').cpus().length
, i = 0;
if (cluster.isMaster) {
log4js.configure({
appenders: [
{
type: "multiprocess",
mode: "master",
appender: {
type: "console"
}
}
]
});
console.info("Master creating %d workers", numCPUs);
for (i=0; i < numCPUs; i++) {
cluster.fork();
}
cluster.on('death', function(worker) {
console.info("Worker %d died.", worker.pid);
});
} else {
log4js.configure({
appenders: [
{
type: "multiprocess",
mode: "worker"
}
]
});
var logger = log4js.getLogger('example-socket');
console.info("Worker %d started.", process.pid);
for (i=0; i < 1000; i++) {
logger.info("Worker %d - logging something %d", process.pid, i);
}
}

58
examples/example.js Normal file

@@ -0,0 +1,58 @@
var log4js = require('../lib/log4js');
//log the cheese logger messages to a file, and the console ones as well.
log4js.configure({
appenders: [
{
type: "file",
filename: "cheese.log",
category: [ 'cheese','console' ]
},
{
type: "console"
}
],
replaceConsole: true
});
//to add an appender programmatically, and without clearing other appenders
//loadAppender is only necessary if you haven't already configured an appender of this type
log4js.loadAppender('file');
log4js.addAppender(log4js.appenders.file('pants.log'), 'pants');
//a custom logger outside of the log4js/lib/appenders directory can be accessed like so
//log4js.loadAppender('what/you/would/put/in/require');
//log4js.addAppender(log4js.appenders['what/you/would/put/in/require'](args));
//or through configure as:
//log4js.configure({
// appenders: [ { type: 'what/you/would/put/in/require', otherArgs: 'blah' } ]
//});
var logger = log4js.getLogger('cheese');
//only errors and above get logged.
//you can also set this log level in the config object
//via the levels field.
logger.setLevel('ERROR');
//console logging methods have been replaced with log4js ones.
//so this will get coloured output on console, and appear in cheese.log
console.error("AAArgh! Something went wrong", { some: "otherObject", useful_for: "debug purposes" });
//these will not appear (logging level beneath error)
logger.trace('Entering cheese testing');
logger.debug('Got cheese.');
logger.info('Cheese is Gouda.');
logger.warn('Cheese is quite smelly.');
//these end up on the console and in cheese.log
logger.error('Cheese %s is too ripe!', "gouda");
logger.fatal('Cheese was breeding ground for listeria.');
//these don't end up in cheese.log, but will appear on the console
var anotherLogger = log4js.getLogger('another');
anotherLogger.debug("Just checking");
//one for pants.log
//will also go to console, since that's configured for all categories
var pantsLog = log4js.getLogger('pants');
pantsLog.debug("Something for pants");

19
examples/fromreadme.js Normal file

@@ -0,0 +1,19 @@
//remember to change the require to just 'log4js' if you've npm install'ed it
var log4js = require('./lib/log4js');
//by default the console appender is loaded
//log4js.loadAppender('console');
//you'd only need to add the console appender if you
//had previously called log4js.clearAppenders();
//log4js.addAppender(log4js.appenders.console());
log4js.loadAppender('file');
log4js.addAppender(log4js.appenders.file('cheese.log'), 'cheese');
var logger = log4js.getLogger('cheese');
logger.setLevel('ERROR');
logger.trace('Entering cheese testing');
logger.debug('Got cheese.');
logger.info('Cheese is Gouda.');
logger.warn('Cheese is quite smelly.');
logger.error('Cheese is too ripe!');
logger.fatal('Cheese was breeding ground for listeria.');

27
examples/log-rolling.js Normal file

@@ -0,0 +1,27 @@
var log4js = require('../lib/log4js')
, log
, i = 0;
log4js.configure({
"appenders": [
{
type: "console"
, category: "console"
},
{
"type": "file",
"filename": "tmp-test.log",
"maxLogSize": 1024,
"backups": 3,
"category": "test"
}
]
});
log = log4js.getLogger("test");
function doTheLogging(x) {
log.info("Logging something %d", x);
}
for ( ; i < 5000; i++) {
doTheLogging(i);
}

View File

@@ -0,0 +1,21 @@
var log4js = require('./lib/log4js');
var config = {
"appenders": [
{
"type": "console",
"layout": {
"type": "pattern",
"pattern": "%[%r (%x{pid}) %p %c -%] %m%n",
"tokens": {
"pid" : function() { return process.pid; }
}
}
}
]
};
log4js.configure(config, {});
var logger = log4js.getLogger("app");
logger.info("Test log message");

43
examples/smtp-appender.js Normal file

@@ -0,0 +1,43 @@
//Note that smtp appender needs nodemailer to work.
//If you haven't got nodemailer installed, you'll get cryptic
//"cannot find module" errors when using the smtp appender
var log4js = require('../lib/log4js')
, log
, logmailer
, i = 0;
log4js.configure({
"appenders": [
{
type: "console",
category: "test"
},
{
"type": "smtp",
"recipients": "logfilerecipient@logging.com",
"sendInterval": 5,
"transport": "SMTP",
"SMTP": {
"host": "smtp.gmail.com",
"secureConnection": true,
"port": 465,
"auth": {
"user": "someone@gmail",
"pass": "********************"
},
"debug": true
},
"category": "mailer"
}
]
});
log = log4js.getLogger("test");
logmailer = log4js.getLogger("mailer");
function doTheLogging(x) {
log.info("Logging something %d", x);
logmailer.info("Logging something %d", x);
}
for ( ; i < 500; i++) {
doTheLogging(i);
}

lib/appenders/console.js Normal file

@@ -0,0 +1,21 @@
"use strict";
var consoleLog = console.log.bind(console);
module.exports = function(layouts, levels) {
function consoleAppender (layout) {
layout = layout || layouts.colouredLayout;
return function(loggingEvent) {
consoleLog(layout(loggingEvent));
};
}
return function configure(config) {
var layout;
if (config.layout) {
layout = layouts.layout(config.layout.type, config.layout);
}
return consoleAppender(layout);
};
};
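As a usage note (not part of the diff): with the appenders/categories configuration shape used by the tests later in this changeset, the console appender above could be wired up roughly as follows; the appender and category names are illustrative.
var log4js = require('log4js'); // or './lib/log4js' when running from the repo
log4js.configure({
  appenders: {
    out: { type: "console", layout: { type: "coloured" } }
  },
  categories: {
    default: { level: "debug", appenders: [ "out" ] }
  }
});
log4js.getLogger("sketch").info("coloured output on stdout");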

lib/appenders/dateFile.js Normal file

@@ -0,0 +1,53 @@
"use strict";
var streams = require('streamroller')
, path = require('path')
, os = require('os')
, eol = os.EOL || '\n'
, openFiles = [];
//close open files on process exit.
process.on('exit', function() {
openFiles.forEach(function (file) {
file.end();
});
});
module.exports = function(layouts, levels) {
/**
* File appender that rolls files according to a date pattern.
* @filename base filename.
* @pattern the format that will be added to the end of filename when rolling,
* also used to check when to roll files - defaults to '.yyyy-MM-dd'
* @layout layout function for log messages - defaults to basicLayout
*/
function appender(filename, pattern, alwaysIncludePattern, layout) {
layout = layout || layouts.basicLayout;
var logFile = new streams.DateRollingFileStream(
filename,
pattern,
{ alwaysIncludePattern: alwaysIncludePattern }
);
openFiles.push(logFile);
return function(logEvent) {
logFile.write(layout(logEvent) + eol, "utf8");
};
}
return function configure(config) {
var layout;
if (config.layout) {
layout = layouts.layout(config.layout.type, config.layout);
}
if (!config.alwaysIncludePattern) {
config.alwaysIncludePattern = false;
}
return appender(config.filename, config.pattern, config.alwaysIncludePattern, layout);
};
};
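A hedged configuration sketch for the date-rolling appender above, following the shape used in the dateFile tests further down; the filename and pattern are illustrative.
var log4js = require('log4js');
log4js.configure({
  appenders: {
    daily: {
      type: "dateFile",
      filename: "app.log",
      pattern: "-yyyy-MM-dd",        // appended to the filename when the file rolls
      alwaysIncludePattern: false    // the pattern only appears on rolled files
    }
  },
  categories: {
    default: { level: "info", appenders: [ "daily" ] }
  }
});
log4js.getLogger().info("written to app.log, rolled daily");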

lib/appenders/file.js Normal file

@@ -0,0 +1,78 @@
"use strict";
var path = require('path')
, fs = require('fs')
, streams = require('streamroller')
, os = require('os')
, eol = os.EOL || '\n'
, openFiles = [];
//close open files on process exit.
process.on('exit', function() {
openFiles.forEach(function (file) {
file.end();
});
});
module.exports = function(layouts, levels) {
/**
* File Appender writing the logs to a text file. Supports rolling of logs by size.
*
* @param file file log messages will be written to
* @param layout a function that takes a logevent and returns a string
* (defaults to basicLayout).
* @param logSize - the maximum size (in bytes) for a log file,
* if not provided then logs won't be rotated.
* @param numBackups - the number of log files to keep after logSize
* has been reached (default 5)
*/
function fileAppender (file, layout, logSize, numBackups) {
var bytesWritten = 0;
file = path.normalize(file);
layout = layout || layouts.basicLayout;
numBackups = numBackups === undefined ? 5 : numBackups;
//there has to be at least one backup if logSize has been specified
numBackups = numBackups === 0 ? 1 : numBackups;
function openTheStream(file, fileSize, numFiles) {
var stream;
if (fileSize) {
stream = new streams.RollingFileStream(
file,
fileSize,
numFiles
);
} else {
stream = fs.createWriteStream(
file,
{ encoding: "utf8",
mode: parseInt('0644', 8),
flags: 'a' }
);
}
stream.on("error", function (err) {
console.error("log4js.fileAppender - Writing to file %s, error happened ", file, err);
});
return stream;
}
var logFile = openTheStream(file, logSize, numBackups);
// push file to the stack of open handlers
openFiles.push(logFile);
return function(loggingEvent) {
logFile.write(layout(loggingEvent) + eol, "utf8");
};
}
return function configure(config) {
var layout;
if (config.layout) {
layout = layouts.layout(config.layout.type, config.layout);
}
return fileAppender(config.filename, layout, config.maxLogSize, config.backups);
};
};
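Again purely as a sketch: configuring the size-rolling file appender above with a maximum size and backups, mirroring the fileAppender tests at the end of this diff; the names and sizes are made up.
var log4js = require('log4js');
log4js.configure({
  appenders: {
    app: { type: "file", filename: "app.log", maxLogSize: 1048576, backups: 3 }
  },
  categories: {
    default: { level: "debug", appenders: [ "app" ] }
  }
});
log4js.getLogger("app").warn("rolls to app.log.1 .. app.log.3 at roughly 1MB");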


@@ -0,0 +1,40 @@
"use strict";
var debug = require('debug')('log4js:logLevelFilter');
module.exports = function(layouts, levels) {
function logLevelFilter(allowedLevels, appender) {
return function(logEvent) {
debug("Checking ", logEvent.level, " against ", allowedLevels);
if (allowedLevels.some(function(item) { return item.level === logEvent.level.level; })) {
debug("Sending ", logEvent, " to appender ", appender);
appender(logEvent);
}
};
}
return function configure(config, appenderByName) {
if (!Array.isArray(config.allow)) {
throw new Error("No allowed log levels specified.");
}
var allowedLevels = config.allow.map(function(allowed) {
var level = levels.toLevel(allowed);
if (!level) {
throw new Error("Unrecognised log level '" + allowed + "'.");
}
return level;
});
if (allowedLevels.length === 0) {
throw new Error("No allowed log levels specified.");
}
if (!config.appender) {
throw new Error("Missing an appender.");
}
return logLevelFilter(allowedLevels, appenderByName(config.appender));
};
};
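Judging from the configure function above, a filter entry lists the levels it allows and names the appender it forwards to. A hedged sketch follows; appender names are illustrative, and the target appender is defined before the filter that references it.
var log4js = require('log4js');
log4js.configure({
  appenders: {
    everything: { type: "file", filename: "all.log" },
    problems: { type: "file", filename: "errors.log" },
    // only ERROR and FATAL events reach "problems"
    errorsOnly: { type: "logLevelFilter", allow: [ "ERROR", "FATAL" ], appender: "problems" }
  },
  categories: {
    default: { level: "debug", appenders: [ "everything", "errorsOnly" ] }
  }
});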


@@ -1,114 +0,0 @@
var levels = require("./levels");
/**
* Log requests with the given `options` or a `format` string.
*
* Options:
*
* - `format` Format string, see below for tokens
* - `level` A log4js levels instance.
*
* Tokens:
*
* - `:req[header]` ex: `:req[Accept]`
* - `:res[header]` ex: `:res[Content-Length]`
* - `:http-version`
* - `:response-time`
* - `:remote-addr`
* - `:date`
* - `:method`
* - `:url`
* - `:referrer`
* - `:user-agent`
* - `:status`
*
* @param {String|Function|Object} format or options
* @return {Function}
* @api public
*/
function getLogger(logger4js, options) {
if ('object' == typeof options) {
options = options || {};
} else if (options) {
options = { format: options };
} else {
options = {};
}
var thislogger = logger4js
, level = levels.toLevel(options.level, levels.INFO)
, fmt = options.format || ':remote-addr - - ":method :url HTTP/:http-version" :status :content-length ":req[referer]" ":user-agent"';
return function (req, res, next) {
// mount safety
if (req._logging) return next();
if (thislogger.isLevelEnabled(level)) {
var start = +new Date
, statusCode
, writeHead = res.writeHead
, end = res.end
, url = req.originalUrl;
// flag as logging
req._logging = true;
// proxy for statusCode.
res.writeHead = function(code, headers){
res.writeHead = writeHead;
res.writeHead(code, headers);
res.__statusCode = statusCode = code;
res.__headers = headers || {};
};
// proxy end to output a line to the provided logger.
res.end = function(chunk, encoding) {
res.end = end;
res.end(chunk, encoding);
res.responseTime = +new Date - start;
if ('function' == typeof fmt) {
var line = fmt(req, res, function(str){ return format(str, req, res); });
if (line) thislogger.log(level, line);
} else {
thislogger.log(level, format(fmt, req, res));
}
};
next();
}
};
}
/**
* Return formatted log line.
*
* @param {String} str
* @param {IncomingMessage} req
* @param {ServerResponse} res
* @return {String}
* @api private
*/
function format(str, req, res) {
return str
.replace(':url', req.originalUrl)
.replace(':method', req.method)
.replace(':status', res.__statusCode || res.statusCode)
.replace(':response-time', res.responseTime)
.replace(':date', new Date().toUTCString())
.replace(':referrer', req.headers['referer'] || req.headers['referrer'] || '')
.replace(':http-version', req.httpVersionMajor + '.' + req.httpVersionMinor)
.replace(':remote-addr', req.socket && (req.socket.remoteAddress || (req.socket.socket && req.socket.socket.remoteAddress)))
.replace(':user-agent', req.headers['user-agent'] || '')
.replace(':content-length', (res._headers && res._headers['content-length']) || (res.__headers && res.__headers['Content-Length']) || '-')
.replace(/:req\[([^\]]+)\]/g, function(_, field){ return req.headers[field.toLowerCase()]; })
.replace(/:res\[([^\]]+)\]/g, function(_, field){
return res._headers
? (res._headers[field.toLowerCase()] || res.__headers[field])
: (res.__headers && res.__headers[field]);
});
}
exports.connectLogger = getLogger;


@@ -1,60 +0,0 @@
exports.ISO8601_FORMAT = "yyyy-MM-dd hh:mm:ss.SSS";
exports.ISO8601_WITH_TZ_OFFSET_FORMAT = "yyyy-MM-ddThh:mm:ssO";
exports.DATETIME_FORMAT = "dd MM yyyy hh:mm:ss.SSS";
exports.ABSOLUTETIME_FORMAT = "hh:mm:ss.SSS";
exports.asString = function(/*format,*/ date) {
var format = exports.ISO8601_FORMAT;
if (typeof(date) === "string") {
format = arguments[0];
date = arguments[1];
}
var vDay = addZero(date.getDate());
var vMonth = addZero(date.getMonth()+1);
var vYearLong = addZero(date.getFullYear());
var vYearShort = addZero(date.getFullYear().toString().substring(3,4));
var vYear = (format.indexOf("yyyy") > -1 ? vYearLong : vYearShort);
var vHour = addZero(date.getHours());
var vMinute = addZero(date.getMinutes());
var vSecond = addZero(date.getSeconds());
var vMillisecond = padWithZeros(date.getMilliseconds(), 3);
var vTimeZone = offset(date);
var formatted = format
.replace(/dd/g, vDay)
.replace(/MM/g, vMonth)
.replace(/y{1,4}/g, vYear)
.replace(/hh/g, vHour)
.replace(/mm/g, vMinute)
.replace(/ss/g, vSecond)
.replace(/SSS/g, vMillisecond)
.replace(/O/g, vTimeZone);
return formatted;
function padWithZeros(vNumber, width) {
var numAsString = vNumber + "";
while (numAsString.length < width) {
numAsString = "0" + numAsString;
}
return numAsString;
}
function addZero(vNumber) {
return padWithZeros(vNumber, 2);
}
/**
* Formats the time offset
* Thanks to http://www.svendtofte.com/code/date_format/
* @private
*/
function offset(date) {
// Difference to Greenwich time (GMT) in hours
var os = Math.abs(date.getTimezoneOffset());
var h = String(Math.floor(os/60));
var m = String(os%60);
h.length == 1? h = "0"+h:1;
m.length == 1? m = "0"+m:1;
return date.getTimezoneOffset() < 0 ? "+"+h+m : "-"+h+m;
}
};


@@ -1,109 +1,100 @@
var dateFormat = require('./date_format')
"use strict";
var dateFormat = require('date-format')
, os = require('os')
, eol = os.EOL || '\n'
, util = require('util')
, replacementRegExp = /%[sdj]/g
, layoutMakers = {
"messagePassThrough": function() { return messagePassThroughLayout; }
, "basic": function() { return basicLayout; }
, "colored": function() { return colouredLayout; }
, "coloured": function() { return colouredLayout; }
, "pattern": function (config) {
var pattern = config.pattern || undefined;
return patternLayout(pattern);
}
"messagePassThrough": function() { return messagePassThroughLayout; },
"basic": function() { return basicLayout; },
"colored": function() { return colouredLayout; },
"coloured": function() { return colouredLayout; },
"pattern": function (config) {
return patternLayout(config && config.pattern, config && config.tokens);
}
}
, colours = {
ALL: "grey"
, TRACE: "blue"
, DEBUG: "cyan"
, INFO: "green"
, WARN: "yellow"
, ERROR: "red"
, FATAL: "magenta"
, OFF: "grey"
ALL: "grey",
TRACE: "blue",
DEBUG: "cyan",
INFO: "green",
WARN: "yellow",
ERROR: "red",
FATAL: "magenta",
OFF: "grey"
};
function formatLogData(logData) {
var output = ""
, data = Array.isArray(logData) ? logData.slice() : Array.prototype.slice.call(arguments)
, format = data.shift();
if (typeof format === "string") {
output = format.replace(replacementRegExp, function(match) {
switch (match) {
case "%s": return new String(data.shift());
case "%d": return new Number(data.shift());
case "%j": return JSON.stringify(data.shift());
default:
return match;
};
});
if (data.length > 0) {
output += '\n';
}
function wrapErrorsWithInspect(items) {
return items.map(function(item) {
if ((item instanceof Error) && item.stack) {
return { inspect: function() { return util.format(item) + '\n' + item.stack; } };
} else {
return item;
}
data.forEach(function (item) {
if (item.stack) {
output += item.stack;
} else {
output += util.inspect(item);
}
});
return output;
});
}
function formatLogData(logData) {
var data = Array.isArray(logData) ? logData : Array.prototype.slice.call(arguments);
return util.format.apply(util, wrapErrorsWithInspect(data));
}
var styles = {
//styles
'bold' : [1, 22],
'italic' : [3, 23],
'underline' : [4, 24],
'inverse' : [7, 27],
//grayscale
'white' : [37, 39],
'grey' : [90, 39],
'black' : [90, 39],
//colors
'blue' : [34, 39],
'cyan' : [36, 39],
'green' : [32, 39],
'magenta' : [35, 39],
'red' : [31, 39],
'yellow' : [33, 39]
};
function colorizeStart(style) {
return style ? '\x1B[' + styles[style][0] + 'm' : '';
}
function colorizeEnd(style) {
return style ? '\x1B[' + styles[style][1] + 'm' : '';
}
/**
* Taken from masylum's fork (https://github.com/masylum/log4js-node)
*/
function colorize (str, style) {
var styles = {
//styles
'bold' : [1, 22],
'italic' : [3, 23],
'underline' : [4, 24],
'inverse' : [7, 27],
//grayscale
'white' : [37, 39],
'grey' : [90, 39],
'black' : [90, 39],
//colors
'blue' : [34, 39],
'cyan' : [36, 39],
'green' : [32, 39],
'magenta' : [35, 39],
'red' : [31, 39],
'yellow' : [33, 39]
};
return style ? '\033[' + styles[style][0] + 'm' + str +
'\033[' + styles[style][1] + 'm' : str;
return colorizeStart(style) + str + colorizeEnd(style);
}
function timestampLevelAndCategory(loggingEvent, colour) {
var output = colorize(
formatLogData(
'[%s] [%s] %s - '
, dateFormat.asString(loggingEvent.startTime)
, loggingEvent.level
, loggingEvent.categoryName
)
, colour
);
return output;
var output = colorize(
formatLogData(
'[%s] [%s] %s - '
, dateFormat.asString(loggingEvent.startTime)
, loggingEvent.level
, loggingEvent.category
)
, colour
);
return output;
}
/**
* BasicLayout is a simple layout for storing the logs. The logs are stored
* in following format:
* <pre>
* [startTime] [logLevel] categoryName - message\n
* [startTime] [logLevel] category - message\n
* </pre>
*
* @author Stephan Strittmatter
*/
function basicLayout (loggingEvent) {
return timestampLevelAndCategory(loggingEvent) + formatLogData(loggingEvent.data);
return timestampLevelAndCategory(loggingEvent) + formatLogData(loggingEvent.data);
}
/**
@@ -111,11 +102,14 @@ function basicLayout (loggingEvent) {
* same as basicLayout, but with colours.
*/
function colouredLayout (loggingEvent) {
return timestampLevelAndCategory(loggingEvent, colours[loggingEvent.level.toString()]) + formatLogData(loggingEvent.data);
return timestampLevelAndCategory(
loggingEvent,
colours[loggingEvent.level.toString()]
) + formatLogData(loggingEvent.data);
}
function messagePassThroughLayout (loggingEvent) {
return formatLogData(loggingEvent.data);
return formatLogData(loggingEvent.data);
}
/**
@@ -126,132 +120,196 @@ function messagePassThroughLayout (loggingEvent) {
* - %r time in toLocaleTimeString format
* - %p log level
* - %c log category
* - %h hostname
* - %m log data
* - %d date in various formats
* - %% %
* - %n newline
* Takes a pattern string and returns a layout function.
* - %x{<tokenname>} add dynamic tokens to your log. Tokens are specified in the tokens parameter
* You can use %[ and %] to define a colored block.
*
* Tokens are specified as simple key:value objects.
* The key represents the token name whereas the value can be a string or function
* which is called to extract the value to put in the log message. If a token is not
* found, the field is left unreplaced.
*
* A sample token would be: { "pid" : function() { return process.pid; } }
*
* Takes a pattern string, array of tokens and returns a layout function.
* @param {String} Log format pattern String
* @param {object} map object of different tokens
* @return {Function}
* @author Stephan Strittmatter
* @author Jan Schmidle
*/
function patternLayout (pattern) {
var TTCC_CONVERSION_PATTERN = "%r %p %c - %m%n";
var regex = /%(-?[0-9]+)?(\.?[0-9]+)?([cdmnpr%])(\{([^\}]+)\})?|([^%]+)/;
function patternLayout (pattern, tokens) {
var TTCC_CONVERSION_PATTERN = "%r %p %c - %m%n";
var regex = /%(-?[0-9]+)?(\.?[0-9]+)?([\[\]cdhmnprx%])(\{([^\}]+)\})?|([^%]+)/;
pattern = pattern || TTCC_CONVERSION_PATTERN;
pattern = pattern || TTCC_CONVERSION_PATTERN;
function category(loggingEvent, specifier) {
var loggerName = loggingEvent.category;
if (specifier) {
var precision = parseInt(specifier, 10);
var loggerNameBits = loggerName.split(".");
if (precision < loggerNameBits.length) {
loggerName = loggerNameBits.slice(loggerNameBits.length - precision).join(".");
}
}
return loggerName;
}
return function(loggingEvent) {
var formattedString = "";
var result;
var searchString = pattern;
var formats = {
"ISO8601": dateFormat.ISO8601_FORMAT,
"ISO8601_WITH_TZ_OFFSET": dateFormat.ISO8601_WITH_TZ_OFFSET_FORMAT,
"ABSOLUTE": dateFormat.ABSOLUTETIME_FORMAT,
"DATE": dateFormat.DATETIME_FORMAT
};
while ((result = regex.exec(searchString))) {
var matchedString = result[0];
var padding = result[1];
var truncation = result[2];
var conversionCharacter = result[3];
var specifier = result[5];
var text = result[6];
function formatAsDate(loggingEvent, specifier) {
var format = dateFormat.ISO8601_FORMAT;
if (specifier) {
format = formats[specifier] || specifier;
}
// Format the date
return dateFormat.asString(format, loggingEvent.startTime);
}
function hostname() {
return os.hostname().toString();
}
// Check if the pattern matched was just normal text
if (text) {
formattedString += "" + text;
} else {
// Create a raw replacement string based on the conversion
// character and specifier
var replacement = "";
switch(conversionCharacter) {
case "c":
var loggerName = loggingEvent.categoryName;
if (specifier) {
var precision = parseInt(specifier, 10);
var loggerNameBits = loggingEvent.categoryName.split(".");
if (precision >= loggerNameBits.length) {
replacement = loggerName;
} else {
replacement = loggerNameBits.slice(loggerNameBits.length - precision).join(".");
}
} else {
replacement = loggerName;
}
break;
case "d":
var format = dateFormat.ISO8601_FORMAT;
if (specifier) {
format = specifier;
// Pick up special cases
if (format == "ISO8601") {
format = dateFormat.ISO8601_FORMAT;
} else if (format == "ABSOLUTE") {
format = dateFormat.ABSOLUTETIME_FORMAT;
} else if (format == "DATE") {
format = dateFormat.DATETIME_FORMAT;
}
}
// Format the date
replacement = dateFormat.asString(format, loggingEvent.startTime);
break;
case "m":
replacement = formatLogData(loggingEvent.data);
break;
case "n":
replacement = "\n";
break;
case "p":
replacement = loggingEvent.level.toString();
break;
case "r":
replacement = "" + loggingEvent.startTime.toLocaleTimeString();
break;
case "%":
replacement = "%";
break;
default:
replacement = matchedString;
break;
}
// Format the replacement according to any padding or
// truncation specified
function formatMessage(loggingEvent) {
return formatLogData(loggingEvent.data);
}
function endOfLine() {
return eol;
}
var len;
function logLevel(loggingEvent) {
return loggingEvent.level.toString();
}
// First, truncation
if (truncation) {
len = parseInt(truncation.substr(1), 10);
replacement = replacement.substring(0, len);
}
// Next, padding
if (padding) {
if (padding.charAt(0) == "-") {
len = parseInt(padding.substr(1), 10);
// Right pad with spaces
while (replacement.length < len) {
replacement += " ";
}
} else {
len = parseInt(padding, 10);
// Left pad with spaces
while (replacement.length < len) {
replacement = " " + replacement;
}
}
}
formattedString += replacement;
}
searchString = searchString.substr(result.index + result[0].length);
}
return formattedString;
};
function startTime(loggingEvent) {
return "" + loggingEvent.startTime.toLocaleTimeString();
}
};
function startColour(loggingEvent) {
return colorizeStart(colours[loggingEvent.level.toString()]);
}
function endColour(loggingEvent) {
return colorizeEnd(colours[loggingEvent.level.toString()]);
}
function percent() {
return '%';
}
function userDefined(loggingEvent, specifier) {
if (typeof(tokens[specifier]) !== 'undefined') {
if (typeof(tokens[specifier]) === 'function') {
return tokens[specifier](loggingEvent);
} else {
return tokens[specifier];
}
}
return null;
}
var replacers = {
'c': category,
'd': formatAsDate,
'h': hostname,
'm': formatMessage,
'n': endOfLine,
'p': logLevel,
'r': startTime,
'[': startColour,
']': endColour,
'%': percent,
'x': userDefined
};
function replaceToken(conversionCharacter, loggingEvent, specifier) {
return replacers[conversionCharacter](loggingEvent, specifier);
}
function truncate(truncation, toTruncate) {
var len;
if (truncation) {
len = parseInt(truncation.substr(1), 10);
return toTruncate.substring(0, len);
}
return toTruncate;
}
function pad(padding, toPad) {
var len;
if (padding) {
if (padding.charAt(0) == "-") {
len = parseInt(padding.substr(1), 10);
// Right pad with spaces
while (toPad.length < len) {
toPad += " ";
}
} else {
len = parseInt(padding, 10);
// Left pad with spaces
while (toPad.length < len) {
toPad = " " + toPad;
}
}
}
return toPad;
}
return function(loggingEvent) {
var formattedString = "";
var result;
var searchString = pattern;
while ((result = regex.exec(searchString))) {
var matchedString = result[0];
var padding = result[1];
var truncation = result[2];
var conversionCharacter = result[3];
var specifier = result[5];
var text = result[6];
// Check if the pattern matched was just normal text
if (text) {
formattedString += "" + text;
} else {
// Create a raw replacement string based on the conversion
// character and specifier
var replacement =
replaceToken(conversionCharacter, loggingEvent, specifier) ||
matchedString;
// Format the replacement according to any padding or
// truncation specified
replacement = truncate(truncation, replacement);
replacement = pad(padding, replacement);
formattedString += replacement;
}
searchString = searchString.substr(result.index + result[0].length);
}
return formattedString;
};
}
module.exports = {
basicLayout: basicLayout
, messagePassThroughLayout: messagePassThroughLayout
, patternLayout: patternLayout
, colouredLayout: colouredLayout
, coloredLayout: colouredLayout
, layout: function(name, config) {
return layoutMakers[name] && layoutMakers[name](config);
}
};
basicLayout: basicLayout,
messagePassThroughLayout: messagePassThroughLayout,
patternLayout: patternLayout,
colouredLayout: colouredLayout,
coloredLayout: colouredLayout,
layout: function(name, config) {
return layoutMakers[name] && layoutMakers[name](config);
}
};
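To make the %x token description above concrete, here is a small hedged sketch that builds a pattern layout directly and formats a hand-rolled logging event; the event fields follow lib/logger.js below, and the require paths assume the repository root.
var layouts = require('./lib/layouts')
  , levels = require('./lib/levels');
var layout = layouts.patternLayout(
  "%d{ABSOLUTE} %[%p%] %c{1} %x{pid} - %m",
  { pid: function() { return process.pid; } }
);
console.log(layout({
  startTime: new Date(),
  category: "sketch.example",
  level: levels.INFO,
  data: [ "user-defined token in action" ]
}));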


@@ -1,56 +1,84 @@
"use strict";
function Level(level, levelStr) {
this.level = level;
this.levelStr = levelStr;
this.level = level;
this.levelStr = levelStr;
}
/**
* converts given String to corresponding Level
* @param {String} sArg String value of Level
* @param {String} sArg String value of Level OR Log4js.Level
* @param {Log4js.Level} defaultLevel default Level, if no String representation
* @return Level object
* @type Log4js.Level
*/
function toLevel(sArg, defaultLevel) {
if (sArg === null) {
return defaultLevel;
}
if (typeof sArg == "string") {
var s = sArg.toUpperCase();
if (module.exports[s]) {
return module.exports[s];
}
}
if (!sArg) {
return defaultLevel;
};
}
if (typeof sArg == "string") {
var s = sArg.toUpperCase();
if (module.exports[s]) {
return module.exports[s];
} else {
return defaultLevel;
}
}
return toLevel(sArg.toString());
}
Level.prototype.toString = function() {
return this.levelStr;
return this.levelStr;
};
Level.prototype.isLessThanOrEqualTo = function(otherLevel) {
function convertAndCompare(comparison) {
return function(otherLevel) {
if (typeof otherLevel === "string") {
otherLevel = Level.toLevel(otherLevel);
otherLevel = toLevel(otherLevel);
}
return comparison.call(this, otherLevel);
};
}
Level.prototype.isLessThanOrEqualTo = convertAndCompare(
function(otherLevel) {
return this.level <= otherLevel.level;
};
}
);
Level.prototype.isGreaterThanOrEqualTo = function(otherLevel) {
if (typeof otherLevel === "string") {
otherLevel = Level.toLevel(otherLevel);
}
Level.prototype.isGreaterThanOrEqualTo = convertAndCompare(
function(otherLevel) {
return this.level >= otherLevel.level;
};
}
);
module.exports = {
ALL: new Level(Number.MIN_VALUE, "ALL")
, TRACE: new Level(5000, "TRACE")
, DEBUG: new Level(10000, "DEBUG")
, INFO: new Level(20000, "INFO")
, WARN: new Level(30000, "WARN")
, ERROR: new Level(40000, "ERROR")
, FATAL: new Level(50000, "FATAL")
, OFF: new Level(Number.MAX_VALUE, "OFF")
, toLevel: toLevel
};
Level.prototype.isEqualTo = convertAndCompare(
function(otherLevel) {
return this.level === otherLevel.level;
}
);
exports.ALL = new Level(Number.MIN_VALUE, "ALL");
exports.TRACE = new Level(5000, "TRACE");
exports.DEBUG = new Level(10000, "DEBUG");
exports.INFO = new Level(20000, "INFO");
exports.WARN = new Level(30000, "WARN");
exports.ERROR = new Level(40000, "ERROR");
exports.FATAL = new Level(50000, "FATAL");
exports.OFF = new Level(Number.MAX_VALUE, "OFF");
exports.levels = [
exports.OFF,
exports.TRACE,
exports.DEBUG,
exports.INFO,
exports.WARN,
exports.ERROR,
exports.FATAL
];
exports.toLevel = toLevel;
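A brief hedged sketch of the comparison helpers, which accept either a Level instance or a string thanks to convertAndCompare; the require path again assumes the repository root.
var levels = require('./lib/levels');
var info = levels.toLevel("info");                       // lookup is case-insensitive
console.log(info.isGreaterThanOrEqualTo("DEBUG"));       // true
console.log(info.isLessThanOrEqualTo(levels.WARN));      // true
console.log(info.isEqualTo("INFO"));                     // true
console.log(levels.toLevel("nonsense", levels.DEBUG));   // falls back to the default, DEBUG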


@@ -1,3 +1,4 @@
"use strict";
/*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -12,31 +13,32 @@
* limitations under the License.
*/
/*jsl:option explicit*/
/**
* @fileoverview log4js is a library for logging in JavaScript, in a similar manner
* to log4j for Java. The API should be nearly the same.
*
* This file contains all log4js code and is the only file required for logging.
*
* <h3>Example:</h3>
* <pre>
* var logging = require('log4js');
* //add an appender that logs all messages to stdout.
* logging.addAppender(logging.consoleAppender());
* //add an appender that logs "some-category" to a file
* logging.addAppender(logging.fileAppender("file.log"), "some-category");
* logging.configure({
* appenders: {
* "errorFile": { type: "file", filename: "error.log" }
* },
* categories: {
* "default": { level: "ERROR", appenders: [ "errorFile" ] }
* }
* });
* //get a logger
* var log = logging.getLogger("some-category");
* log.setLevel(logging.levels.TRACE); //set the Level
*
* ...
*
* //call the log
* log.trace("trace me" );
* log.error("oh noes");
* </pre>
*
* NOTE: the authors below are the original browser-based log4js authors;
* don't try to contact them about bugs in this version :)
* @version 1.0
* @author Stephan Strittmatter - http://jroller.com/page/stritti
* @author Seth Chisamore - http://www.chisamore.com
@@ -44,378 +46,231 @@
* @static
* Website: http://log4js.berlios.de
*/
var events = require('events')
var debug = require('debug')('log4js:core')
, fs = require('fs')
, path = require('path')
, sys = require('sys')
, cluster = require('cluster')
, util = require('util')
, layouts = require('./layouts')
, levels = require('./levels')
, DEFAULT_CATEGORY = '[default]'
, ALL_CATEGORIES = '[all]'
, Logger = require('./logger')
, appenders = {}
, loggers = {}
, appenderMakers = {
"file": function(config, fileAppender) {
var layout;
if (config.layout) {
layout = layouts.layout(config.layout.type, config.layout);
}
return fileAppender(config.filename, layout, config.maxLogSize, config.backups, config.pollInterval);
},
"console": function(config, fileAppender, consoleAppender) {
var layout;
if (config.layout) {
layout = layouts.layout(config.layout.type, config.layout);
}
return consoleAppender(layout);
},
"logLevelFilter": function(config, fileAppender, consoleAppender) {
var appender = appenderMakers[config.appender.type](config.appender, fileAppender, consoleAppender);
return logLevelFilter(config.level, appender);
}
, categories = {}
, appenderMakers = {}
, defaultConfig = {
appenders: {
console: { type: "console" }
},
categories: {
default: { level: levels.DEBUG, appenders: [ "console" ] }
}
};
function serialise(event) {
return JSON.stringify(event);
}
function deserialise(serialised) {
var event;
try {
event = JSON.parse(serialised);
event.startTime = new Date(event.startTime);
event.level = levels.toLevel(event.level.levelStr);
} catch(e) {
event = {
startTime: new Date(),
category: 'log4js',
level: levels.ERROR,
data: [ 'Unable to parse log:', serialised ]
};
}
return event;
}
//in a multi-process node environment, worker loggers will use
//process.send
cluster.on('fork', function(worker) {
debug('listening to worker: ', worker);
worker.on('message', function(message) {
if (message.type && message.type === '::log4js-message') {
debug("received message: ", message.event);
dispatch(deserialise(message.event));
}
});
});
/**
* Get a logger instance. Instance is cached on categoryName level.
* @param {String} categoryName name of category to log to.
* Get a logger instance.
* @param {String} category to log to.
* @return {Logger} instance of logger for the category
* @static
*/
function getLogger (categoryName) {
function getLogger (category) {
debug("getLogger(", category, ")");
// Use default logger if categoryName is not specified or invalid
if (!(typeof categoryName == "string")) {
categoryName = DEFAULT_CATEGORY;
}
return new Logger(
cluster.isMaster ? dispatch : workerDispatch,
category || 'default'
);
}
var appenderList;
if (!loggers[categoryName]) {
// Create the logger for this name if it doesn't already exist
loggers[categoryName] = new Logger(categoryName);
if (appenders[categoryName]) {
appenderList = appenders[categoryName];
appenderList.forEach(function(appender) {
loggers[categoryName].addListener("log", appender);
});
}
if (appenders[ALL_CATEGORIES]) {
appenderList = appenders[ALL_CATEGORIES];
appenderList.forEach(function(appender) {
loggers[categoryName].addListener("log", appender);
});
}
}
return loggers[categoryName];
function workerDispatch(event) {
process.send({ type: "::log4js-message", event: serialise(event) });
}
/**
* args are appender, then zero or more categories
* Log event routing to appenders
* This would be a good place to implement category hierarchies/wildcards, etc
*/
function addAppender () {
var args = Array.prototype.slice.call(arguments);
var appender = args.shift();
if (args.length == 0 || args[0] === undefined) {
args = [ ALL_CATEGORIES ];
}
//argument may already be an array
if (Array.isArray(args[0])) {
args = args[0];
}
function dispatch(event) {
debug("event is ", event);
var category = categories[event.category] || categories.default;
debug(
"category.level[",
category.level,
"] <= ",
event.level,
" ? ",
category.level.isLessThanOrEqualTo(event.level)
);
args.forEach(function(category) {
if (!appenders[category]) {
appenders[category] = [];
}
appenders[category].push(appender);
if (category === ALL_CATEGORIES) {
for (var logger in loggers) {
if (loggers.hasOwnProperty(logger)) {
loggers[logger].addListener("log", appender);
}
}
} else if (loggers[category]) {
loggers[category].addListener("log", appender);
}
if (category.level.isLessThanOrEqualTo(event.level)) {
category.appenders.forEach(function(appender) {
appenders[appender](event);
});
}
}
function load(file) {
debug("loading ", file);
var contents = fs.readFileSync(file, "utf-8");
debug("file contents ", contents);
return JSON.parse(contents);
}
function configure(configurationFileOrObject) {
debug("configure(", configurationFileOrObject, ")");
debug("process.env.LOG4JS_CONFIG = ", process.env.LOG4JS_CONFIG);
var filename, config = process.env.LOG4JS_CONFIG || configurationFileOrObject;
debug("config ", config);
if (!config || !(typeof config === 'string' || typeof config === 'object')) {
throw new Error("You must specify configuration as an object or a filename.");
}
if (typeof config === 'string') {
debug("config is string");
filename = config;
config = load(filename);
}
if (!config.appenders || !Object.keys(config.appenders).length) {
throw new Error("You must specify at least one appender.");
}
configureAppenders(config.appenders);
validateCategories(config.categories);
categories = config.categories;
}
function validateCategories(cats) {
if (!cats || !cats.default) {
throw new Error("You must specify an appender for the default category");
}
Object.keys(cats).forEach(function(categoryName) {
var category = cats[categoryName], inputLevel = category.level;
if (!category.level) {
throw new Error("You must specify a level for category '" + categoryName + "'.");
}
category.level = levels.toLevel(inputLevel);
if (!category.level) {
throw new Error(
"Level '" + inputLevel +
"' is not valid for category '" + categoryName +
"'. Acceptable values are: " + levels.levels.join(', ') + "."
);
}
if (!category.appenders || !category.appenders.length) {
throw new Error("You must specify an appender for category '" + categoryName + "'.");
}
category.appenders.forEach(function(appender) {
if (!appenders[appender]) {
throw new Error(
"Appender '" + appender +
"' for category '" + categoryName +
"' does not exist. Known appenders are: " + Object.keys(appenders).join(', ') + "."
);
}
});
});
}
function clearAppenders () {
appenders = {};
for (var logger in loggers) {
if (loggers.hasOwnProperty(logger)) {
loggers[logger].removeAllListeners("log");
}
}
debug("clearing appenders and appender makers");
appenders = {};
appenderMakers = {};
}
function configureAppenders(appenderList, fileAppender, consoleAppender) {
clearAppenders();
if (appenderList) {
appenderList.forEach(function(appenderConfig) {
var appender = appenderMakers[appenderConfig.type](appenderConfig, fileAppender, consoleAppender);
if (appender) {
addAppender(appender, appenderConfig.category);
} else {
throw new Error("log4js configuration problem for "+sys.inspect(appenderConfig));
}
});
} else {
addAppender(consoleAppender);
}
function appenderByName(name) {
if (appenders.hasOwnProperty(name)) {
return appenders[name];
} else {
throw new Error("Appender '" + name + "' not found.");
}
}
function configureLevels(levels) {
if (levels) {
for (var category in levels) {
if (levels.hasOwnProperty(category)) {
getLogger(category).setLevel(levels[category]);
}
}
}
}
/**
* Models a logging event.
* @constructor
* @param {String} categoryName name of category
* @param {Log4js.Level} level level of message
* @param {Array} data objects to log
* @param {Log4js.Logger} logger the associated logger
* @author Seth Chisamore
*/
function LoggingEvent (categoryName, level, data, logger) {
this.startTime = new Date();
this.categoryName = categoryName;
this.data = data;
this.level = level;
this.logger = logger;
}
/**
* Logger to log messages.
* use {@see Log4js#getLogger(String)} to get an instance.
* @constructor
* @param name name of category to log to
* @author Stephan Strittmatter
*/
function Logger (name, level) {
this.category = name || DEFAULT_CATEGORY;
if (! this.level) {
this.__proto__.level = levels.TRACE;
}
}
sys.inherits(Logger, events.EventEmitter);
Logger.prototype.setLevel = function(level) {
this.level = levels.toLevel(level, levels.TRACE);
};
Logger.prototype.removeLevel = function() {
delete this.level;
};
Logger.prototype.log = function(logLevel, args) {
var data = Array.prototype.slice.call(args)
, loggingEvent = new LoggingEvent(this.category, logLevel, data, this);
this.emit("log", loggingEvent);
};
Logger.prototype.isLevelEnabled = function(otherLevel) {
return this.level.isLessThanOrEqualTo(otherLevel);
};
['Trace','Debug','Info','Warn','Error','Fatal'].forEach(
function(levelString) {
var level = levels.toLevel(levelString);
Logger.prototype['is'+levelString+'Enabled'] = function() {
return this.isLevelEnabled(level);
};
Logger.prototype[levelString.toLowerCase()] = function () {
if (this.isLevelEnabled(level)) {
this.log(level, arguments);
}
};
}
);
function setGlobalLogLevel(level) {
Logger.prototype.level = levels.toLevel(level, levels.TRACE);
}
/**
* Get the default logger instance.
* @return {Logger} instance of default logger
* @static
*/
function getDefaultLogger () {
return getLogger(DEFAULT_CATEGORY);
}
function logLevelFilter (levelString, appender) {
var level = levels.toLevel(levelString);
return function(logEvent) {
if (logEvent.level.isGreaterThanOrEqualTo(level)) {
appender(logEvent);
}
}
}
function consoleAppender (layout) {
layout = layout || layouts.colouredLayout;
return function(loggingEvent) {
console._preLog4js_log(layout(loggingEvent));
};
}
/**
* File Appender writing the logs to a text file. Supports rolling of logs by size.
*
* @param file file log messages will be written to
* @param layout a function that takes a logevent and returns a string (defaults to basicLayout).
* @param logSize - the maximum size (in bytes) for a log file, if not provided then logs won't be rotated.
* @param numBackups - the number of log files to keep after logSize has been reached (default 5)
* @param filePollInterval - the time in seconds between file size checks (default 30s)
*/
function fileAppender (file, layout, logSize, numBackups, filePollInterval) {
layout = layout || layouts.basicLayout;
var logFile = fs.createWriteStream(file, { flags: 'a', mode: 0644, encoding: 'utf8' });
if (logSize > 0) {
setupLogRolling(logFile, file, logSize, numBackups || 5, (filePollInterval * 1000) || 30000);
}
//close the file on process exit.
process.on('exit', function() {
logFile.end();
logFile.destroySoon();
});
return function(loggingEvent) {
logFile.write(layout(loggingEvent)+'\n');
};
}
function setupLogRolling (logFile, filename, logSize, numBackups, filePollInterval) {
fs.watchFile(
filename,
{
persistent: false,
interval: filePollInterval
},
function (curr, prev) {
if (curr.size >= logSize) {
rollThatLog(logFile, filename, numBackups);
}
}
);
}
function rollThatLog (logFile, filename, numBackups) {
//first close the current one.
logFile.end();
logFile.destroySoon();
//roll the backups (rename file.n-1 to file.n, where n <= numBackups)
for (var i=numBackups; i > 0; i--) {
if (i > 1) {
if (fileExists(filename + '.' + (i-1))) {
fs.renameSync(filename+'.'+(i-1), filename+'.'+i);
}
} else {
fs.renameSync(filename, filename+'.1');
}
}
//open it up again
logFile = fs.createWriteStream(filename, { flags: 'a', mode: 0644, encoding: "utf8" });
}
function fileExists (filename) {
function configureAppenders(appenderMap) {
clearAppenders();
Object.keys(appenderMap).forEach(function(appenderName) {
var appender, appenderConfig = appenderMap[appenderName];
loadAppender(appenderConfig.type);
try {
fs.statSync(filename);
return true;
appenders[appenderName] = appenderMakers[appenderConfig.type](
appenderConfig,
appenderByName
);
} catch(e) {
throw new Error(
"log4js configuration problem for appender '" + appenderName +
"'. Error was " + e.stack
);
}
});
}
function loadAppender(appender) {
var appenderModule;
if (!appenderMakers[appender]) {
debug("Loading appender ", appender);
try {
appenderModule = require('./appenders/' + appender);
} catch (e) {
return false;
try {
debug("Appender ", appender, " is not a core log4js appender.");
appenderModule = require(appender);
} catch (err) {
debug("Error loading appender %s: ", appender, err);
throw new Error("Could not load appender of type '" + appender + "'.");
}
}
appenderMakers[appender] = appenderModule(layouts, levels);
}
}
function configure (configurationFileOrObject) {
var config = configurationFileOrObject;
if (typeof(config) === "string") {
config = JSON.parse(fs.readFileSync(config, "utf8"));
}
if (config) {
try {
configureAppenders(config.appenders, fileAppender, consoleAppender);
configureLevels(config.levels);
} catch (e) {
throw new Error("Problem reading log4js config " + sys.inspect(config) + ". Error was \"" + e.message + "\" ("+e.stack+")");
}
}
}
function findConfiguration() {
//add current directory onto the list of configPaths
var paths = ['.'].concat(require.paths);
//add this module's directory to the end of the list, so that we pick up the default config
paths.push(__dirname);
var pathsWithConfig = paths.filter( function (pathToCheck) {
try {
fs.statSync(path.join(pathToCheck, "log4js.json"));
return true;
} catch (e) {
return false;
}
});
if (pathsWithConfig.length > 0) {
return path.join(pathsWithConfig[0], 'log4js.json');
}
return undefined;
}
function replaceConsole(logger) {
function replaceWith(fn) {
return function() {
fn.apply(logger, arguments);
}
}
['log','debug','info','warn','error'].forEach(function (item) {
console['_preLog4js_'+item] = console[item];
console[item] = replaceWith(item === 'log' ? logger.info : logger[item]);
});
}
//set ourselves up if we can find a default log4js.json
configure(findConfiguration());
//replace console.log, etc with log4js versions
replaceConsole(getLogger("console"));
module.exports = {
getLogger: getLogger,
getDefaultLogger: getDefaultLogger,
addAppender: addAppender,
clearAppenders: clearAppenders,
configure: configure,
levels: levels,
setGlobalLogLevel: setGlobalLogLevel,
consoleAppender: consoleAppender,
fileAppender: fileAppender,
logLevelFilter: logLevelFilter,
layouts: layouts,
connectLogger: require('./connect-logger').connectLogger(this)
getLogger: getLogger,
configure: configure
};
//keep the old-style layouts
['basicLayout','messagePassThroughLayout','colouredLayout','coloredLayout'].forEach(function(item) {
module.exports[item] = layouts[item];
});
//set ourselves up
debug("Starting configuration");
configure(defaultConfig);
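One consequence of loadAppender's require fallback above is that a non-core appender only has to export a function taking (layouts, levels) and returning a configure function, the same shape as lib/appenders/console.js. A hypothetical skeleton is sketched below; the module name "log4js-memory" and its behaviour are invented for illustration, not part of this changeset.
// hypothetical module "log4js-memory" - log4js would call
// require("log4js-memory")(layouts, levels) on seeing an appender of this type
module.exports = function(layouts, levels) {
  return function configure(config) {
    var layout = config.layout ?
      layouts.layout(config.layout.type, config.layout) :
      layouts.basicLayout;
    var store = config.store || [];
    // the returned function is the appender itself
    return function(loggingEvent) {
      store.push(layout(loggingEvent));
    };
  };
};
Once such a module is installed, it would be referenced from configure() simply as an appender with type "log4js-memory".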


@@ -1,7 +0,0 @@
{
"appenders": [
{
"type": "console"
}
]
}

lib/logger.js Normal file

@@ -0,0 +1,49 @@
"use strict";
var debug = require('debug')('log4js:logger')
, levels = require('./levels');
module.exports = function Logger(dispatch, category) {
if (typeof dispatch !== 'function') {
throw new Error("Logger must have a dispatch delegate.");
}
if (!category) {
throw new Error("Logger must have a category.");
}
function log() {
var args = Array.prototype.slice.call(arguments)
, logLevel = args.shift()
, loggingEvent = new LoggingEvent(category, logLevel, args);
debug("Logging event ", loggingEvent, " to dispatch = ", dispatch);
dispatch(loggingEvent);
}
var self = this;
['trace','debug','info','warn','error','fatal'].forEach(
function(level) {
self[level] = function() {
var args = Array.prototype.slice.call(arguments);
args.unshift(level);
log.apply(this, args);
};
}
);
};
/**
* Models a logging event.
* @constructor
* @param {String} category name of category
* @param {Log4js.Level} level level of message
* @param {Array} data objects to log
* @author Seth Chisamore
*/
function LoggingEvent (category, level, data) {
this.startTime = new Date();
this.category = category;
this.data = data;
this.level = levels.toLevel(level);
}
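Logger is now a thin wrapper around a dispatch callback. A hedged sketch exercising it in isolation; the event fields are those defined by LoggingEvent above, and the path assumes the repository root.
var Logger = require('./lib/logger');
var captured = [];
var log = new Logger(function(event) { captured.push(event); }, "sketch");
log.info("first message", { id: 42 });
log.error("second message");
console.log(captured[0].category);           // "sketch"
console.log(captured[0].level.toString());   // "INFO"
console.log(captured[1].data);               // [ 'second message' ]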


@@ -1,27 +1,44 @@
{
"name": "log4js",
"version": "0.3.0",
"description": "Port of Log4js to work with node.",
"keywords": [
"logging",
"log",
"log4j",
"node"
],
"main": "./lib/log4js",
"author": "Gareth Jones <gareth.jones@sensis.com.au>",
"bugs": {
"web": "http://github.com/csausdev/log4js-node/issues"
},
"engines": [ "node >=0.1.100" ],
"scripts": {
"test": "vows test/*.js"
},
"directories": {
"test": "test",
"lib": "lib"
},
"devDependencies": {
"vows": ">=0.5.2"
}
"name": "log4js",
"version": "0.7.0",
"description": "Port of Log4js to work with node.",
"keywords": [
"logging",
"log",
"log4j",
"node"
],
"main": "./lib/log4js",
"author": "Gareth Jones <gareth.jones@sensis.com.au>",
"repository": {
"type": "git",
"url": "https://github.com/nomiddlename/log4js-node.git"
},
"bugs": {
"url": "http://github.com/nomiddlename/log4js-node/issues"
},
"engines": {
"node": ">=0.8"
},
"scripts": {
"test": "mocha --recursive"
},
"directories": {
"test": "test",
"lib": "lib"
},
"dependencies": {
"debug": "~0.7.2",
"streamroller": "0.0.1",
"date-format": "0.0.0"
},
"devDependencies": {
"async": "0.1.15",
"sandboxed-module": "0.1.3",
"mocha": "~1.12.0",
"should": "~1.2.2"
},
"browser": {
"os": false
}
}


@@ -0,0 +1,99 @@
'use strict';
var async = require('async')
, should = require('should')
, fs = require('fs')
, path = require('path')
, assert = require('assert');
function remove() {
var files = Array.prototype.slice.call(arguments);
return function(done) {
async.forEach(
files.map(function(file) { return path.join(__dirname, file); }),
fs.unlink.bind(fs),
function() { done(); }
);
};
}
describe('log4js', function() {
before(
remove(
'test-category-filter-web.log',
'test-category-filter-all.log'
)
);
after(
remove(
'test-category-filter-web.log',
'test-category-filter-all.log'
)
);
describe('category filtering', function() {
before(function() {
var log4js = require('../lib/log4js')
, webLogger = log4js.getLogger("web")
, appLogger = log4js.getLogger("app");
log4js.configure({
appenders: {
rest: {
type: "file",
layout: { type: "messagePassThrough" },
filename: path.join(__dirname, "test-category-filter-all.log")
},
web: {
type: "file",
layout: { type: "messagePassThrough"},
filename: path.join(__dirname, "test-category-filter-web.log")
}
},
categories: {
"default": { level: "debug", appenders: [ "rest" ] },
web: { level: "debug", appenders: [ "web" ] }
}
});
webLogger.debug('This should get logged');
appLogger.debug('This should not');
webLogger.debug('Hello again');
log4js.getLogger('db').debug('This shouldn\'t be included by the appender anyway');
});
it('should only pass matching category', function(done) {
setTimeout(function() {
fs.readFile(
path.join(__dirname, 'test-category-filter-web.log'),
'utf8',
function(err, contents) {
var lines = contents.trim().split('\n');
lines.should.eql(["This should get logged", "Hello again"]);
done(err);
}
);
}, 50);
});
it('should send everything else to default appender', function(done) {
setTimeout(function() {
fs.readFile(
path.join(__dirname, 'test-category-filter-all.log'),
'utf8',
function(err, contents) {
var lines = contents.trim().split('\n');
lines.should.eql([
"This should not",
"This shouldn't be included by the appender anyway"
]);
done(err);
}
);
}, 50);
});
});
});

test/clusteredAppender-test.js Executable file

@@ -0,0 +1,147 @@
"use strict";
var should = require('should')
, sandbox = require('sandboxed-module');
describe('log4js in a cluster', function() {
describe('when in master mode', function() {
var log4js
, clusterOnFork = false
, workerCb
, events = []
, worker = {
on: function(evt, cb) {
evt.should.eql('message');
this.cb = cb;
}
};
before(function() {
log4js = sandbox.require(
'../lib/log4js',
{
requires: {
'cluster': {
isMaster: true,
on: function(evt, cb) {
evt.should.eql('fork');
clusterOnFork = true;
cb(worker);
}
},
'./appenders/console': function() {
return function() {
return function(event) {
events.push(event);
};
};
}
}
}
);
});
it('should listen for fork events', function() {
clusterOnFork.should.eql(true);
});
it('should listen for messages from workers', function() {
//workerCb was created in a different context to the test
//(thanks to sandbox.require), so doesn't pick up the should prototype
(typeof worker.cb).should.eql('function');
});
it('should log valid ::log4js-message events', function() {
worker.cb({
type: '::log4js-message',
event: JSON.stringify({
startTime: '2010-10-10 18:54:06',
category: 'cheese',
level: { levelStr: 'DEBUG' },
data: [ "blah" ]
})
});
events.should.have.length(1);
events[0].data[0].should.eql("blah");
events[0].category.should.eql('cheese');
//startTime was created in a different context to the test
//(thanks to sandbox.require), so instanceof doesn't think
//it's a Date.
events[0].startTime.constructor.name.should.eql('Date');
events[0].level.toString().should.eql('DEBUG');
});
it('should handle invalid ::log4js-message events', function() {
worker.cb({
type: '::log4js-message',
event: "biscuits"
});
worker.cb({
type: '::log4js-message',
event: JSON.stringify({
startTime: 'whatever'
})
});
events.should.have.length(3);
events[1].data[0].should.eql('Unable to parse log:');
events[1].data[1].should.eql('biscuits');
events[1].category.should.eql('log4js');
events[1].level.toString().should.eql('ERROR');
events[2].data[0].should.eql('Unable to parse log:');
events[2].data[1].should.eql(JSON.stringify({ startTime: 'whatever'}));
});
it('should ignore other events', function() {
worker.cb({
type: "::blah-blah",
event: "blah"
});
events.should.have.length(3);
});
});
describe('when in worker mode', function() {
var log4js, events = [];
before(function() {
log4js = sandbox.require(
'../lib/log4js',
{
requires: {
'cluster': {
isMaster: false,
on: function() {}
}
},
globals: {
'process': {
'send': function(event) {
events.push(event);
},
'env': {
}
}
}
}
);
log4js.getLogger('test').debug("just testing");
});
it('should emit ::log4js-message events', function() {
events.should.have.length(1);
events[0].type.should.eql('::log4js-message');
events[0].event.should.be.a('string');
var evt = JSON.parse(events[0].event);
evt.category.should.eql('test');
evt.level.levelStr.should.eql('DEBUG');
evt.data[0].should.eql('just testing');
});
});
});


@@ -0,0 +1,31 @@
"use strict";
var should = require('should')
, sandbox = require('sandboxed-module');
describe('../lib/appenders/console', function() {
var messages = [];
before(function() {
var fakeConsole = {
log: function(msg) { messages.push(msg); }
}
, appenderModule = sandbox.require(
'../lib/appenders/console',
{
globals: {
'console': fakeConsole
}
}
)
, appender = appenderModule(require('../lib/layouts'))(
{ layout: { type: "messagePassThrough" } }
);
appender({ data: ["blah"] });
});
it('should output to console', function() {
messages.should.eql(["blah"]);
});
});


@@ -0,0 +1,221 @@
"use strict";
/*jshint expr:true */
var should = require('should')
, async = require('async')
, path = require('path')
, fs = require('fs')
, sandbox = require('sandboxed-module');
function remove(filename, cb) {
fs.unlink(path.join(__dirname, filename), function(err) {
cb();
});
}
describe('../lib/appenders/dateFile', function() {
describe('adding multiple dateFileAppenders', function() {
var files = [], initialListeners;
before(function() {
var dateFileAppender = require('../lib/appenders/dateFile')({ basicLayout: function() {} }),
count = 5,
logfile;
initialListeners = process.listeners('exit').length;
while (count--) {
logfile = path.join(__dirname, 'datefa-default-test' + count + '.log');
dateFileAppender({
filename: logfile
});
files.push(logfile);
}
});
after(function(done) {
async.forEach(files, remove, done);
});
it('should only add one `exit` listener', function () {
process.listeners('exit').length.should.be.below(initialListeners + 2);
});
});
describe('exit listener', function() {
var openedFiles = [];
before(function() {
var exitListener
, dateFileAppender = sandbox.require(
'../lib/appenders/dateFile',
{
globals: {
process: {
on: function(evt, listener) {
exitListener = listener;
}
}
},
requires: {
'streamroller': {
DateRollingFileStream: function(filename) {
openedFiles.push(filename);
this.end = function() {
openedFiles.shift();
};
}
}
}
}
)({ basicLayout: function() {} });
for (var i=0; i < 5; i += 1) {
dateFileAppender({
filename: 'test' + i
});
}
openedFiles.should.not.be.empty;
exitListener();
});
it('should close all open files', function() {
openedFiles.should.be.empty;
});
});
describe('with default settings', function() {
var contents;
before(function(done) {
var testFile = path.join(__dirname, 'date-appender-default.log'),
log4js = require('../lib/log4js'),
logger = log4js.getLogger('default-settings');
log4js.configure({
appenders: {
"date": { type: "dateFile", filename: testFile }
},
categories: {
default: { level: "debug", appenders: [ "date" ] }
}
});
logger.info("This should be in the file.");
setTimeout(function() {
fs.readFile(testFile, "utf8", function(err, data) {
contents = data;
done(err);
});
}, 100);
});
after(function(done) {
remove('date-appender-default.log', done);
});
it('should write to the file', function() {
contents.should.include('This should be in the file');
});
it('should use the basic layout', function() {
contents.should.match(
/\[\d{4}-\d{2}-\d{2}\s\d{2}:\d{2}:\d{2}\.\d{3}\] \[INFO\] default-settings - /
);
});
});
describe('configure', function() {
describe('with dateFileAppender', function() {
var contents;
before(function(done) {
var log4js = require('../lib/log4js')
, logger = log4js.getLogger('tests');
//this config file defines one file appender (to ./date-file-test.log)
//and sets the log level for "tests" to WARN
log4js.configure('test/with-dateFile.json');
logger.info('this should not be written to the file');
logger.warn('this should be written to the file');
fs.readFile(path.join(__dirname, 'date-file-test.log'), 'utf8', function(err, data) {
contents = data;
done(err);
});
});
after(function(done) {
remove('date-file-test.log', done);
});
it('should load appender configuration from a json file', function() {
contents.should.include('this should be written to the file' + require('os').EOL);
contents.should.not.include('this should not be written to the file');
});
});
describe('with options.alwaysIncludePattern', function() {
var contents, thisTime;
before(function(done) {
var log4js = require('../lib/log4js')
, format = require('date-format')
, logger
, options = {
"appenders": {
"datefile": {
"type": "dateFile",
"filename": "test/date-file-test",
"pattern": "-from-MM-dd.log",
"alwaysIncludePattern": true,
"layout": {
"type": "messagePassThrough"
}
}
},
categories: { default: { level: "debug", appenders: [ "datefile" ] } }
};
thisTime = format.asString(options.appenders.datefile.pattern, new Date());
fs.writeFile(
path.join(__dirname, 'date-file-test' + thisTime),
"this is existing data" + require('os').EOL,
'utf8',
function(err) {
log4js.configure(options);
logger = log4js.getLogger('tests');
logger.warn('this should be written to the file with the appended date');
//wait for filesystem to catch up
setTimeout(function() {
fs.readFile(
path.join(__dirname, 'date-file-test' + thisTime),
'utf8',
function(err, data) {
contents = data;
done(err);
}
);
}, 200);
}
);
});
after(function(done) {
remove('date-file-test' + thisTime, done);
});
it('should create file with the correct pattern', function() {
contents.should.include('this should be written to the file with the appended date');
});
it('should not overwrite the file on open (bug found in issue #132)', function() {
contents.should.include('this is existing data');
});
});
});
});


@@ -1,23 +0,0 @@
var vows = require('vows')
, assert = require('assert')
, dateFormat = require('../lib/date_format');
vows.describe('date_format').addBatch({
'Date extensions': {
topic: function() {
return new Date(2010, 0, 11, 14, 31, 30, 5);
},
'should format a date as string using a pattern': function(date) {
assert.equal(
dateFormat.asString(dateFormat.DATETIME_FORMAT, date),
"11 01 2010 14:31:30.005"
);
},
'should default to the ISO8601 format': function(date) {
assert.equal(
dateFormat.asString(date),
'2010-01-11 14:31:30.005'
);
}
}
}).export(module);

test/fileAppender-test.js Normal file

@@ -0,0 +1,335 @@
"use strict";
/*jshint expr:true */
var fs = require('fs')
, async = require('async')
, path = require('path')
, sandbox = require('sandboxed-module')
, log4js = require('../lib/log4js')
, should = require('should');
function remove(filename, cb) {
fs.unlink(filename, function(err) { cb(); });
}
describe('log4js fileAppender', function() {
describe('adding multiple fileAppenders', function() {
var files = [], initialCount, listenersCount;
before(function() {
var logfile
, count = 5
, config = {
appenders: {},
categories: { default: { level: "debug", appenders: ["file0"] } }
};
initialCount = process.listeners('exit').length;
while (count--) {
logfile = path.join(__dirname, '/fa-default-test' + count + '.log');
config.appenders["file" + count] = { type: "file", filename: logfile };
files.push(logfile);
}
log4js.configure(config);
listenersCount = process.listeners('exit').length;
});
after(function(done) {
async.forEach(files, remove, done);
});
it('does not add more than one `exit` listeners', function () {
listenersCount.should.be.below(initialCount + 2);
});
});
describe('exit listener', function() {
var openedFiles = [];
before(function() {
var exitListener
, fileAppender = sandbox.require(
'../lib/appenders/file',
{
globals: {
process: {
on: function(evt, listener) {
exitListener = listener;
}
}
},
requires: {
'streamroller': {
RollingFileStream: function(filename) {
openedFiles.push(filename);
this.end = function() {
openedFiles.shift();
};
this.on = function() {};
}
}
}
}
)(require('../lib/layouts'));
for (var i=0; i < 5; i += 1) {
fileAppender({ filename: 'test' + i, maxLogSize: 100 });
}
openedFiles.should.not.be.empty;
exitListener();
});
it('should close all open files', function() {
openedFiles.should.be.empty;
});
});
describe('with default fileAppender settings', function() {
var fileContents
, testFile = path.join(__dirname, '/fa-default-test.log');
before(function(done) {
var logger = log4js.getLogger('default-settings');
remove(testFile, function() {
log4js.configure({
appenders: {
"file": { type: "file", filename: testFile }
},
categories: {
"default": { level: "debug", appenders: [ "file" ] }
}
});
logger.info("This should be in the file.");
setTimeout(function() {
fs.readFile(testFile, "utf8", function(err, contents) {
if (!err) {
fileContents = contents;
}
done(err);
});
}, 100);
});
});
after(function(done) {
remove(testFile, done);
});
it('should write log messages to the file', function() {
fileContents.should.include("This should be in the file.\n");
});
it('log messages should be in the basic layout format', function() {
fileContents.should.match(
/\[\d{4}-\d{2}-\d{2}\s\d{2}:\d{2}:\d{2}\.\d{3}\] \[INFO\] default-settings - /
);
});
});
describe('with a max file size and no backups', function() {
var testFile = path.join(__dirname, '/fa-maxFileSize-test.log');
before(function(done) {
var logger = log4js.getLogger('max-file-size');
async.forEach([
testFile,
testFile + '.1'
], remove, function() {
//log file of 100 bytes maximum, no backups
log4js.configure({
appenders: {
"file": { type: "file", filename: testFile, maxLogSize: 100, backups: 0 }
},
categories: {
"default": { level: "debug", appenders: [ "file" ] }
}
});
logger.info("This is the first log message.");
logger.info("This is an intermediate log message.");
logger.info("This is the second log message.");
done();
});
});
after(function(done) {
async.forEach([ testFile, testFile + '.1' ], remove, done);
});
describe('log file', function() {
it('should only contain the second message', function(done) {
//wait for the file system to catch up
setTimeout(function() {
fs.readFile(testFile, "utf8", function(err, fileContents) {
fileContents.should.include("This is the second log message.\n");
fileContents.should.not.include("This is the first log message.");
done(err);
});
}, 100);
});
});
describe('the number of files starting with the test file name', function() {
it('should be two', function(done) {
fs.readdir(__dirname, function(err, files) {
//there will always be one backup if you've specified a max log size
var logFiles = files.filter(
function(file) { return file.indexOf('fa-maxFileSize-test.log') > -1; }
);
logFiles.should.have.length(2);
done(err);
});
});
});
});
describe('with a max file size and 2 backups', function() {
var testFile = path.join(__dirname, '/fa-maxFileSize-with-backups-test.log');
before(function(done) {
var logger = log4js.getLogger('max-file-size-backups');
async.forEach([
testFile,
testFile+'.1',
testFile+'.2'
], remove, function() {
//log file of 50 bytes maximum, 2 backups
log4js.configure({
appenders: {
"file": { type: "file", filename: testFile, maxLogSize: 50, backups: 2 }
},
categories: {
"default": { level: "debug", appenders: [ "file" ] }
}
});
logger.info("This is the first log message.");
logger.info("This is the second log message.");
logger.info("This is the third log message.");
logger.info("This is the fourth log message.");
done();
});
});
describe('the log files', function() {
var logFiles;
before(function(done) {
setTimeout(function() {
fs.readdir(__dirname, function(err, files) {
if (files) {
logFiles = files.sort().filter(
function(file) {
return file.indexOf('fa-maxFileSize-with-backups-test.log') > -1;
}
);
done(null);
} else {
done(err);
}
});
}, 200);
});
after(function(done) {
async.forEach(logFiles, remove, done);
});
it('should be 3', function () {
logFiles.should.have.length(3);
});
it('should be named in sequence', function() {
logFiles.should.eql([
'fa-maxFileSize-with-backups-test.log',
'fa-maxFileSize-with-backups-test.log.1',
'fa-maxFileSize-with-backups-test.log.2'
]);
});
describe('and the contents of the first file', function() {
it('should be the last log message', function(done) {
fs.readFile(path.join(__dirname, logFiles[0]), "utf8", function(err, contents) {
contents.should.include('This is the fourth log message.');
done(err);
});
});
});
describe('and the contents of the second file', function() {
it('should be the third log message', function(done) {
fs.readFile(path.join(__dirname, logFiles[1]), "utf8", function(err, contents) {
contents.should.include('This is the third log message.');
done(err);
});
});
});
describe('and the contents of the third file', function() {
it('should be the second log message', function(done) {
fs.readFile(path.join(__dirname, logFiles[2]), "utf8", function(err, contents) {
contents.should.include('This is the second log message.');
done(err);
});
});
});
});
});
describe('when underlying stream errors', function() {
var consoleArgs;
before(function() {
var errorHandler
, fileAppender = sandbox.require(
'../lib/appenders/file',
{
globals: {
console: {
error: function() {
consoleArgs = Array.prototype.slice.call(arguments);
}
}
},
requires: {
'streamroller': {
RollingFileStream: function(filename) {
this.end = function() {};
this.on = function(evt, cb) {
if (evt === 'error') {
errorHandler = cb;
}
};
}
}
}
}
)(require('../lib/layouts'));
fileAppender({
filename: 'test1.log',
maxLogSize: 100
});
errorHandler({ error: 'aargh' });
});
it('should log the error to console.error', function() {
consoleArgs.should.not.be.empty;
consoleArgs[0].should.eql('log4js.fileAppender - Writing to file %s, error happened ');
consoleArgs[1].should.eql('test1.log');
consoleArgs[2].error.should.eql('aargh');
});
});
});
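
A rough usage sketch of the configuration shape these fileAppender tests exercise; the filename app.log, the category name "example", and the 50-byte roll size are illustrative placeholders, and the require path mirrors the test-relative layout:

var log4js = require('../lib/log4js');

// one rolling file appender: roll the file at 50 bytes, keep two backups (app.log.1, app.log.2)
log4js.configure({
  appenders: {
    "file": { type: "file", filename: "app.log", maxLogSize: 50, backups: 2 }
  },
  categories: {
    "default": { level: "debug", appenders: [ "file" ] }
  }
});

log4js.getLogger("example").info("this ends up in app.log");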

test/layouts-test.js Normal file

@@ -0,0 +1,350 @@
"use strict";
var assert = require('assert');
//used for patternLayout tests.
function test(layout, event, tokens, pattern, value) {
assert.equal(layout(pattern, tokens)(event), value);
}
describe('log4js layouts', function() {
describe('colouredLayout', function() {
var layout = require('../lib/layouts').colouredLayout;
it('should apply level colour codes to output', function() {
var output = layout({
data: ["nonsense"],
startTime: new Date(2010, 11, 5, 14, 18, 30, 45),
category: "cheese",
level: {
toString: function() { return "ERROR"; }
}
});
assert.equal(output, '\x1B[31m[2010-12-05 14:18:30.045] [ERROR] cheese - \x1B[39mnonsense');
});
it('should support the console.log format for the message', function() {
var output = layout({
data: ["thing %d", 2],
startTime: new Date(2010, 11, 5, 14, 18, 30, 45),
category: "cheese",
level: {
toString: function() { return "ERROR"; }
}
});
assert.equal(output, '\x1B[31m[2010-12-05 14:18:30.045] [ERROR] cheese - \x1B[39mthing 2');
});
});
describe('messagePassThroughLayout', function() {
var layout = require('../lib/layouts').messagePassThroughLayout;
it('should take a logevent and output only the message', function() {
assert.equal(layout({
data: ["nonsense"],
startTime: new Date(2010, 11, 5, 14, 18, 30, 45),
category: "cheese",
level: {
colour: "green",
toString: function() { return "ERROR"; }
}
}), "nonsense");
});
it('should support the console.log format for the message', function() {
assert.equal(layout({
data: ["thing %d", 1, "cheese"],
startTime: new Date(2010, 11, 5, 14, 18, 30, 45),
category: "cheese",
level : {
colour: "green",
toString: function() { return "ERROR"; }
}
}), "thing 1 cheese");
});
it('should output the first item even if it is not a string', function() {
assert.equal(layout({
data: [ { thing: 1} ],
startTime: new Date(2010, 11, 5, 14, 18, 30, 45),
category: "cheese",
level: {
colour: "green",
toString: function() { return "ERROR"; }
}
}), "{ thing: 1 }");
});
it('should print the stack of a passed error object', function() {
assert.ok(Array.isArray(
layout({
data: [ new Error() ],
startTime: new Date(2010, 11, 5, 14, 18, 30, 45),
category: "cheese",
level: {
colour: "green",
toString: function() { return "ERROR"; }
}
}).match(
/Error\s+at Context\..*\s+\((.*)test[\\\/]layouts-test\.js\:\d+\:\d+\)\s/
)
), 'regexp did not return a match');
});
describe('with passed augmented errors', function() {
var layoutOutput;
before(function() {
var e = new Error("My Unique Error Message");
e.augmented = "My Unique attribute value";
e.augObj = { at1: "at2" };
layoutOutput = layout({
data: [ e ],
startTime: new Date(2010, 11, 5, 14, 18, 30, 45),
category: "cheese",
level: {
colour: "green",
toString: function() { return "ERROR"; }
}
});
});
it('should print the contained error message', function() {
var m = layoutOutput.match(/\{ \[Error: My Unique Error Message\]/);
assert.ok(Array.isArray(m));
});
it('should print the augmented string attributes of the error', function() {
var m = layoutOutput.match(/augmented:\s'My Unique attribute value'/);
assert.ok(Array.isArray(m));
});
it('should print the augmented object attributes of the error', function() {
var m = layoutOutput.match(/augObj:\s\{ at1: 'at2' \}/);
assert.ok(Array.isArray(m));
});
});
});
describe('basicLayout', function() {
var layout = require('../lib/layouts').basicLayout
, event = {
data: ['this is a test'],
startTime: new Date(2010, 11, 5, 14, 18, 30, 45),
category: "tests",
level: {
toString: function() { return "DEBUG"; }
}
};
it('should take a logevent and output a formatted string', function() {
assert.equal(layout(event), "[2010-12-05 14:18:30.045] [DEBUG] tests - this is a test");
});
it('should output a stacktrace and message if the event has an error attached', function() {
var output
, lines
, error = new Error("Some made-up error")
, stack = error.stack.split(/\n/);
event.data = ['this is a test', error];
output = layout(event);
lines = output.split(/\n/);
assert.equal(lines.length - 1, stack.length);
assert.equal(
lines[0],
"[2010-12-05 14:18:30.045] [DEBUG] tests - this is a test [Error: Some made-up error]"
);
for (var i = 1; i < stack.length; i++) {
assert.equal(lines[i+2], stack[i+1]);
}
});
it('should output any extra data in the log event as util.inspect strings', function() {
var output, lines;
event.data = ['this is a test', {
name: 'Cheese',
message: 'Gorgonzola smells.'
}];
output = layout(event);
assert.equal(
output,
"[2010-12-05 14:18:30.045] [DEBUG] tests - this is a test " +
"{ name: 'Cheese', message: 'Gorgonzola smells.' }"
);
});
});
describe('patternLayout', function() {
var event = {
data: ['this is a test'],
startTime: new Date(2010, 11, 5, 14, 18, 30, 45),
category: "multiple.levels.of.tests",
level: {
toString: function() { return "DEBUG"; }
}
}
, layout = require('../lib/layouts').patternLayout
, tokens = {
testString: 'testStringToken',
testFunction: function() { return 'testFunctionToken'; },
fnThatUsesLogEvent: function(logEvent) { return logEvent.level.toString(); }
};
event.startTime.getTimezoneOffset = function() { return 0; };
it('should default to "time logLevel loggerName - message"', function() {
test(
layout,
event,
tokens,
null,
"14:18:30 DEBUG multiple.levels.of.tests - this is a test\n"
);
});
it('%r should output time only', function() {
test(layout, event, tokens, '%r', '14:18:30');
});
it('%p should output the log level', function() {
test(layout, event, tokens, '%p', 'DEBUG');
});
it('%c should output the log category', function() {
test(layout, event, tokens, '%c', 'multiple.levels.of.tests');
});
it('%m should output the log data', function() {
test(layout, event, tokens, '%m', 'this is a test');
});
it('%n should output a new line', function() {
test(layout, event, tokens, '%n', '\n');
});
it('%h should output hostname', function() {
test(layout, event, tokens, '%h', require('os').hostname().toString());
});
it('%c should handle category names like java-style package names', function() {
test(layout, event, tokens, '%c{1}', 'tests');
test(layout, event, tokens, '%c{2}', 'of.tests');
test(layout, event, tokens, '%c{3}', 'levels.of.tests');
test(layout, event, tokens, '%c{4}', 'multiple.levels.of.tests');
test(layout, event, tokens, '%c{5}', 'multiple.levels.of.tests');
test(layout, event, tokens, '%c{99}', 'multiple.levels.of.tests');
});
it('%d should output the date in ISO8601 format', function() {
test(layout, event, tokens, '%d', '2010-12-05 14:18:30.045');
});
it('%d should allow for format specification', function() {
test(layout, event, tokens, '%d{ISO8601_WITH_TZ_OFFSET}', '2010-12-05T14:18:30-0000');
test(layout, event, tokens, '%d{ISO8601}', '2010-12-05 14:18:30.045');
test(layout, event, tokens, '%d{ABSOLUTE}', '14:18:30.045');
test(layout, event, tokens, '%d{DATE}', '05 12 2010 14:18:30.045');
test(layout, event, tokens, '%d{yy MM dd hh mm ss}', '10 12 05 14 18 30');
test(layout, event, tokens, '%d{yyyy MM dd}', '2010 12 05');
test(layout, event, tokens, '%d{yyyy MM dd hh mm ss SSS}', '2010 12 05 14 18 30 045');
});
it('%% should output %', function() {
test(layout, event, tokens, '%%', '%');
});
it('should output anything not preceded by % as literal', function() {
test(layout, event, tokens, 'blah blah blah', 'blah blah blah');
});
it('should output the original string if no replacer matches the token', function() {
test(layout, event, tokens, '%a{3}', 'a{3}');
});
it('should handle complicated patterns', function() {
test(layout, event, tokens,
'%m%n %c{2} at %d{ABSOLUTE} cheese %p%n',
'this is a test\n of.tests at 14:18:30.045 cheese DEBUG\n'
);
});
it('should truncate fields if specified', function() {
test(layout, event, tokens, '%.4m', 'this');
test(layout, event, tokens, '%.7m', 'this is');
test(layout, event, tokens, '%.9m', 'this is a');
test(layout, event, tokens, '%.14m', 'this is a test');
test(layout, event, tokens, '%.2919102m', 'this is a test');
});
it('should pad fields if specified', function() {
test(layout, event, tokens, '%10p', ' DEBUG');
test(layout, event, tokens, '%8p', ' DEBUG');
test(layout, event, tokens, '%6p', ' DEBUG');
test(layout, event, tokens, '%4p', 'DEBUG');
test(layout, event, tokens, '%-4p', 'DEBUG');
test(layout, event, tokens, '%-6p', 'DEBUG ');
test(layout, event, tokens, '%-8p', 'DEBUG ');
test(layout, event, tokens, '%-10p', 'DEBUG ');
});
it('%[%r%] should output colored time', function() {
test(layout, event, tokens, '%[%r%]', '\x1B[36m14:18:30\x1B[39m');
});
describe('%x{}', function() {
it('%x{testString} should output the string stored in tokens', function() {
test(layout, event, tokens, '%x{testString}', 'testStringToken');
});
it('%x{testFunction} should output the result of the function stored in tokens', function() {
test(layout, event, tokens, '%x{testFunction}', 'testFunctionToken');
});
it('%x{doesNotExist} should output the original string if the token is not defined', function() {
test(layout, event, tokens, '%x{doesNotExist}', '%x{doesNotExist}');
});
it('%x{fnThatUsesLogEvent} should be able to use the logEvent', function() {
test(layout, event, tokens, '%x{fnThatUsesLogEvent}', 'DEBUG');
});
it('%x should output the original string when no token name is given', function() {
test(layout, event, tokens, '%x', '%x');
});
});
});
describe('layout makers', function() {
var layouts = require('../lib/layouts');
it('should have a maker for each layout', function() {
assert.ok(layouts.layout("messagePassThrough"));
assert.ok(layouts.layout("basic"));
assert.ok(layouts.layout("colored"));
assert.ok(layouts.layout("coloured"));
assert.ok(layouts.layout("pattern"));
});
it('should return falsy if a layout does not exist', function() {
assert.ok(!layouts.layout("cheese"));
});
it('should pass config to layouts that need it', function() {
var layout = layouts.layout(
"pattern",
{
pattern: "%m"
}
);
assert.equal(layout({ data: [ "blah" ] }), "blah");
});
});
});
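
A quick sketch of the patternLayout behaviour covered above: patternLayout(pattern, tokens) returns a function of a log event, and %x{name} looks the name up in the tokens object (string values are used directly, functions are called with the log event). The token name "who" and the hand-built sample event below are invented for illustration:

var layouts = require('../lib/layouts');

var layout = layouts.patternLayout('%d{ABSOLUTE} %-5p %c{2} - %m [%x{who}]%n', {
  // token functions receive the log event, as in the fnThatUsesLogEvent test above
  who: function(logEvent) { return logEvent.category; }
});

console.log(layout({
  data: ['something happened'],
  startTime: new Date(2010, 11, 5, 14, 18, 30, 45),
  category: 'multiple.levels.of.tests',
  level: { toString: function() { return 'WARN'; } }
}));
// -> something like: 14:18:30.045 WARN  of.tests - something happened [multiple.levels.of.tests]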


@@ -1,201 +0,0 @@
var vows = require('vows'),
assert = require('assert');
//used for patternLayout tests.
function test(args, pattern, value) {
var layout = args[0]
, event = args[1];
assert.equal(layout(pattern)(event), value);
}
vows.describe('log4js layouts').addBatch({
'colouredLayout': {
topic: function() {
return require('../lib/layouts').colouredLayout;
},
'should apply level colour codes to output': function(layout) {
var output = layout({
data: ["nonsense"],
startTime: new Date(2010, 11, 5, 14, 18, 30, 45),
categoryName: "cheese",
level: {
toString: function() { return "ERROR"; }
}
});
assert.equal(output, '\033[31m[2010-12-05 14:18:30.045] [ERROR] cheese - \033[39mnonsense');
},
'should support the console.log format for the message': function(layout) {
var output = layout({
data: ["thing %d", 2],
startTime: new Date(2010, 11, 5, 14, 18, 30, 45),
categoryName: "cheese",
level: {
toString: function() { return "ERROR"; }
}
});
assert.equal(output, '\033[31m[2010-12-05 14:18:30.045] [ERROR] cheese - \033[39mthing 2');
}
},
'messagePassThroughLayout': {
topic: function() {
return require('../lib/layouts').messagePassThroughLayout;
},
'should take a logevent and output only the message' : function(layout) {
assert.equal(layout({
data: ["nonsense"],
startTime: new Date(2010, 11, 5, 14, 18, 30, 45),
categoryName: "cheese",
level: {
colour: "green",
toString: function() { return "ERROR"; }
}
}), "nonsense");
},
'should support the console.log format for the message' : function(layout) {
assert.equal(layout({
data: ["thing %d", 1]
, startTime: new Date(2010, 11, 5, 14, 18, 30, 45)
, categoryName: "cheese"
, level : {
colour: "green"
, toString: function() { return "ERROR"; }
}
}), "thing 1");
}
},
'basicLayout': {
topic: function() {
var layout = require('../lib/layouts').basicLayout,
event = {
data: ['this is a test'],
startTime: new Date(2010, 11, 5, 14, 18, 30, 45),
categoryName: "tests",
level: {
toString: function() { return "DEBUG"; }
}
};
return [layout, event];
},
'should take a logevent and output a formatted string': function(args) {
var layout = args[0], event = args[1];
assert.equal(layout(event), "[2010-12-05 14:18:30.045] [DEBUG] tests - this is a test");
},
'should output a stacktrace, message if the event has an error attached': function(args) {
var layout = args[0], event = args[1], output, lines,
error = new Error("Some made-up error"),
stack = error.stack.split(/\n/);
event.data = ['this is a test', error];
output = layout(event);
lines = output.split(/\n/);
assert.length(lines, stack.length+1);
assert.equal(lines[0], "[2010-12-05 14:18:30.045] [DEBUG] tests - this is a test");
assert.equal(lines[1], "Error: Some made-up error");
for (var i = 1; i < stack.length; i++) {
assert.equal(lines[i+1], stack[i]);
}
},
'should output any extra data in the log event as util.inspect strings': function(args) {
var layout = args[0], event = args[1], output, lines;
event.data = ['this is a test', {
name: 'Cheese',
message: 'Gorgonzola smells.'
}];
output = layout(event);
lines = output.split(/\n/);
assert.length(lines, 2);
assert.equal(lines[0], "[2010-12-05 14:18:30.045] [DEBUG] tests - this is a test");
assert.equal(lines[1], "{ name: 'Cheese', message: 'Gorgonzola smells.' }");
}
},
'patternLayout': {
topic: function() {
var event = {
data: ['this is a test'],
startTime: new Date(2010, 11, 5, 14, 18, 30, 45),
categoryName: "multiple.levels.of.tests",
level: {
toString: function() { return "DEBUG"; }
}
}, layout = require('../lib/layouts').patternLayout;
return [layout, event];
},
'should default to "time logLevel loggerName - message"': function(args) {
test(args, null, "14:18:30 DEBUG multiple.levels.of.tests - this is a test\n");
},
'%r should output time only': function(args) {
test(args, '%r', '14:18:30');
},
'%p should output the log level': function(args) {
test(args, '%p', 'DEBUG');
},
'%c should output the log category': function(args) {
test(args, '%c', 'multiple.levels.of.tests');
},
'%m should output the log data': function(args) {
test(args, '%m', 'this is a test');
},
'%n should output a new line': function(args) {
test(args, '%n', '\n');
},
'%c should handle category names like java-style package names': function(args) {
test(args, '%c{1}', 'tests');
test(args, '%c{2}', 'of.tests');
test(args, '%c{3}', 'levels.of.tests');
test(args, '%c{4}', 'multiple.levels.of.tests');
test(args, '%c{5}', 'multiple.levels.of.tests');
test(args, '%c{99}', 'multiple.levels.of.tests');
},
'%d should output the date in ISO8601 format': function(args) {
test(args, '%d', '2010-12-05 14:18:30.045');
},
'%d should allow for format specification': function(args) {
test(args, '%d{ISO8601}', '2010-12-05 14:18:30.045');
test(args, '%d{ABSOLUTE}', '14:18:30.045');
test(args, '%d{DATE}', '05 12 2010 14:18:30.045');
test(args, '%d{yyyy MM dd}', '2010 12 05');
test(args, '%d{yyyy MM dd hh mm ss SSS}', '2010 12 05 14 18 30 045');
},
'%% should output %': function(args) {
test(args, '%%', '%');
},
'should output anything not preceded by % as literal': function(args) {
test(args, 'blah blah blah', 'blah blah blah');
},
'should handle complicated patterns': function(args) {
test(args,
'%m%n %c{2} at %d{ABSOLUTE} cheese %p%n',
'this is a test\n of.tests at 14:18:30.045 cheese DEBUG\n'
);
},
'should truncate fields if specified': function(args) {
test(args, '%.4m', 'this');
test(args, '%.7m', 'this is');
test(args, '%.9m', 'this is a');
test(args, '%.14m', 'this is a test');
test(args, '%.2919102m', 'this is a test');
},
'should pad fields if specified': function(args) {
test(args, '%10p', ' DEBUG');
test(args, '%8p', ' DEBUG');
test(args, '%6p', ' DEBUG');
test(args, '%4p', 'DEBUG');
test(args, '%-4p', 'DEBUG');
test(args, '%-6p', 'DEBUG ');
test(args, '%-8p', 'DEBUG ');
test(args, '%-10p', 'DEBUG ');
}
}
}).export(module);

test/levels-test.js Normal file

@@ -0,0 +1,427 @@
"use strict";
var assert = require('assert')
, should = require('should')
, levels = require('../lib/levels');
function assertThat(level) {
function assertForEach(val, test, otherLevels) {
otherLevels.forEach(function(other) {
test.call(level, other).should.eql(val);
});
}
return {
isLessThanOrEqualTo: function(levels) {
assertForEach(true, level.isLessThanOrEqualTo, levels);
},
isNotLessThanOrEqualTo: function(levels) {
assertForEach(false, level.isLessThanOrEqualTo, levels);
},
isGreaterThanOrEqualTo: function(levels) {
assertForEach(true, level.isGreaterThanOrEqualTo, levels);
},
isNotGreaterThanOrEqualTo: function(levels) {
assertForEach(false, level.isGreaterThanOrEqualTo, levels);
},
isEqualTo: function(levels) {
assertForEach(true, level.isEqualTo, levels);
},
isNotEqualTo: function(levels) {
assertForEach(false, level.isEqualTo, levels);
}
};
}
describe('../lib/levels', function() {
it('should define some levels', function() {
should.exist(levels.ALL);
should.exist(levels.TRACE);
should.exist(levels.DEBUG);
should.exist(levels.INFO);
should.exist(levels.WARN);
should.exist(levels.ERROR);
should.exist(levels.FATAL);
should.exist(levels.OFF);
});
describe('ALL', function() {
var all = levels.ALL;
it('should be less than the other levels', function() {
assertThat(all).isLessThanOrEqualTo(
[
levels.ALL,
levels.TRACE,
levels.DEBUG,
levels.INFO,
levels.WARN,
levels.ERROR,
levels.FATAL,
levels.OFF
]
);
});
it('should be greater than no levels', function() {
assertThat(all).isNotGreaterThanOrEqualTo(
[
levels.TRACE,
levels.DEBUG,
levels.INFO,
levels.WARN,
levels.ERROR,
levels.FATAL,
levels.OFF
]
);
});
it('should only be equal to ALL', function() {
assertThat(all).isEqualTo([levels.toLevel("ALL")]);
assertThat(all).isNotEqualTo(
[
levels.TRACE,
levels.DEBUG,
levels.INFO,
levels.WARN,
levels.ERROR,
levels.FATAL,
levels.OFF
]
);
});
});
describe('TRACE', function() {
var trace = levels.TRACE;
it('should be less than DEBUG', function() {
assertThat(trace).isLessThanOrEqualTo(
[
levels.DEBUG,
levels.INFO,
levels.WARN,
levels.ERROR,
levels.FATAL,
levels.OFF
]
);
assertThat(trace).isNotLessThanOrEqualTo([levels.ALL]);
});
it('should be greater than ALL', function() {
assertThat(trace).isGreaterThanOrEqualTo([levels.ALL, levels.TRACE]);
assertThat(trace).isNotGreaterThanOrEqualTo(
[
levels.DEBUG,
levels.INFO,
levels.WARN,
levels.ERROR,
levels.FATAL,
levels.OFF
]
);
});
it('should only be equal to TRACE', function() {
assertThat(trace).isEqualTo([levels.toLevel("TRACE")]);
assertThat(trace).isNotEqualTo(
[
levels.ALL,
levels.DEBUG,
levels.INFO,
levels.WARN,
levels.ERROR,
levels.FATAL,
levels.OFF
]
);
});
});
describe('DEBUG', function() {
var debug = levels.DEBUG;
it('should be less than INFO', function() {
assertThat(debug).isLessThanOrEqualTo(
[
levels.INFO,
levels.WARN,
levels.ERROR,
levels.FATAL,
levels.OFF
]
);
assertThat(debug).isNotLessThanOrEqualTo([levels.ALL, levels.TRACE]);
});
it('should be greater than TRACE', function() {
assertThat(debug).isGreaterThanOrEqualTo([levels.ALL, levels.TRACE]);
assertThat(debug).isNotGreaterThanOrEqualTo(
[
levels.INFO,
levels.WARN,
levels.ERROR,
levels.FATAL,
levels.OFF
]
);
});
it('should only be equal to DEBUG', function() {
assertThat(debug).isEqualTo([levels.toLevel("DEBUG")]);
assertThat(debug).isNotEqualTo(
[
levels.ALL,
levels.TRACE,
levels.INFO,
levels.WARN,
levels.ERROR,
levels.FATAL,
levels.OFF
]
);
});
});
describe('INFO', function() {
var info = levels.INFO;
it('should be less than WARN', function() {
assertThat(info).isLessThanOrEqualTo([
levels.WARN,
levels.ERROR,
levels.FATAL,
levels.OFF
]);
assertThat(info).isNotLessThanOrEqualTo([levels.ALL, levels.TRACE, levels.DEBUG]);
});
it('should be greater than DEBUG', function() {
assertThat(info).isGreaterThanOrEqualTo([levels.ALL, levels.TRACE, levels.DEBUG]);
assertThat(info).isNotGreaterThanOrEqualTo([
levels.WARN,
levels.ERROR,
levels.FATAL,
levels.OFF
]);
});
it('should only be equal to INFO', function() {
assertThat(info).isEqualTo([levels.toLevel("INFO")]);
assertThat(info).isNotEqualTo([
levels.ALL,
levels.TRACE,
levels.DEBUG,
levels.WARN,
levels.ERROR,
levels.FATAL,
levels.OFF
]);
});
});
describe('WARN', function() {
var warn = levels.WARN;
it('should be less than ERROR', function() {
assertThat(warn).isLessThanOrEqualTo([levels.ERROR, levels.FATAL, levels.OFF]);
assertThat(warn).isNotLessThanOrEqualTo([
levels.ALL,
levels.TRACE,
levels.DEBUG,
levels.INFO
]);
});
it('should be greater than INFO', function() {
assertThat(warn).isGreaterThanOrEqualTo([
levels.ALL,
levels.TRACE,
levels.DEBUG,
levels.INFO
]);
assertThat(warn).isNotGreaterThanOrEqualTo([levels.ERROR, levels.FATAL, levels.OFF]);
});
it('should only be equal to WARN', function() {
assertThat(warn).isEqualTo([levels.toLevel("WARN")]);
assertThat(warn).isNotEqualTo([
levels.ALL,
levels.TRACE,
levels.DEBUG,
levels.INFO,
levels.ERROR,
levels.FATAL,
levels.OFF
]);
});
});
describe('ERROR', function() {
var error = levels.ERROR;
it('should be less than FATAL', function() {
assertThat(error).isLessThanOrEqualTo([levels.FATAL, levels.OFF]);
assertThat(error).isNotLessThanOrEqualTo([
levels.ALL,
levels.TRACE,
levels.DEBUG,
levels.INFO,
levels.WARN
]);
});
it('should be greater than WARN', function() {
assertThat(error).isGreaterThanOrEqualTo([
levels.ALL,
levels.TRACE,
levels.DEBUG,
levels.INFO,
levels.WARN
]);
assertThat(error).isNotGreaterThanOrEqualTo([levels.FATAL, levels.OFF]);
});
it('should only be equal to ERROR', function() {
assertThat(error).isEqualTo([levels.toLevel("ERROR")]);
assertThat(error).isNotEqualTo([
levels.ALL,
levels.TRACE,
levels.DEBUG,
levels.INFO,
levels.WARN,
levels.FATAL,
levels.OFF
]);
});
});
describe('FATAL', function() {
var fatal = levels.FATAL;
it('should be less than OFF', function() {
assertThat(fatal).isLessThanOrEqualTo([levels.OFF]);
assertThat(fatal).isNotLessThanOrEqualTo([
levels.ALL,
levels.TRACE,
levels.DEBUG,
levels.INFO,
levels.WARN,
levels.ERROR
]);
});
it('should be greater than ERROR', function() {
assertThat(fatal).isGreaterThanOrEqualTo([
levels.ALL,
levels.TRACE,
levels.DEBUG,
levels.INFO,
levels.WARN,
levels.ERROR
]);
assertThat(fatal).isNotGreaterThanOrEqualTo([levels.OFF]);
});
it('should only be equal to FATAL', function() {
assertThat(fatal).isEqualTo([levels.toLevel("FATAL")]);
assertThat(fatal).isNotEqualTo([
levels.ALL,
levels.TRACE,
levels.DEBUG,
levels.INFO,
levels.WARN,
levels.ERROR,
levels.OFF
]);
});
});
describe('OFF', function() {
var off = levels.OFF;
it('should not be less than anything', function() {
assertThat(off).isNotLessThanOrEqualTo([
levels.ALL,
levels.TRACE,
levels.DEBUG,
levels.INFO,
levels.WARN,
levels.ERROR,
levels.FATAL
]);
});
it('should be greater than everything', function() {
assertThat(off).isGreaterThanOrEqualTo([
levels.ALL,
levels.TRACE,
levels.DEBUG,
levels.INFO,
levels.WARN,
levels.ERROR,
levels.FATAL
]);
});
it('should only be equal to OFF', function() {
assertThat(off).isEqualTo([levels.toLevel("OFF")]);
assertThat(off).isNotEqualTo([
levels.ALL,
levels.TRACE,
levels.DEBUG,
levels.INFO,
levels.WARN,
levels.ERROR,
levels.FATAL
]);
});
});
describe('isGreaterThanOrEqualTo', function() {
var info = levels.INFO;
it('should handle string arguments', function() {
assertThat(info).isGreaterThanOrEqualTo(["all", "trace", "debug"]);
assertThat(info).isNotGreaterThanOrEqualTo(['warn', 'ERROR', 'Fatal', 'off']);
});
});
describe('isLessThanOrEqualTo', function() {
var info = levels.INFO;
it('should handle string arguments', function() {
assertThat(info).isNotLessThanOrEqualTo(["all", "trace", "debug"]);
assertThat(info).isLessThanOrEqualTo(['warn', 'ERROR', 'Fatal', 'off']);
});
});
describe('isEqualTo', function() {
var info = levels.INFO;
it('should handle string arguments', function() {
assertThat(info).isEqualTo(["info", "INFO", "iNfO"]);
});
});
describe('toLevel', function() {
it('should ignore the case of arguments', function() {
levels.toLevel("debug").should.eql(levels.DEBUG);
levels.toLevel("DEBUG").should.eql(levels.DEBUG);
levels.toLevel("DeBuG").should.eql(levels.DEBUG);
});
it('should return undefined when argument is not recognised', function() {
should.not.exist(levels.toLevel("cheese"));
});
it('should return the default value if argument is not recognised', function() {
levels.toLevel("cheese", levels.DEBUG).should.eql(levels.DEBUG);
});
it('should return the default value if argument is falsy', function() {
levels.toLevel(undefined, levels.DEBUG).should.eql(levels.DEBUG);
levels.toLevel(null, levels.DEBUG).should.eql(levels.DEBUG);
});
});
});
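
In outline, the levels API exercised above is a small comparison helper. A sketch (the LOG_LEVEL environment variable and the WARN fallback are arbitrary examples, not part of the library):

var levels = require('../lib/levels');

// toLevel falls back to its second argument when the first is falsy or unrecognised
var threshold = levels.toLevel(process.env.LOG_LEVEL, levels.WARN);

// comparisons accept level objects or strings, case-insensitively
console.log(levels.ERROR.isGreaterThanOrEqualTo(threshold));  // true with the WARN fallback
console.log(levels.INFO.isLessThanOrEqualTo("error"));        // true
console.log(levels.toLevel("DeBuG").isEqualTo(levels.DEBUG)); // true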

test/log4js-test.js Normal file

@@ -0,0 +1,423 @@
"use strict";
var should = require('should')
, fs = require('fs')
, sandbox = require('sandboxed-module')
, log4js = require('../lib/log4js');
describe('../lib/log4js', function() {
describe('#getLogger', function() {
it('should return a Logger', function() {
log4js.getLogger().should.have.property('debug').be.a('function');
log4js.getLogger().should.have.property('info').be.a('function');
log4js.getLogger().should.have.property('error').be.a('function');
});
});
describe('#configure', function() {
it('should require an object or a filename', function() {
[
undefined,
null,
true,
42,
function() {}
].forEach(function(arg) {
(function() { log4js.configure(arg); }).should.throw(
"You must specify configuration as an object or a filename."
);
});
});
it('should complain if the file cannot be found', function() {
(function() { log4js.configure("pants"); }).should.throw(
"ENOENT, no such file or directory 'pants'"
);
});
it('should pick up the configuration filename from env.LOG4JS_CONFIG', function() {
process.env.LOG4JS_CONFIG = 'made-up-file';
(function() { log4js.configure(); }).should.throw(
"ENOENT, no such file or directory 'made-up-file'"
);
delete process.env.LOG4JS_CONFIG;
});
it('should complain if the config does not specify any appenders', function() {
(function() { log4js.configure({}); }).should.throw(
"You must specify at least one appender."
);
(function() { log4js.configure({ appenders: {} }); }).should.throw(
"You must specify at least one appender."
);
});
it(
'should complain if the config does not specify an appender for the default category',
function() {
(function() {
log4js.configure(
{
appenders: {
"console": { type: "console" }
},
categories: {}
}
);
}).should.throw(
"You must specify an appender for the default category"
);
(function() {
log4js.configure({
appenders: {
"console": { type: "console" }
},
categories: {
"cheese": { level: "DEBUG", appenders: [ "console" ] }
}
});
}).should.throw(
"You must specify an appender for the default category"
);
}
);
it('should complain if a category does not specify level or appenders', function() {
(function() {
log4js.configure(
{
appenders: { "console": { type: "console" } },
categories: {
"default": { thing: "thing" }
}
}
);
}).should.throw(
"You must specify a level for category 'default'."
);
(function() {
log4js.configure(
{
appenders: { "console": { type: "console" } },
categories: {
"default": { level: "DEBUG" }
}
}
);
}).should.throw(
"You must specify an appender for category 'default'."
);
});
it('should complain if a category specifies a level that does not exist', function() {
(function() {
log4js.configure(
{
appenders: { "console": { type: "console" }},
categories: {
"default": { level: "PICKLES" }
}
}
);
}).should.throw(
"Level 'PICKLES' is not valid for category 'default'. " +
"Acceptable values are: OFF, TRACE, DEBUG, INFO, WARN, ERROR, FATAL."
);
});
it('should complain if a category specifies an appender that does not exist', function() {
(function() {
log4js.configure(
{
appenders: { "console": { type: "console" }},
categories: {
"default": { level: "DEBUG", appenders: [ "cheese" ] }
}
}
);
}).should.throw(
"Appender 'cheese' for category 'default' does not exist. Known appenders are: console."
);
});
before(function(done) {
fs.unlink("test.log", function (err) { done(); });
});
it('should set up the included appenders', function(done) {
log4js.configure({
appenders: {
"file": { type: "file", filename: "test.log" }
},
categories: {
default: { level: "DEBUG", appenders: [ "file" ] }
}
});
log4js.getLogger('test').debug("cheese");
setTimeout(function() {
fs.readFile("test.log", "utf-8", function(err, contents) {
contents.should.include("cheese");
done(err);
});
}, 50);
});
after(function(done) {
fs.unlink("test.log", function (err) { done(); });
});
it('should set up third-party appenders', function() {
var events = [], log4js_sandbox = sandbox.require(
'../lib/log4js',
{
requires: {
'cheese': function() {
return function() {
return function(evt) { events.push(evt); };
};
}
}
}
);
log4js_sandbox.configure({
appenders: {
"thing": { type: "cheese" }
},
categories: {
default: { level: "DEBUG", appenders: [ "thing" ] }
}
});
log4js_sandbox.getLogger().info("edam");
events.should.have.length(1);
events[0].data[0].should.eql("edam");
});
it('should only load third-party appenders once', function() {
var moduleCalled = 0
, log4js_sandbox = sandbox.require(
'../lib/log4js',
{
requires: {
'cheese': function() {
moduleCalled += 1;
return function() {
return function() {};
};
}
}
}
);
log4js_sandbox.configure({
appenders: {
"thing1": { type: "cheese" },
"thing2": { type: "cheese" }
},
categories: {
default: { level: "DEBUG", appenders: [ "thing1", "thing2" ] }
}
});
moduleCalled.should.eql(1);
});
it('should pass layouts and levels to appender modules', function() {
var layouts
, levels
, log4js_sandbox = sandbox.require(
'../lib/log4js',
{
requires: {
'cheese': function(arg1, arg2) {
layouts = arg1;
levels = arg2;
return function() {
return function() {};
};
}
}
}
);
log4js_sandbox.configure({
appenders: {
"thing": { type: "cheese" }
},
categories: {
"default": { level: "debug", appenders: [ "thing" ] }
}
});
layouts.should.have.property("basicLayout");
levels.should.have.property("toLevel");
});
it('should pass config and appenderByName to appender makers', function() {
var otherAppender = function() { /* I do nothing */ }
, config
, other
, log4js_sandbox = sandbox.require(
'../lib/log4js',
{
requires: {
'other': function() {
return function() {
return otherAppender;
};
},
'cheese': function() {
return function(arg1, arg2) {
config = arg1;
other = arg2("other");
return function() {};
};
}
}
}
);
log4js_sandbox.configure({
appenders: {
"other": { type: "other" },
"thing": { type: "cheese", something: "something" }
},
categories: {
default: { level: "debug", appenders: [ "other", "thing" ] }
}
});
other.should.equal(otherAppender);
config.should.have.property("something", "something");
});
it('should complain about unknown appenders', function() {
(function() {
log4js.configure({
appenders: {
"thing": { type: "madeupappender" }
},
categories: {
default: { level: "DEBUG", appenders: [ "thing" ] }
}
});
}).should.throw(
"Could not load appender of type 'madeupappender'."
);
});
it('should read config from a file', function() {
var events = [], log4js_sandbox = sandbox.require(
'../lib/log4js',
{
requires: {
'cheese': function() {
return function() {
return function(event) { events.push(event); };
};
}
}
}
);
log4js_sandbox.configure(__dirname + "/with-cheese.json");
log4js_sandbox.getLogger().debug("gouda");
events.should.have.length(1);
events[0].data[0].should.eql("gouda");
});
it('should set up log levels for categories', function() {
var events = []
, noisyLogger
, log4js_sandbox = sandbox.require(
'../lib/log4js',
{
requires: {
'cheese': function() {
return function() {
return function(event) { events.push(event); };
};
}
}
}
);
log4js_sandbox.configure(__dirname + "/with-cheese.json");
noisyLogger = log4js_sandbox.getLogger("noisy");
noisyLogger.debug("pow");
noisyLogger.info("crash");
noisyLogger.warn("bang");
noisyLogger.error("boom");
noisyLogger.fatal("aargh");
events.should.have.length(2);
events[0].data[0].should.eql("boom");
events[1].data[0].should.eql("aargh");
});
it('should have a default log level for all categories', function() {
var events = []
, log4js_sandbox = sandbox.require(
'../lib/log4js',
{
requires: {
'cheese': function() {
return function() {
return function(event) { events.push(event); };
};
}
}
}
);
//with-cheese.json only specifies categories noisy and default
//unspecified categories should use the default category config
log4js_sandbox.configure(__dirname + "/with-cheese.json");
log4js_sandbox.getLogger("surprise").trace("not seen");
log4js_sandbox.getLogger("surprise").info("should be seen");
events.should.have.length(1);
events[0].data[0].should.eql("should be seen");
});
});
describe('with no configuration', function() {
var events = []
, log4js_sandboxed = sandbox.require(
'../lib/log4js',
{
requires: {
'./appenders/console': function() {
return function() {
return function(event) { events.push(event); };
};
}
}
}
);
log4js_sandboxed.getLogger("blah").debug("goes to console");
log4js_sandboxed.getLogger("yawn").trace("does not go to console");
log4js_sandboxed.getLogger().error("also goes to console");
it('should log events of debug level and higher to console', function() {
events.should.have.length(2);
events[0].data[0].should.eql("goes to console");
events[0].category.should.eql("blah");
events[0].level.toString().should.eql("DEBUG");
events[1].data[0].should.eql("also goes to console");
events[1].category.should.eql("default");
events[1].level.toString().should.eql("ERROR");
});
});
});
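
Reading the third-party appender tests above together, an appender module compatible with this loader would look roughly like the sketch below: the module is resolved by its type name (e.g. "cheese" in the tests), called once with layouts and levels, and returns a maker that receives the appender's config plus a lookup for other appenders. The threshold option is an invented example, not a real config key:

// hypothetical module resolved via { type: "cheese" } in the appenders config
module.exports = function(layouts, levels) {
  // the maker: called for each configured appender of this type
  return function(config, appenderByName) {
    var threshold = levels.toLevel(config.threshold, levels.ALL);
    return function(logEvent) {
      if (logEvent.level.isGreaterThanOrEqualTo(threshold)) {
        console.log(layouts.basicLayout(logEvent));
      }
    };
  };
};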

test/logLevelFilter-test.js Normal file

@@ -0,0 +1,141 @@
"use strict";
var should = require('should')
, sandbox = require('sandboxed-module')
, log4js = require('../lib/log4js');
describe('log level filter', function() {
describe('when configured correctly', function() {
var events = [], logger;
before(function() {
var log4js_sandboxed = sandbox.require(
'../lib/log4js',
{ requires:
{ './appenders/console': function() {
return function() {
return function(evt) { events.push(evt); };
};
}
}
}
);
log4js_sandboxed.configure({
appenders: {
"console": { type: "console", layout: { type: "messagePassThrough" } },
"errors only": {
type: "logLevelFilter",
allow: [ "ERROR", "FATAL" ],
appender: "console"
}
},
categories: {
default: { level: "DEBUG", appenders: [ "errors only" ] }
}
});
logger = log4js_sandboxed.getLogger("test");
});
it('should pass events to an appender if they match', function() {
logger.error("oh no");
logger.fatal("boom");
events.should.have.length(2);
events[0].data[0].should.eql("oh no");
events[1].data[0].should.eql("boom");
});
it('should not pass events to the appender if they do not match', function() {
events.should.have.length(2);
logger.debug("cheese");
events.should.have.length(2);
logger.info("yawn");
events.should.have.length(2);
});
});
it('should complain if it has no appender', function() {
(function() {
log4js.configure({
appenders: {
"errors": {
type: "logLevelFilter",
allow: [ "ERROR", "FATAL" ]
}
},
categories: {
default: { level: "DEBUG", appenders: [ "errors" ] }
}
});
}).should.throw(/Missing an appender\./);
});
it('should complain if it has no list of allowed levels', function() {
(function() {
log4js.configure({
appenders: {
"console": { type: "console" },
"errors": {
type: "logLevelFilter",
appender: "console"
}
},
categories: {
default: { level: "DEBUG", appenders: [ "errors" ] }
}
});
}).should.throw(/No allowed log levels specified\./);
});
it('should complain if the referenced appender does not exist', function() {
(function() {
log4js.configure({
appenders: {
"errors": {
type: "logLevelFilter",
allow: [ "ERROR" ],
appender: "console"
}
},
categories: {
default: { level: "DEBUG", appenders: [ "errors" ] }
}
});
}).should.throw(/Appender 'console' not found\./);
});
it('should complain if the list of levels is not valid', function() {
(function() {
log4js.configure({
appenders: {
"errors": {
type: "logLevelFilter",
allow: [ "cheese", "biscuits", "ERROR" ],
appender: "console"
}
},
categories: {
default: { level: "DEBUG", appenders: [ "errors" ] }
}
});
}).should.throw(/Unrecognised log level 'cheese'\./);
});
it('should complain if the list of levels is empty', function() {
(function() {
log4js.configure({
appenders: {
"console": { type: "console" },
"errors": {
type: "logLevelFilter",
allow: [],
appender: "console"
}
},
categories: {
default: { level: "debug", appenders: [ "errors" ] }
}
});
}).should.throw(/No allowed log levels specified\./);
});
});
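
For reference, the logLevelFilter wiring these tests describe looks roughly like this in a real configuration (the require path mirrors the test layout; the appender names are the ones used above):

var log4js = require('../lib/log4js');

log4js.configure({
  appenders: {
    "console": { type: "console", layout: { type: "messagePassThrough" } },
    // only ERROR and FATAL events are forwarded to the wrapped "console" appender
    "errors only": { type: "logLevelFilter", allow: [ "ERROR", "FATAL" ], appender: "console" }
  },
  categories: {
    default: { level: "DEBUG", appenders: [ "errors only" ] }
  }
});

log4js.getLogger("test").error("oh no");  // reaches the console
log4js.getLogger("test").debug("cheese"); // filtered out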

test/logger-test.js Normal file

@@ -0,0 +1,53 @@
"use strict";
var should = require('should')
, levels = require('../lib/levels')
, Logger = require('../lib/logger');
describe('../lib/logger', function() {
describe('Logger constructor', function() {
it('must be passed a dispatch delegate and a category', function() {
(function() { new Logger(); }).should.throw(
"Logger must have a dispatch delegate."
);
(function() { new Logger(function() {}); }).should.throw(
"Logger must have a category."
);
});
});
describe('Logger instance', function() {
var event
, logger = new Logger(
function(evt) { event = evt; },
"exciting category"
);
beforeEach(function() {
event = null;
});
it('should be immutable', function() {
logger.category = "rubbish";
logger.debug("thing");
event.category.should.equal("exciting category");
});
['trace', 'debug', 'info', 'warn', 'error', 'fatal'].forEach(function(level) {
it('should have a ' + level + ' function', function() {
logger[level].should.be.a('function');
});
});
it('should send log events to the dispatch delegate', function() {
logger.debug("interesting thing");
event.should.have.property('category').equal('exciting category');
event.should.have.property('level').equal(levels.DEBUG);
event.should.have.property('data').eql(["interesting thing"]);
event.should.have.property('startTime');
});
});
});
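
The Logger contract exercised above is small: a dispatch delegate plus a category, with every log call producing an event carrying category, level, data and startTime. A sketch (the WARN guard inside the delegate is illustrative):

var Logger = require('../lib/logger')
  , levels = require('../lib/levels');

var logger = new Logger(function(event) {
  // every log call arrives here as an event object
  if (event.level.isGreaterThanOrEqualTo(levels.WARN)) {
    console.error(event.category + ': ' + event.data.join(' '));
  }
}, "exciting category");

logger.debug("dropped by the delegate above");
logger.error("this one is printed");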


@@ -1,558 +0,0 @@
var vows = require('vows')
, assert = require('assert')
, sandbox = require('sandboxed-module');
vows.describe('log4js').addBatch({
'getLogger': {
topic: function() {
var log4js = require('../lib/log4js');
log4js.clearAppenders();
var logger = log4js.getLogger('tests');
logger.setLevel("DEBUG");
return logger;
},
'should take a category and return a logger': function(logger) {
assert.equal(logger.category, 'tests');
assert.equal(logger.level.toString(), "DEBUG");
assert.isFunction(logger.debug);
assert.isFunction(logger.info);
assert.isFunction(logger.warn);
assert.isFunction(logger.error);
assert.isFunction(logger.fatal);
},
'log events' : {
topic: function(logger) {
var events = [];
logger.addListener("log", function (logEvent) { events.push(logEvent); });
logger.debug("Debug event");
logger.trace("Trace event 1");
logger.trace("Trace event 2");
logger.warn("Warning event");
logger.error("Aargh!", new Error("Pants are on fire!"));
logger.error("Simulated CouchDB problem", { err: 127, cause: "incendiary underwear" });
return events;
},
'should emit log events': function(events) {
assert.equal(events[0].level.toString(), 'DEBUG');
assert.equal(events[0].data[0], 'Debug event');
assert.instanceOf(events[0].startTime, Date);
},
'should not emit events of a lower level': function(events) {
assert.length(events, 4);
assert.equal(events[1].level.toString(), 'WARN');
},
'should include the error if passed in': function (events) {
assert.instanceOf(events[2].data[1], Error);
assert.equal(events[2].data[1].message, 'Pants are on fire!');
}
},
},
'fileAppender': {
topic: function() {
var appender
, logmessages = []
, thing = "thing"
, fakeFS = {
createWriteStream: function() {
assert.equal(arguments[0], './tmp-tests.log');
assert.isObject(arguments[1]);
assert.equal(arguments[1].flags, 'a');
assert.equal(arguments[1].mode, 0644);
assert.equal(arguments[1].encoding, 'utf8');
return {
write: function(message) {
logmessages.push(message);
}
, end: function() {}
, destroySoon: function() {}
};
},
watchFile: function() {
throw new Error("watchFile should not be called if logSize is not defined");
}
},
log4js = sandbox.require(
'../lib/log4js',
{
requires: {
'fs': fakeFS
}
}
);
log4js.clearAppenders();
appender = log4js.fileAppender('./tmp-tests.log', log4js.layouts.messagePassThroughLayout);
log4js.addAppender(appender, 'file-test');
var logger = log4js.getLogger('file-test');
logger.debug("this is a test");
return logmessages;
},
'should write log messages to file': function(logmessages) {
assert.length(logmessages, 1);
assert.equal(logmessages, "this is a test\n");
}
},
'fileAppender - with rolling based on size and number of files to keep': {
topic: function() {
var watchCb,
filesOpened = [],
filesEnded = [],
filesDestroyedSoon = [],
filesRenamed = [],
newFilenames = [],
existingFiles = ['tests.log'],
log4js = sandbox.require(
'../lib/log4js'
, {
requires: {
'fs': {
watchFile: function(file, options, callback) {
assert.equal(file, 'tests.log');
assert.equal(options.persistent, false);
assert.equal(options.interval, 30000);
assert.isFunction(callback);
watchCb = callback;
},
createWriteStream: function(file) {
assert.equal(file, 'tests.log');
filesOpened.push(file);
return {
end: function() {
filesEnded.push(file);
},
destroySoon: function() {
filesDestroyedSoon.push(file);
}
};
},
statSync: function(file) {
if (existingFiles.indexOf(file) < 0) {
throw new Error("this file doesn't exist");
} else {
return true;
}
},
renameSync: function(oldFile, newFile) {
filesRenamed.push(oldFile);
existingFiles.push(newFile);
}
}
}
}
);
var appender = log4js.fileAppender('tests.log', log4js.messagePassThroughLayout, 1024, 2, 30);
return [watchCb, filesOpened, filesEnded, filesDestroyedSoon, filesRenamed, existingFiles];
},
'should close current log file, rename all old ones, open new one on rollover': function(args) {
var watchCb = args[0]
, filesOpened = args[1]
, filesEnded = args[2]
, filesDestroyedSoon = args[3]
, filesRenamed = args[4]
, existingFiles = args[5];
assert.isFunction(watchCb);
//tell the watchCb that the file is below the threshold
watchCb({ size: 891 }, { size: 0 });
//filesOpened should still be the first one.
assert.length(filesOpened, 1);
//tell the watchCb that the file is now over the threshold
watchCb({ size: 1053 }, { size: 891 });
//it should have closed the first log file.
assert.length(filesEnded, 1);
assert.length(filesDestroyedSoon, 1);
//it should have renamed the previous log file
assert.length(filesRenamed, 1);
//and we should have two files now
assert.length(existingFiles, 2);
assert.deepEqual(existingFiles, ['tests.log', 'tests.log.1']);
//and opened a new log file.
assert.length(filesOpened, 2);
//now tell the watchCb that we've flipped over the threshold again
watchCb({ size: 1025 }, { size: 123 });
//it should have closed the old file
assert.length(filesEnded, 2);
assert.length(filesDestroyedSoon, 2);
//it should have renamed both the old log file, and the previous '.1' file
assert.length(filesRenamed, 3);
assert.deepEqual(filesRenamed, ['tests.log', 'tests.log.1', 'tests.log' ]);
//it should have renamed 2 more file
assert.length(existingFiles, 4);
assert.deepEqual(existingFiles, ['tests.log', 'tests.log.1', 'tests.log.2', 'tests.log.1']);
//and opened a new log file
assert.length(filesOpened, 3);
//tell the watchCb we've flipped again.
watchCb({ size: 1024 }, { size: 234 });
//close the old one again.
assert.length(filesEnded, 3);
assert.length(filesDestroyedSoon, 3);
//it should have renamed the old log file and the 2 backups, with the last one being overwritten.
assert.length(filesRenamed, 5);
assert.deepEqual(filesRenamed, ['tests.log', 'tests.log.1', 'tests.log', 'tests.log.1', 'tests.log' ]);
//it should have renamed 2 more files
assert.length(existingFiles, 6);
assert.deepEqual(existingFiles, ['tests.log', 'tests.log.1', 'tests.log.2', 'tests.log.1', 'tests.log.2', 'tests.log.1']);
//and opened a new log file
assert.length(filesOpened, 4);
}
},
'configure' : {
topic: function() {
var messages = {}, fakeFS = {
createWriteStream: function(file) {
return {
write: function(message) {
if (!messages.hasOwnProperty(file)) {
messages[file] = [];
}
messages[file].push(message);
}
, end: function() {}
, destroySoon: function() {}
};
},
readFileSync: function(file, encoding) {
return require('fs').readFileSync(file, encoding);
},
watchFile: function(file) {
messages.watchedFile = file;
}
},
log4js = sandbox.require(
'../lib/log4js'
, {
requires: {
'fs': fakeFS
}
}
);
return [ log4js, messages ];
},
'should load appender configuration from a json file': function(args) {
var log4js = args[0], messages = args[1];
delete messages['tmp-tests.log'];
log4js.clearAppenders();
//this config file defines one file appender (to ./tmp-tests.log)
//and sets the log level for "tests" to WARN
log4js.configure('test/log4js.json');
var logger = log4js.getLogger("tests");
logger.info('this should not be written to the file');
logger.warn('this should be written to the file');
assert.length(messages['tmp-tests.log'], 1);
assert.equal(messages['tmp-tests.log'][0], 'this should be written to the file\n');
},
'should handle logLevelFilter configuration': function(args) {
var log4js = args[0], messages = args[1];
delete messages['tmp-tests.log'];
delete messages['tmp-tests-warnings.log'];
log4js.clearAppenders();
log4js.configure('test/with-logLevelFilter.json');
var logger = log4js.getLogger("tests");
logger.info('main');
logger.error('both');
logger.warn('both');
logger.debug('main');
assert.length(messages['tmp-tests.log'], 4);
assert.length(messages['tmp-tests-warnings.log'], 2);
assert.deepEqual(messages['tmp-tests.log'], ['main\n','both\n','both\n','main\n']);
assert.deepEqual(messages['tmp-tests-warnings.log'], ['both\n','both\n']);
},
'should handle fileAppender with log rolling' : function(args) {
var log4js = args[0], messages = args[1];
delete messages['tmp-test.log'];
log4js.configure('test/with-log-rolling.json');
assert.equal(messages.watchedFile, 'tmp-test.log');
},
'should handle an object or a file name': function(args) {
var log4js = args[0],
messages = args[1],
config = {
"appenders": [
{
"type" : "file",
"filename" : "cheesy-wotsits.log",
"maxLogSize" : 1024,
"backups" : 3,
"pollInterval" : 15
}
]
};
delete messages['cheesy-wotsits.log'];
log4js.configure(config);
assert.equal(messages.watchedFile, 'cheesy-wotsits.log');
}
},
'with no appenders defined' : {
topic: function() {
var logger
, message
, log4js = sandbox.require(
'../lib/log4js'
, {
globals: {
console: {
log: function(msg) {
message = msg;
}
}
}
}
);
logger = log4js.getLogger("some-logger");
logger.debug("This is a test");
return message;
},
'should default to the console appender': function(message) {
assert.isTrue(/This is a test$/.test(message));
}
},
'addAppender' : {
topic: function() {
var log4js = require('../lib/log4js');
log4js.clearAppenders();
return log4js;
},
'without a category': {
'should register the function as a listener for all loggers': function (log4js) {
var appenderEvent, appender = function(evt) { appenderEvent = evt; }, logger = log4js.getLogger("tests");
log4js.addAppender(appender);
logger.debug("This is a test");
assert.equal(appenderEvent.data[0], "This is a test");
assert.equal(appenderEvent.categoryName, "tests");
assert.equal(appenderEvent.level.toString(), "DEBUG");
},
'should also register as an appender for loggers if an appender for that category is defined': function (log4js) {
var otherEvent, appenderEvent, cheeseLogger;
log4js.addAppender(function (evt) { appenderEvent = evt; });
log4js.addAppender(function (evt) { otherEvent = evt; }, 'cheese');
cheeseLogger = log4js.getLogger('cheese');
cheeseLogger.debug('This is a test');
assert.deepEqual(appenderEvent, otherEvent);
assert.equal(otherEvent.data[0], 'This is a test');
assert.equal(otherEvent.categoryName, 'cheese');
otherEvent = undefined;
appenderEvent = undefined;
log4js.getLogger('pants').debug("this should not be propagated to otherEvent");
assert.isUndefined(otherEvent);
assert.equal(appenderEvent.data[0], "this should not be propagated to otherEvent");
}
},
'with a category': {
'should only register the function as a listener for that category': function(log4js) {
var appenderEvent, appender = function(evt) { appenderEvent = evt; }, logger = log4js.getLogger("tests");
log4js.addAppender(appender, 'tests');
logger.debug('this is a category test');
assert.equal(appenderEvent.data[0], 'this is a category test');
appenderEvent = undefined;
log4js.getLogger('some other category').debug('Cheese');
assert.isUndefined(appenderEvent);
}
},
'with multiple categories': {
'should register the function as a listener for all the categories': function(log4js) {
var appenderEvent, appender = function(evt) { appenderEvent = evt; }, logger = log4js.getLogger('tests');
log4js.addAppender(appender, 'tests', 'biscuits');
logger.debug('this is a test');
assert.equal(appenderEvent.data[0], 'this is a test');
appenderEvent = undefined;
var otherLogger = log4js.getLogger('biscuits');
otherLogger.debug("mmm... garibaldis");
assert.equal(appenderEvent.data[0], "mmm... garibaldis");
appenderEvent = undefined;
log4js.getLogger("something else").debug("pants");
assert.isUndefined(appenderEvent);
},
'should register the function when the list of categories is an array': function(log4js) {
var appenderEvent, appender = function(evt) { appenderEvent = evt; };
log4js.addAppender(appender, ['tests', 'pants']);
log4js.getLogger('tests').debug('this is a test');
assert.equal(appenderEvent.data[0], 'this is a test');
appenderEvent = undefined;
log4js.getLogger('pants').debug("big pants");
assert.equal(appenderEvent.data[0], "big pants");
appenderEvent = undefined;
log4js.getLogger("something else").debug("pants");
assert.isUndefined(appenderEvent);
}
}
},
'default setup': {
topic: function() {
var pathsChecked = [],
message,
logger,
modulePath = require('path').normalize(__dirname + '/../lib/log4js.json'),
fakeFS = {
readFileSync: function (file, encoding) {
assert.equal(file, modulePath);
assert.equal(encoding, 'utf8');
return '{ "appenders" : [ { "type": "console", "layout": { "type": "messagePassThrough" }} ] }';
},
statSync: function (path) {
pathsChecked.push(path);
if (path === modulePath) {
return true;
} else {
throw new Error("no such file");
}
}
},
fakeConsole = {
log : function (msg) { message = msg; },
info: this.log,
warn: this.log,
debug: this.log,
error: this.log
},
log4js = sandbox.require(
'../lib/log4js',
{
requires: {
'fs': fakeFS
},
globals: {
'console': fakeConsole
}
}
);
logger = log4js.getLogger('a-test');
logger.debug("this is a test");
return [ pathsChecked, message, modulePath ];
},
'should check current directory, require paths, and finally the module dir for log4js.json': function(args) {
var pathsChecked = args[0];
expectedPaths = ['log4js.json'].concat(
require.paths.map(function(item) {
return item + '/log4js.json';
}),
args[2]
);
assert.deepEqual(pathsChecked, expectedPaths);
},
'should configure log4js from first log4js.json found': function(args) {
var message = args[1];
assert.equal(message, 'this is a test');
}
},
'logLevelFilter': {
topic: function() {
var log4js = require('../lib/log4js'), logEvents = [], logger;
log4js.clearAppenders();
log4js.addAppender(log4js.logLevelFilter('ERROR', function(evt) { logEvents.push(evt); }), "logLevelTest");
logger = log4js.getLogger("logLevelTest");
logger.debug('this should not trigger an event');
logger.warn('neither should this');
logger.error('this should, though');
logger.fatal('so should this');
return logEvents;
},
'should only pass log events greater than or equal to its own level' : function(logEvents) {
assert.length(logEvents, 2);
assert.equal(logEvents[0].data[0], 'this should, though');
assert.equal(logEvents[1].data[0], 'so should this');
}
},
'console' : {
topic: function() {
var fakeConsole = {}
, logEvents = []
, log4js;
['trace','debug','log','info','warn','error'].forEach(function(fn) {
fakeConsole[fn] = function() {
throw new Error("this should not be called.");
};
});
log4js = sandbox.require(
'../lib/log4js'
, {
globals: {
console: fakeConsole
}
}
);
log4js.clearAppenders();
log4js.addAppender(function(evt) {
logEvents.push(evt);
});
fakeConsole.log("Some debug message someone put in a module");
fakeConsole.debug("Some debug");
fakeConsole.error("An error");
fakeConsole.info("some info");
fakeConsole.warn("a warning");
fakeConsole.log("cheese (%s) and biscuits (%s)", "gouda", "garibaldis");
fakeConsole.log({ lumpy: "tapioca" });
fakeConsole.log("count %d", 123);
fakeConsole.log("stringify %j", { lumpy: "tapioca" });
return logEvents;
},
'should replace console.log methods with log4js ones': function(logEvents) {
assert.equal(logEvents[0].data[0], "Some debug message someone put in a module");
assert.equal(logEvents[0].level.toString(), "INFO");
assert.equal(logEvents[1].data[0], "Some debug");
assert.equal(logEvents[1].level.toString(), "DEBUG");
assert.equal(logEvents[2].data[0], "An error");
assert.equal(logEvents[2].level.toString(), "ERROR");
assert.equal(logEvents[3].data[0], "some info");
assert.equal(logEvents[3].level.toString(), "INFO");
assert.equal(logEvents[4].data[0], "a warning");
assert.equal(logEvents[4].level.toString(), "WARN");
}
},
'configuration persistence' : {
'should maintain appenders between requires': function () {
var logEvent, firstLog4js = require('../lib/log4js'), secondLog4js;
firstLog4js.clearAppenders();
firstLog4js.addAppender(function(evt) { logEvent = evt; });
secondLog4js = require('../lib/log4js');
secondLog4js.getLogger().info("This should go to the appender defined in firstLog4js");
assert.equal(logEvent.data[0], "This should go to the appender defined in firstLog4js");
}
}
}).export(module);

View File

@@ -1,128 +0,0 @@
var vows = require('vows')
, assert = require('assert')
, levels = require('../lib/levels');
function MockLogger() {
var that = this;
this.messages = [];
this.log = function(level, message, exception) {
that.messages.push({ level: level, message: message });
};
this.isLevelEnabled = function(level) {
return level.isGreaterThanOrEqualTo(that.level);
};
this.level = levels.TRACE;
}
function MockRequest(remoteAddr, method, originalUrl) {
this.socket = { remoteAddress: remoteAddr };
this.originalUrl = originalUrl;
this.method = method;
this.httpVersionMajor = '5';
this.httpVersionMinor = '0';
this.headers = {};
}
function MockResponse(statusCode) {
this.statusCode = statusCode;
this.end = function(chunk, encoding) {
};
}
vows.describe('log4js connect logger').addBatch({
'getConnectLoggerModule': {
topic: function() {
var clm = require('../lib/connect-logger');
return clm;
},
'should return a "connect logger" factory' : function(clm) {
assert.isObject(clm);
},
'take a log4js logger and return a "connect logger"' : {
topic: function(clm) {
var ml = new MockLogger();
var cl = clm.connectLogger(ml);
return cl;
},
'should return a "connect logger"': function(cl) {
assert.isFunction(cl);
}
},
'log events' : {
topic: function(clm) {
var ml = new MockLogger();
var cl = clm.connectLogger(ml);
var req = new MockRequest('my.remote.addr', 'GET', 'http://url');
var res = new MockResponse(200);
cl(req, res, function() { });
res.end('chunk', 'encoding');
return ml.messages;
},
'check message': function(messages) {
assert.isArray(messages);
assert.length(messages, 1);
assert.equal(messages[0].level, levels.INFO);
assert.include(messages[0].message, 'GET');
assert.include(messages[0].message, 'http://url');
assert.include(messages[0].message, 'my.remote.addr');
assert.include(messages[0].message, '200');
}
},
'log events with level below logging level' : {
topic: function(clm) {
var ml = new MockLogger();
ml.level = levels.FATAL;
var cl = clm.connectLogger(ml);
var req = new MockRequest('my.remote.addr', 'GET', 'http://url');
var res = new MockResponse(200);
cl(req, res, function() { });
res.end('chunk', 'encoding');
return ml.messages;
},
'check message': function(messages) {
assert.isArray(messages);
assert.isEmpty(messages);
}
},
'log events with non-default level and custom format' : {
topic: function(clm) {
var ml = new MockLogger();
ml.level = levels.INFO;
var cl = clm.connectLogger(ml, { level: levels.INFO, format: ':method :url' } );
var req = new MockRequest('my.remote.addr', 'GET', 'http://url');
var res = new MockResponse(200);
cl(req, res, function() { });
res.end('chunk', 'encoding');
return ml.messages;
},
'check message': function(messages) {
assert.isArray(messages);
assert.length(messages, 1);
assert.equal(messages[0].level, levels.INFO);
assert.equal(messages[0].message, 'GET http://url');
}
}
}
}).export(module);

View File

@@ -1,85 +0,0 @@
var vows = require('vows'),
assert = require('assert');
vows.describe('log4js global loglevel').addBatch({
'global loglevel' : {
topic: function() {
var log4js = require('../lib/log4js');
return log4js;
},
'set global loglevel on creation': function(log4js) {
var log1 = log4js.getLogger('log1');
var level = 'OFF';
if (log1.level.toString() == level) {
level = 'TRACE';
}
assert.notEqual(log1.level.toString(), level);
log4js.setGlobalLogLevel(level);
assert.equal(log1.level.toString(), level);
var log2 = log4js.getLogger('log2');
assert.equal(log2.level.toString(), level);
},
'global change loglevel': function(log4js) {
var log1 = log4js.getLogger('log1');
var log2 = log4js.getLogger('log2');
var level = 'OFF';
if (log1.level.toString() == level) {
level = 'TRACE';
}
assert.notEqual(log1.level.toString(), level);
log4js.setGlobalLogLevel(level);
assert.equal(log1.level.toString(), level);
assert.equal(log2.level.toString(), level);
},
'override loglevel': function(log4js) {
var log1 = log4js.getLogger('log1');
var log2 = log4js.getLogger('log2');
var level = 'OFF';
if (log1.level.toString() == level) {
level = 'TRACE';
}
assert.notEqual(log1.level.toString(), level);
var oldLevel = log1.level.toString();
assert.equal(log2.level.toString(), oldLevel);
log2.setLevel(level);
assert.equal(log1.level.toString(), oldLevel);
assert.equal(log2.level.toString(), level);
assert.notEqual(oldLevel, level);
log2.removeLevel();
assert.equal(log1.level.toString(), oldLevel);
assert.equal(log2.level.toString(), oldLevel);
},
'preload loglevel': function(log4js) {
var log1 = log4js.getLogger('log1');
var level = 'OFF';
if (log1.level.toString() == level) {
level = 'TRACE';
}
assert.notEqual(log1.level.toString(), level);
var oldLevel = log1.level.toString();
log4js.getLogger('log2').setLevel(level);
assert.equal(log1.level.toString(), oldLevel);
// get the same logger again, via a different variable
var log2 = log4js.getLogger('log2');
assert.equal(log2.level.toString(), level);
assert.notEqual(oldLevel, level);
log2.removeLevel();
assert.equal(log1.level.toString(), oldLevel);
assert.equal(log2.level.toString(), oldLevel);
}
}
}).export(module);

View File

@@ -0,0 +1,23 @@
{
"appenders": [
{
"type": "categoryFilter",
"exclude": "web",
"appender": {
"type": "file",
"filename": "test/categoryFilter-noweb.log",
"layout": {
"type": "messagePassThrough"
}
}
},
{
"category": "web",
"type": "file",
"filename": "test/categoryFilter-web.log",
"layout": {
"type": "messagePassThrough"
}
}
]
}

test/with-cheese.json Normal file
View File

@@ -0,0 +1,9 @@
{
"appenders": {
"thing": { "type": "cheese" }
},
"categories": {
"default": { "level": "DEBUG", "appenders": [ "thing" ] },
"noisy": { "level": "ERROR", "appenders": [ "thing" ] }
}
}

test/with-dateFile.json Normal file
View File

@@ -0,0 +1,16 @@
{
"appenders": {
"dateFile": {
"type": "dateFile",
"filename": "test/date-file-test.log",
"pattern": "-from-MM-dd",
"layout": {
"type": "messagePassThrough"
}
}
},
"categories": {
"default": { "level": "WARN", "appenders": [ "dateFile" ] }
}
}

View File

@@ -4,8 +4,7 @@
"type": "file",
"filename": "tmp-test.log",
"maxLogSize": 1024,
"backups": 3,
"pollInterval": 15
"backups": 3
}
]
}
}

View File

@@ -6,7 +6,7 @@
"level": "WARN",
"appender": {
"type": "file",
"filename": "tmp-tests-warnings.log",
"filename": "test/logLevelFilter-warnings.log",
"layout": {
"type": "messagePassThrough"
}
@@ -15,7 +15,7 @@
{
"category": "tests",
"type": "file",
"filename": "tmp-tests.log",
"filename": "test/logLevelFilter.log",
"layout": {
"type": "messagePassThrough"
}

writing-appenders.md Normal file
View File

@@ -0,0 +1,81 @@
Writing Appenders For log4js
============================
Loading appenders
-----------------
log4js supports loading appender modules from outside its own code. The [log4js-gelf](http://github.com/nomiddlename/log4js-gelf), [log4js-smtp](http://github.com/nomiddlename/log4js-smtp), and [log4js-hookio](http://github.com/nomiddlename/log4js-hookio) appenders are examples of this. When configuring an appender, log4js first attempts to `require` the module from its own `./lib/appenders/` directory using the `type` value; if that fails, it falls back to a plain `require` of the type. e.g.
log4js.configure({
appenders: {
"custom": { type: "log4js-gelf", hostname: "blah", port: 1234 }
},
categories: {
"default": { level: "debug", appenders: ["custom"] }
}
});
With this configuration, log4js will first attempt `require('./appenders/log4js-gelf')`, which will fail because no such built-in appender exists. It will then attempt `require('log4js-gelf')`, which (assuming you have previously run `npm install log4js-gelf`) will pick up the gelf appender.
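The fallback is roughly the following (a minimal sketch only; `loadAppenderModule` is an illustrative name and not part of the log4js API):

    // Illustrative sketch, not the actual log4js internals:
    // try the bundled appenders directory first, then fall back to
    // treating the type as the name of an installed module.
    function loadAppenderModule(type) {
      try {
        return require('./appenders/' + type);
      } catch (e) {
        return require(type);
      }
    }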
Writing your own custom appender
--------------------------------
This is easiest to explain with an example. Let's assume you want to write a [CouchDB](http://couchdb.apache.org) appender. CouchDB is a document database that you talk to via HTTP and JSON. Our log4js configuration is going to look something like this:
log4js.configure({
appenders: {
"couch": {
type: "log4js-couchdb",
url: "http://mycouchhost:5984",
db: "logs",
layout: {
type: "messagePassThrough"
}
}
},
categories: {
"default": { level: "debug", appenders: ["couch"] }
}
});
When processing this configuration, the first thing log4js will do is `require('log4js-couchdb')`. It expects this module to return a function that accepts two arguments:
module.exports = function(layouts, levels) {
...
};
log4js will then call that function, passing in the `layouts` and `levels` sub-modules in case your appender might need to use them. Layouts contains functions which will format a log event as a string in various different ways. Levels contains the definitions of the log levels used by log4js - you might need this for mapping log4js levels to external definitions (the GELF appender does this). These are passed in so that appenders do not need to include a hard dependency on log4js (see below), and so that log4js does not need to expose these modules to the public API. The module function will only be called once per call to `log4js.configure`, even if there are multiple appenders of that type defined.
The module function should return another function, a configuration function, which will be called for each appender of that type defined in the config. That function should return an appender instance. For our CouchDB example, the calling process is roughly like this:
couchDbModule = require('log4js-couchdb');
appenderMaker = couchDbModule(layouts, levels);
appender = appenderMaker({
type: "log4js-couchdb",
url: "http://mycouchhost:5984",
db: "logs",
layout: {
type: "messagePassThrough"
}
}, appenderByName);
Note that in addition to our couchdb appender config, the appenderMaker function gets an extra argument: `appenderByName`, a function which returns an appender when passed its name. This is used by appenders that wrap other appenders. The `logLevelFilter` is an example of this use.
The `layout` portion of the config can be passed directly to `layouts.layout(config.layout)` to generate a layout function.
The appender function returned after processing your config should just take one argument: a log event. This function will be called for every log event that should be handled by your appender. In our case, with the config above, every log event of DEBUG level and above will be sent to our appender.
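Putting those pieces together, a minimal skeleton for the hypothetical `log4js-couchdb` module could look like this (a sketch only; the CouchDB call itself is stubbed out, since the point is the module / configure-function / appender-function structure):

    // Sketch of a custom appender module; the CouchDB details are illustrative.
    module.exports = function(layouts, levels) {
      // called once per log4js.configure(), even if several couch appenders are defined
      return function(config, appenderByName) {
        // called once for each appender of this type in the config;
        // config is this appender's entry from log4js.configure()
        var layout = layouts.layout(config.layout);
        return function(logEvent) {
          // called for every log event routed to this appender
          var body = JSON.stringify({
            level: logEvent.level.toString(),
            message: layout(logEvent)
          });
          // POSTing `body` to config.url + '/' + config.db would go here.
        };
      };
    };

Published (or `npm link`ed) as `log4js-couchdb`, the configuration shown at the top of this section is all that is needed to route log events to it.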
Dependencies
------------
You should declare which version of log4js your appender works with by
including a "peerDependencies" section in your package.json. e.g.
{
"name": "my-cool-appender",
"version": "0.0.1",
...
"peerDependencies": {
"log4js": "0.7.x"
}
}
For more details on peer dependencies, see
[this blog post](http://blog.nodejs.org/2013/02/07/peer-dependencies/).