* installing to library ‘/home/hornik/tmp/R.check/r-devel-gcc/Work/build/Packages’
* installing *source* package ‘tokenizers’ ...
** package ‘tokenizers’ successfully unpacked and MD5 sums checked
** using staged installation
** libs
using C++ compiler: ‘g++-13 (Debian 13.2.0-23) 13.2.0’
make[1]: Entering directory '/home/hornik/tmp/scratch/RtmpODGUY1/R.INSTALL13fc3937bc1eed/tokenizers/src'
g++-13 -std=gnu++17 -I"/home/hornik/tmp/R.check/r-devel-gcc/Work/build/include" -DNDEBUG -I'/home/hornik/tmp/R.check/r-devel-gcc/Work/build/Packages/Rcpp/include' -I/usr/local/include -D_FORTIFY_SOURCE=3 -fpic -g -O2 -Wall -pedantic -mtune=native -c RcppExports.cpp -o RcppExports.o
g++-13 -std=gnu++17 -I"/home/hornik/tmp/R.check/r-devel-gcc/Work/build/include" -DNDEBUG -I'/home/hornik/tmp/R.check/r-devel-gcc/Work/build/Packages/Rcpp/include' -I/usr/local/include -D_FORTIFY_SOURCE=3 -fpic -g -O2 -Wall -pedantic -mtune=native -c shingle_ngrams.cpp -o shingle_ngrams.o
shingle_ngrams.cpp: In function ‘size_t get_ngram_seq_len(int, int, int)’:
shingle_ngrams.cpp:8:36: warning: comparison of integer expressions of different signedness: ‘size_t’ {aka ‘long unsigned int’} and ‘int’ [-Wsign-compare]
    8 |   for (size_t i = ngram_min - 1; i < ngram_max; i++)
      |                                  ~~^~~~~~~~~~~
shingle_ngrams.cpp: In function ‘Rcpp::CharacterVector generate_ngrams_internal(Rcpp::CharacterVector, int, int, std::set<std::__cxx11::basic_string<char> >&, std::deque<std::__cxx11::basic_string<char> >&, std::string)’:
shingle_ngrams.cpp:28:24: warning: comparison of integer expressions of different signedness: ‘size_t’ {aka ‘long unsigned int’} and ‘R_xlen_t’ {aka ‘long int’} [-Wsign-compare]
   28 |   for (size_t i = 0; i < terms_raw.size(); i++) {
      |                      ~~^~~~~~~~~~~~~~~~~~
shingle_ngrams.cpp:45:23: warning: comparison of integer expressions of different signedness: ‘size_t’ {aka ‘long unsigned int’} and ‘int’ [-Wsign-compare]
   45 |   for(size_t j = 0; j < len; j++ ) {
      |                     ~~^~~~~
shingle_ngrams.cpp:48:14: warning: comparison of integer expressions of different signedness: ‘size_t’ {aka ‘long unsigned int’} and ‘const int’ [-Wsign-compare]
   48 |     while (k <= ngram_max && j_max_observed < len) {
      |            ~~^~~~~~~~~~~~
shingle_ngrams.cpp:48:45: warning: comparison of integer expressions of different signedness: ‘size_t’ {aka ‘long unsigned int’} and ‘int’ [-Wsign-compare]
   48 |     while (k <= ngram_max && j_max_observed < len) {
      |                              ~~~~~~~~~~~~~~~^~~~~
shingle_ngrams.cpp:56:12: warning: comparison of integer expressions of different signedness: ‘size_t’ {aka ‘long unsigned int’} and ‘const int’ [-Wsign-compare]
   56 |       if(k >= ngram_min) {
      |          ~~^~~~~~~~~~~~
shingle_ngrams.cpp: In function ‘Rcpp::ListOf<Rcpp::Vector<16> > generate_ngrams_batch(Rcpp::ListOf<Rcpp::Vector<16> >, int, int, Rcpp::CharacterVector, Rcpp::String)’:
shingle_ngrams.cpp:84:23: warning: comparison of integer expressions of different signedness: ‘size_t’ {aka ‘long unsigned int’} and ‘R_xlen_t’ {aka ‘long int’} [-Wsign-compare]
   84 |   for(size_t i = 0; i < stopwords.size(); i++){
      |                     ~~^~~~~~~~~~~~~~~~~~
g++-13 -std=gnu++17 -I"/home/hornik/tmp/R.check/r-devel-gcc/Work/build/include" -DNDEBUG -I'/home/hornik/tmp/R.check/r-devel-gcc/Work/build/Packages/Rcpp/include' -I/usr/local/include -D_FORTIFY_SOURCE=3 -fpic -g -O2 -Wall -pedantic -mtune=native -c skip_ngrams.cpp -o skip_ngrams.o
g++-13 -std=gnu++17 -shared -L/home/hornik/tmp/R.check/r-devel-gcc/Work/build/lib -Wl,-O1 -o tokenizers.so RcppExports.o shingle_ngrams.o skip_ngrams.o -L/home/hornik/tmp/R.check/r-devel-gcc/Work/build/lib -lR
make[1]: Leaving directory '/home/hornik/tmp/scratch/RtmpODGUY1/R.INSTALL13fc3937bc1eed/tokenizers/src'
make[1]: Entering directory '/home/hornik/tmp/scratch/RtmpODGUY1/R.INSTALL13fc3937bc1eed/tokenizers/src'
make[1]: Leaving directory '/home/hornik/tmp/scratch/RtmpODGUY1/R.INSTALL13fc3937bc1eed/tokenizers/src'
installing to /home/hornik/tmp/R.check/r-devel-gcc/Work/build/Packages/00LOCK-tokenizers/00new/tokenizers/libs
** R
** data
*** moving datasets to lazyload DB
** inst
** byte-compile and prepare package for lazy loading
** help
*** installing help indices
** building package indices
** installing vignettes
** testing if installed package can be loaded from temporary location
** checking absolute paths in shared objects and dynamic libraries
** testing if installed package can be loaded from final location
** testing if installed package keeps a record of temporary installation path
* creating tarball
packaged installation of ‘tokenizers’ as ‘tokenizers_0.3.0_R_x86_64-pc-linux-gnu.tar.gz’
* DONE (tokenizers)
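
Note (not part of the log): the -Wsign-compare warnings above all come from loops whose index is declared size_t but compared against a signed bound (int or R_xlen_t). A minimal illustrative sketch of one way such warnings can be silenced is shown below: match the signedness of the loop index to the bound. Only the loop headers and types are taken from the diagnostics; the function bodies and the helper name visit_terms are placeholders, not the package's actual code.

#include <Rcpp.h>
#include <cstddef>

// Warnings at shingle_ngrams.cpp:28 and :84: CharacterVector::size() returns
// R_xlen_t (a signed long), so a signed index of the same type compares cleanly.
void visit_terms(Rcpp::CharacterVector terms_raw) {   // hypothetical helper
  for (R_xlen_t i = 0; i < terms_raw.size(); i++) {
    // ... process terms_raw[i] ...
  }
}

// Warning at shingle_ngrams.cpp:8: ngram_min and ngram_max are int, so an int
// loop index (or an explicit cast on the bound) avoids the size_t-vs-int mix.
std::size_t get_ngram_seq_len_sketch(int len, int ngram_min, int ngram_max) {
  std::size_t n = 0;
  for (int i = ngram_min - 1; i < ngram_max; i++)   // was: size_t i
    n += static_cast<std::size_t>(len - i);         // placeholder body
  return n;
}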