Class: Pocolog::Logfiles

Inherits: Object
Defined in:
lib/pocolog/logfiles.rb,
lib/pocolog/convert.rb

Overview

Low-level access to a (consistent) set of logfiles.

Pocolog logfiles can be split during recording to limit each file's size. This class accepts a list of files (passed to Logfiles.open) and provides uniform access to their data.

Files are indexed (i.e. a .idx file is generated alongside each log file) to provide quick random access.

Higher-level access is provided by data streams (DataStream), which can be retrieved with #stream, #stream_from_type and #stream_from_index.

Format

Pocolog files are made of:

  • a prologue

  • a sequence of generic blocks, each block pointing to the next block

  • blocks can either be stream blocks, control blocks or data blocks.

    • Stream blocks define a new data stream with name, type name and type definition, assigning it a stream ID which is unique in these logfiles

    • Control blocks provide additional, logfile-wide, information. They are not assigned to streams. This feature is currently unused.

    • Data blocks store a single data sample in a stream

See the tasks/logging.hh file in Rock's tools/logger package for a detailed description of each block's layout.
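
As a minimal usage sketch (the 'mission.*.log' pattern is a placeholder file name), opening a split logfile set and listing its streams looks like this:

    require 'pocolog'

    logfiles = Pocolog::Logfiles.open('mission.*.log')
    logfiles.streams.each do |stream|
        puts "#{stream.name}: #{stream.type.name}"
    end
    logfiles.close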

Defined Under Namespace

Classes: DataHeader

Constant Summary

MAGIC = Format::Current::MAGIC
FORMAT_VERSION = Format::Current::VERSION
BLOCK_HEADER_SIZE = Format::Current::BLOCK_HEADER_SIZE
TIME_SIZE = Format::Current::TIME_SIZE
MissingPrologue = Pocolog::MissingPrologue
  For backward compatibility
COMPRESSION_MIN_SIZE = 60 * 1024
  Data blocks of less than COMPRESSION_MIN_SIZE are never compressed
COMPRESSION_THRESHOLD = 0.3
  If the size gained by compressing is below this ratio, do not save in compressed form
TIME_PADDING = TIME_SIZE - 8
DATA_BLOCK_HEADER_FORMAT = "VVx#{TIME_PADDING}VVx#{TIME_PADDING}VC"
  Formatting string for Array#pack to create a data block

Instance Attribute Summary

Class Method Summary

Instance Method Summary

Constructor Details

#initialize(*io, write_only: false, index_dir: nil, silent: false) ⇒ Logfiles

call-seq:

Logfiles.open(io1, io2)
Logfiles.open(io1, io2, registry)

This is usually not used directly. Most users want to use Pocolog.open to read existing file(s), and Pocolog.create to create new ones.

Creates a new Logfiles object to read the given IO objects. If the last argument is a Typelib::Registry instance, update this registry with the type definitions found in the logfile.

Providing a type registry guarantees that you get an error if the logfile's types do not match the type definitions found in the registry.



# File 'lib/pocolog/logfiles.rb', line 100

def initialize(*io, write_only: false, index_dir: nil, silent: false)
    @registry = io.pop if io.last.kind_of?(Typelib::Registry)

    @path = io.first.path unless io.empty?
    @io =
        if io.size == 1
            io.first
        else
            IOSequence.new(*io)
        end
    @block_stream = BlockStream.new(@io)
    @big_endian = block_stream.big_endian?

    @data = nil
    @streams = nil
    @compress = false
    @data_header_buffer = ''
    @streams =
        if write_only
            []
        else
            load_stream_info(io, index_dir: index_dir, silent: silent)
        end
end
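
As a sketch of direct construction (most code should prefer Pocolog.open; 'part.0.log' and 'part.1.log' are placeholder file names), the registry is passed as the last positional argument:

    require 'pocolog'

    registry = Typelib::Registry.new
    ios      = [File.open('part.0.log'), File.open('part.1.log')]
    logfiles = Pocolog::Logfiles.new(*ios, registry)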

Instance Attribute Details

#basename ⇒ Object

The basename for creating new log files. The file names are

#{basename}.#{index}.log


# File 'lib/pocolog/logfiles.rb', line 162

def basename
  @basename
end

#block_stream ⇒ Object (readonly)

The block stream object used to interpret the data stream



# File 'lib/pocolog/logfiles.rb', line 84

def block_stream
  @block_stream
end

#io ⇒ Object (readonly)

The underlying IO object

Sequence of files are handled through the IOSequence facade



# File 'lib/pocolog/logfiles.rb', line 77

def io
  @io
end

#path ⇒ Object (readonly)

The path of the first underlying file, if any



# File 'lib/pocolog/logfiles.rb', line 125

def path
  @path
end

#registry ⇒ Object (readonly)

The type registry for these logfiles, as a Typelib::Registry instance



# File 'lib/pocolog/logfiles.rb', line 79

def registry
  @registry
end

#streams ⇒ Object (readonly)

Set of data streams found in this file



# File 'lib/pocolog/logfiles.rb', line 81

def streams
  @streams
end

Class Method Details

.append(basename) ⇒ Object

Open an already existing set of log files, or create it if none exists



# File 'lib/pocolog/logfiles.rb', line 219

def self.append(basename)
    io = []
    i = 0
    while File.readable?(path = "#{basename}.#{i}.log")
        io << File.open(path, 'a+')
        i += 1
    end

    return create(basename) if io.empty?

    file = Logfiles.new(*io)
    file.basename = basename
    file
end
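
A usage sketch ('mission' is a placeholder basename); this reopens mission.0.log, mission.1.log, ... for appending, or falls back to .create if no such file exists:

    logfiles = Pocolog::Logfiles.append('mission')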

.compress(from_io, to_io) ⇒ Object



# File 'lib/pocolog/convert.rb', line 77

def self.compress(from_io, to_io)
    from = Logfiles.new(from_io)
    from.read_prologue
    write_prologue(to_io, from.endian_swap ^ Pocolog.big_endian?)

    compressed = [1].pack('C')
    buffer = "".encode('BINARY')
    from.each_block(true) do |block_info|
        if block_info.type != DATA_BLOCK || block_info.payload_size < COMPRESSION_MIN_SIZE
            copy_block(block_info, from_io, to_io, buffer)
        else
            # Get the data header
            data_header = from.data_header
            if data_header.compressed
                copy_block(block_info, from_io, to_io, buffer)
                next
            end

            compressed = Zlib::Deflate.deflate(from.data)
            delta = data_header.size - compressed.size
            if Float(delta) / data_header.size > COMPRESSION_THRESHOLD
                # Save it in compressed form
                payload_size = DATA_HEADER_SIZE + compressed.size
                to_io.write([block_info.type, block_info.index, payload_size].pack('CxvV'))
                from_io.seek(block_info.pos + BLOCK_HEADER_SIZE)
                from_io.read(TIME_SIZE * 2, buffer)
                to_io.write(buffer << [compressed.size, 1].pack('VC'))
                to_io.write(compressed)
            else
                copy_block(block_info, from_io, to_io, buffer)
            end
        end
    end
end

.copy_block(block_info, from_io, to_io, buffer) ⇒ Object



# File 'lib/pocolog/convert.rb', line 12

def self.copy_block(block_info, from_io, to_io, buffer)
    # copy the block as-is
    from_io.seek(block_info.pos)
    buffer = from_io.read(BLOCK_HEADER_SIZE + block_info.payload_size, buffer)
    if block_given?
        buffer = yield(buffer)
    end
    to_io.write(buffer)
end

.create(basename, registry = Typelib::Registry.new) ⇒ Object

Create an empty log file, using basename to build its name: the new file is named <basename>.0.log. Subsequent calls to #new_file then create <basename>.1.log and so on.



# File 'lib/pocolog/logfiles.rb', line 207

def self.create(basename, registry = Typelib::Registry.new)
    file = Logfiles.new(registry, write_only: true)
    if basename =~ /\.\d+\.log$/
        file.new_file(basename)
    else
        file.basename = basename
        file.new_file
    end
    file
end
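
A usage sketch ('mission' is a placeholder basename); this writes the prologue of mission.0.log and returns a write-only Logfiles object:

    registry = Typelib::Registry.new
    logfiles = Pocolog::Logfiles.create('mission', registry)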

.default_index_filename(path, index_dir: File.dirname(path)) ⇒ String

Returns the default index file for a given file

Parameters:

  • path (String)

    the log file's path

Returns:

  • (String)

    the index file name



# File 'lib/pocolog/logfiles.rb', line 246

def self.default_index_filename(path, index_dir: File.dirname(path))
    index_filename = File.basename(path).gsub(/\.log$/, '.idx')
    index_path = File.join(index_dir, index_filename)
    if index_path == path
        raise ArgumentError,
              "#{path} does not end in .log, cannot generate a "\
              'default index name for it'
    end
    index_path
end
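
For example (paths are placeholders):

    Pocolog::Logfiles.default_index_filename('/data/mission.0.log')
    # => "/data/mission.0.idx"
    Pocolog::Logfiles.default_index_filename('/data/mission.0.log', index_dir: '/tmp')
    # => "/tmp/mission.0.idx"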

.encode_stream_declaration_payload(name, type_name, type_registry: nil, metadata: {}) ⇒ Object

Produce the encoded (on-disk) version of a stream definition

Note that raw and object forms can be mixed, i.e. you may pass the type and registry as strings, but the metadata as a Hash

Overloads:

  • .using(rawdata) ⇒ Object

    Parameters:

    • name (String)

      the stream name

    • type_name (String)

      the name of the stream type

    • type_registry (String)

      the type registry in XML form

    • metadata (String)

      the metadata marshalled as YAML

  • .using(objects) ⇒ Object

    Parameters:

    • name (String)

      the stream name

    • type (Class<Typelib::Type>)

      the stream type

    • metadata (Hash)

      the metadata



# File 'lib/pocolog/logfiles.rb', line 500

def self.encode_stream_declaration_payload(
    name, type_name, type_registry: nil, metadata: {}
)
    if type_name.respond_to?(:name)
        type_registry ||= type_name.registry.minimal(type_name.name).to_xml
        type_name = type_name.name
    elsif !type_registry
        raise ArgumentError,
              "cannot give a type name without a corresponding type registry"
    elsif type_registry.respond_to?(:to_xml)
        type_registry = type_registry.to_xml
    end

    unless metadata.respond_to?(:to_str)
        metadata = normalize_metadata(metadata)
        metadata = YAML.dump(metadata)
    end

    BlockStream::StreamBlock
        .new(name, type_name, type_registry, metadata)
        .encode
end

.from_version_1(from, to_io, big_endian) ⇒ Object

Converts a version 1 logfile. Modifications:

  • no prologue

  • no compressed flag on data blocks

  • time was written as [type, sec, usec, padding], with each field a 32-bit integer



# File 'lib/pocolog/convert.rb', line 27

def self.from_version_1(from, to_io, big_endian)
    write_prologue(to_io, big_endian)
    from_io = from.rio

    buffer = ""
    uncompressed = [0].pack('C')
    from.each_block(true, false) do |block_info|
        if block_info.type == STREAM_BLOCK
            copy_block(block_info, from_io, to_io, buffer)
        elsif block_info.type == CONTROL_BLOCK
            # remove the fields in time structure
            to_io.write([block_info.type, block_info.index, block_info.payload_size - 16].pack('CxvV'))
            from_io.seek(block_info.pos + BLOCK_HEADER_SIZE + 4)
            to_io.write(from_io.read(8))
            from_io.seek(4, IO::SEEK_CUR)
            to_io.write(from_io.read(1))
            from_io.seek(4, IO::SEEK_CUR)
            to_io.write(from_io.read(8))
        else
            size_offset = - 16 + 1

            to_io.write([block_info.type, block_info.index, block_info.payload_size + size_offset].pack('CxvV'))
            from_io.seek(block_info.pos + BLOCK_HEADER_SIZE + 4)
            to_io.write(from_io.read(8))
            from_io.seek(8, IO::SEEK_CUR)
            to_io.write(from_io.read(8))
            from_io.seek(4, IO::SEEK_CUR)
            to_io.write(from_io.read(4))
            to_io.write(uncompressed)
            from_io.read(block_info.payload_size - (DATA_HEADER_SIZE - size_offset), buffer)
            to_io.write(buffer)
        end
    end
end

.normalize_metadata(metadata) ⇒ Object



# File 'lib/pocolog/logfiles.rb', line 477

def self.normalize_metadata(metadata)
    result = {}
    metadata.each do |k, v|
        result[k.to_str] = v
    end
    result
end

.open(pattern, registry = Typelib::Registry.new, index_dir: nil, silent: false) ⇒ Object

Opens a set of files. pattern can be a glob pattern, in which case all matching files are opened as a log sequence

Raises:

  • (ArgumentError)


# File 'lib/pocolog/logfiles.rb', line 194

def self.open(
    pattern, registry = Typelib::Registry.new,
    index_dir: nil, silent: false
)
    io = Dir.enum_for(:glob, pattern).sort.map { |name| File.open(name) }
    raise ArgumentError, "no files matching '#{pattern}'" if io.empty?

    new(*io, registry, index_dir: index_dir, silent: silent)
end
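
A usage sketch (pattern and directory are placeholders); the registry argument is optional, and index_dir controls where the .idx files are looked for and rebuilt:

    registry = Typelib::Registry.new
    logfiles = Pocolog::Logfiles.open(
        'logs/mission.*.log', registry, index_dir: '/tmp/pocolog-idx'
    )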

.rename_streams(from_io, to_io, mappings) ⇒ Object



# File 'lib/pocolog/convert.rb', line 112

def self.rename_streams(from_io, to_io, mappings)
    from = Logfiles.new(from_io)
    from.read_prologue
    write_prologue(to_io, from.endian_swap ^ Pocolog.big_endian?)

    buffer = ""
    from.each_block(true) do |block_info|
        if block_info.type == STREAM_BLOCK
            stream = from.read_stream_declaration
            write_stream_declaration(to_io, stream.index,
                    mappings[stream.name] || stream.name,
                    stream.type, nil, stream.metadata)
        else
            copy_block(block_info, from_io, to_io, buffer)
        end
    end
end

.to_new_format(from_io, to_io, big_endian = nil) ⇒ Object



# File 'lib/pocolog/convert.rb', line 62

def self.to_new_format(from_io, to_io, big_endian = nil)
    from = Logfiles.new(from_io)
    from.read_prologue

rescue MissingPrologue
    # This is format version 1. Need either --little-endian or --big-endian
    if big_endian.nil?
        raise "#{from_io.path} looks like a v1 log file. You must specify either --little-endian or --big-endian"
    end
    puts "#{from_io.path}: format v1 in #{big_endian ? "big endian" : "little endian"}"
    from_version_1(from, to_io, big_endian)

rescue ObsoleteVersion
end

.valid_file?(file) ⇒ Boolean

Returns true if file is a valid, up-to-date, pocolog file

Returns:

  • (Boolean)


# File 'lib/pocolog/logfiles.rb', line 70

def self.valid_file?(file)
    Format::Current.valid_file?(file)
end

.write_block(wio, type, index, payload) ⇒ Object

Formats a block and writes it to wio



# File 'lib/pocolog/logfiles.rb', line 463

def self.write_block(wio, type, index, payload)
    wio.write [type, index, payload.size].pack('CxvV')
    wio.write payload
    wio
end
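
As a sketch of the on-disk layout implied by the 'CxvV' pack string, the block header written here is 8 bytes followed by the payload:

    # byte 0     : block type (CONTROL_BLOCK, STREAM_BLOCK or DATA_BLOCK)
    # byte 1     : padding
    # bytes 2-3  : stream index / control block type (16-bit little-endian)
    # bytes 4-7  : payload size in bytes (32-bit little-endian)
    header = [type, index, payload.size].pack('CxvV')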

.write_data_block(io, stream_index, rt, lg, compress, data) ⇒ Object



# File 'lib/pocolog/logfiles.rb', line 622

def self.write_data_block(io, stream_index, rt, lg, compress, data)
    payload =
        if rt.kind_of?(Time)
            [rt.tv_sec, rt.tv_usec, lg.tv_sec, lg.tv_usec,
             data.length, compress, data]
        else
            [rt / 1_000_000, rt % 1_000_000,
             lg / 1_000_000, lg % 1_000_000,
             data.length, compress, data]
        end
    payload = payload.pack("#{DATA_BLOCK_HEADER_FORMAT}a#{data.size}")
    write_block(io, DATA_BLOCK, stream_index, payload)
end

.write_prologue(to_io, big_endian = nil) ⇒ Object



# File 'lib/pocolog/convert.rb', line 4

def self.write_prologue(to_io, big_endian = nil)
    to_io.write(MAGIC)
    if big_endian.nil?
        big_endian = Pocolog.big_endian?
    end
    to_io.write(*[FORMAT_VERSION, big_endian ? 1 : 0].pack('xVV'))
end

.write_stream_declaration(wio, index, name, type_name, type_registry = nil, metadata = {}) ⇒ Object

Encodes and writes a stream declaration block to wio



# File 'lib/pocolog/logfiles.rb', line 524

def self.write_stream_declaration(
    wio, index, name, type_name, type_registry = nil, metadata = {}
)
    payload = encode_stream_declaration_payload(
        name, type_name, type_registry: type_registry, metadata: metadata
    )
    write_block(wio, STREAM_BLOCK, index, payload)
end

Instance Method Details

#close ⇒ Object

Close the underlying IO objects



# File 'lib/pocolog/logfiles.rb', line 145

def close
    io.close
end

#closed? ⇒ Boolean

Returns:

  • (Boolean)


# File 'lib/pocolog/logfiles.rb', line 135

def closed?
    io.closed?
end

#create_stream(name, type, metadata = {}) ⇒ Object

Explicitly creates a new stream named name, with the given type and metadata



# File 'lib/pocolog/logfiles.rb', line 559

def create_stream(name, type, metadata = {})
    type = registry.get(type) if type.respond_to?(:to_str)

    registry = type.registry.minimal(type.name).to_xml

    @streams ||= []
    new_index = @streams.size
    pos = io.tell
    Logfiles.write_stream_declaration(
        io, new_index, name, type.name, registry, metadata
    )

    stream = DataStream.new(self, new_index, name, type, metadata)
    stream.info.declaration_blocks << pos
    @streams << stream
    stream
end
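
A writing sketch: it assumes the registry already defines a '/base/Time'-like type and that a default-constructed sample is acceptable; data handed to #write_data_block must already be marshalled as a byte string (Typelib's to_byte_array is used here for that):

    type   = logfiles.registry.get('/base/Time')
    stream = logfiles.create_stream('/clock', type)
    now    = Time.now
    logfiles.write_data_block(stream, now, now, type.new.to_byte_array)
    logfiles.flush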

#data(data_header = nil) ⇒ Object

Returns the raw data payload of the current block



# File 'lib/pocolog/logfiles.rb', line 448

def data(data_header = nil)
    return @data if @data && !data_header

    data_header ||= self.data_header
    block_stream.seek(data_header.payload_pos)
    data = block_stream.read(data_header.data_size)
    if data_header.compressed?
        # Payload is compressed
        data = Zlib::Inflate.inflate(data)
    end
    @data = data unless data_header
    data
end

#data_header ⇒ Object

Reads the header of a data block. This sets the @data_header instance variable to a new DataHeader object describing the last read block. If you want to keep a reference to a data block and read it later, do the following:

block = file.data_header.dup
[do something, including reading the file]
data  = file.data(block)


# File 'lib/pocolog/logfiles.rb', line 424

def data_header
    unless @data_header
        raw_header = block_stream.read_data_block_header
        h = DataHeader.new(
            raw_header.rt_time, raw_header.lg_time,
            raw_header.data_size, raw_header.compressed?
        )
        h.block_pos   = @block_pos
        h.payload_pos = block_stream.tell
        @data_header = h
    end
    @data_header
end

#declared_stream?(index) ⇒ Boolean

True if there is a stream at the given index

Returns:

  • (Boolean)


# File 'lib/pocolog/logfiles.rb', line 388

def declared_stream?(index)
    @streams && (@streams.size > index && @streams[index])
end

#each_stream(&block) ⇒ Object

Enumerate this file's streams



# File 'lib/pocolog/logfiles.rb', line 597

def each_stream(&block)
    streams.each(&block)
end

#endian_swap ⇒ Object

Whether the endianness of the data stored in the file matches the host's (false) or not (true)



# File 'lib/pocolog/logfiles.rb', line 65

def endian_swap
    big_endian? ^ Pocolog.big_endian?
end

#eof? ⇒ Boolean

True if we read the last block in the file set

Returns:

  • (Boolean)


# File 'lib/pocolog/logfiles.rb', line 154

def eof?
    io.eof?
end

#flush ⇒ Object

Flush the IO objects



# File 'lib/pocolog/logfiles.rb', line 140

def flush
    io.flush
end

#has_stream?(name) ⇒ Boolean

Returns true if name is the name of an existing stream

Returns:

  • (Boolean)


# File 'lib/pocolog/logfiles.rb', line 602

def has_stream?(name)
    stream(name)
rescue ArgumentError # rubocop:disable Lint/HandleExceptions
end

#initialize_copy(from) ⇒ Object

:nodoc:



# File 'lib/pocolog/logfiles.rb', line 234

def initialize_copy(from) # :nodoc:
    super

    @io = from.io.dup
    @block_stream = BlockStream.new(@io)
    @registry = from.registry.dup
end

#initialize_from_stream_info(io, stream_info) ⇒ Object

This method is part of a private API. You should avoid using this method if possible, as it may be removed or be changed in the future.

Initialize self by loading information from an index file



# File 'lib/pocolog/logfiles.rb', line 326

def initialize_from_stream_info(io, stream_info)
    block_stream = BlockStream.new(io)

    stream_info.each_with_index.map do |info, idx|
        pos = info.declaration_blocks.first
        block_stream.seek(pos)

        block_header = block_stream.read_next_block_header
        if block_header.kind != STREAM_BLOCK
            raise InvalidIndex,
                  'invalid stream declaration position in index: block '\
                  'is not a stream declaration block'
        elsif block_header.stream_index != idx
            raise InvalidIndex,
                  'invalid stream declaration position in index: '\
                  'stream index mismatch between block header '\
                  "(#{block_header.stream_index}) and expected index #{idx}"
        end
        stream_block = block_stream.read_stream_block

        # Read the stream declaration block and then update the
        # info attribute of the stream object
        unless info.empty?
            pos = info.interval_io[0]
            block_stream.seek(pos)
            block_info = block_stream.read_next_block_header
            if block_info.kind != DATA_BLOCK
                raise InvalidIndex,
                      'invalid stream interval_io in index '\
                      "for stream #{idx}: expected first block at #{pos}, "\
                      'but this is not a data block'
            elsif block_info.stream_index != idx
                raise InvalidIndex,
                      'invalid stream interval_io in index '\
                      "for stream #{idx}: first block at #{pos} found, "\
                      "but it is a block of stream #{block_info.stream_index}"
            end
        end

        [stream_block, info]
    end
end

#joint_stream(use_rt, *names) ⇒ Object

Creates a JointStream object on the streams whose names are given. The returned object is used to coherently iterate on the samples of the given streams (i.e. it will yield samples that are valid at the same time)



# File 'lib/pocolog/logfiles.rb', line 611

def joint_stream(use_rt, *names)
    streams = names.map do |n|
        stream(n)
    end
    JointStream.new(use_rt, *streams)
end

#load_stream_info(ios, index_dir: nil, silent: false) ⇒ Array<DataStream,nil>

Load stream information, either from an existing on-disk index or by rebuilding an index

Returns:



# File 'lib/pocolog/logfiles.rb', line 261

def load_stream_info(ios, index_dir: nil, silent: false)
    per_file_stream_info = ios.map do |single_io|
        load_stream_info_from_file(
            single_io,
            index_dir: (index_dir || File.dirname(single_io)),
            silent: silent
        )
    end
    stream_count = per_file_stream_info.map(&:size).max || 0
    per_stream = ([nil] * stream_count).zip(*per_file_stream_info)

    streams = []
    per_stream.each_with_index do |stream_info, stream_index|
        combined_info = StreamInfo.new
        file_pos_offset = 0
        stream_block = nil
        # We used an array of nil to initialize the zip. Remove it.
        stream_info.shift
        stream_info.each_with_index do |(block, info), file_index|
            stream_block ||= block
            combined_info.concat(info, file_pos_offset) if info
            file_pos_offset += ios[file_index].size
        end

        if stream_block
            streams[stream_index] =
                DataStream.new(self, stream_index, stream_block.name,
                               stream_block.type,
                               stream_block.metadata, combined_info)
        end
    end
    streams
end

#load_stream_info_from_file(io, index_dir: File.dirname(io.path), silent: false) ⇒ Array<StreamInfo>

Get the index for the file backed by the given IO

Parameters:

  • io (File)

    the file. It must be a single file

Returns:

  • (Array<StreamInfo>)

    list of streams in the given file



# File 'lib/pocolog/logfiles.rb', line 300

def load_stream_info_from_file(
    io, index_dir: File.dirname(io.path), silent: false
)
    index_filename =
        self.class.default_index_filename(io.path, index_dir: index_dir)
    if File.exist?(index_filename)
        Pocolog.info "loading file info from #{index_filename}... " unless silent
        begin
            streams_info = File.open(index_filename) do |index_io|
                Format::Current.read_index(
                    index_io, expected_file_size: io.size, expected_mtime: nil
                )
            end
            return initialize_from_stream_info(io, streams_info)
        rescue InvalidIndex => e
            unless silent
                Pocolog.warn "invalid index file #{index_filename}: #{e.message}"
            end
        end
    end
    rebuild_and_load_index(io, index_filename, silent: silent)
end

#new_file(filename = nil) ⇒ Object

Continue writing logs in a new file. See #basename to know how files are named



# File 'lib/pocolog/logfiles.rb', line 173

def new_file(filename = nil)
    name = filename || "#{basename}.#{num_io}.log"
    io = File.new(name, 'w+')
    Format::Current.write_prologue(io)
    streams.each_with_index do |s, i|
        Logfiles.write_stream_declaration(
            io, i, s.name, s.type.name, s.type.to_xml, s.metadata
        )
    end
    if num_io == 0
        @io = io
    elsif num_io == 1
        @io = IOSequence.new(@io, io)
    else
        @io.add_io(io)
    end
    @block_stream = BlockStream.new(@io)
end

#num_io ⇒ Object



# File 'lib/pocolog/logfiles.rb', line 127

def num_io
    if io.respond_to?(:num_io)
        io.num_io
    else
        1
    end
end

#open ⇒ Object



# File 'lib/pocolog/logfiles.rb', line 149

def open
    io.open
end

#read_one_block(file_pos) ⇒ Object

Read the block information for the block at a certain position in the IO



# File 'lib/pocolog/logfiles.rb', line 394

def read_one_block(file_pos)
    block_stream.seek(file_pos)
    header = block_stream.read_next_block_header
    @block_pos = file_pos
    @data = nil
    @data_header = nil
    header
end

#read_one_data_payload(file_pos) ⇒ Object

Read the data payload for a data block present at a certain position in the IO



# File 'lib/pocolog/logfiles.rb', line 405

def read_one_data_payload(file_pos)
    read_one_block(file_pos)
    block_stream.read_data_block_payload
end

#rebuild_and_load_index(io, index_path = self.class.default_index_filename(io), silent: false) ⇒ Object

Go through the whole file to extract index information, and write the index file



# File 'lib/pocolog/logfiles.rb', line 371

def rebuild_and_load_index(io, index_path = self.class.default_index_filename(io),
                           silent: false)
    # No index file. Compute it.
    Pocolog.info "building index #{io.path} ..." unless silent
    io.rewind
    stream_info = Format::Current.rebuild_index_file(io, index_path)
    io.rewind
    Pocolog.info 'done' unless silent
    initialize_from_stream_info(io, stream_info)
end

#stream(name, type = nil, create = false) ⇒ Object

Returns the DataStream object for name, registry and type. Optionally creates it.

If create is false, raises ArgumentError if the stream does not exist.



# File 'lib/pocolog/logfiles.rb', line 582

def stream(name, type = nil, create = false)
    if (matching_stream = streams.find { |s| s.name == name })
        matching_stream
    elsif !type || !create
        raise ArgumentError, "no such stream #{name}"
    else
        Pocolog.warn_deprecated(
            'the "create" flag of #stream is deprecated, '\
            'use #create_stream directly instead'
        )
        create_stream(name, type)
    end
end
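
For example ('/clock' is a placeholder stream name):

    clock = logfiles.stream('/clock')    # raises ArgumentError if there is no such stream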

#stream_aligner(use_rt = false) ⇒ Object

Creates a stream aligner on all streams of this logfile



# File 'lib/pocolog/logfiles.rb', line 650

def stream_aligner(use_rt = false)
    StreamAligner.new(use_rt, *streams.compact)
end

#stream_from_index(index) ⇒ Object

Returns a stream from its index



# File 'lib/pocolog/logfiles.rb', line 383

def stream_from_index(index)
    @streams[index]
end

#stream_from_type(type) ⇒ Object

Returns a stream of the given type, if there is only one. The type can be given by its name or through a Typelib::Type subclass

If there is no match or multiple matches, raises ArgumentError.



# File 'lib/pocolog/logfiles.rb', line 544

def stream_from_type(type)
    matches = streams_from_type(type)
    if matches.empty?
        raise ArgumentError,
              'there is no stream in this file with the required type'
    elsif matches.size > 1
        raise ArgumentError,
              'there is more than one stream in this file with the required type'
    else
        return matches.first
    end
end
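
For example (the type name is a placeholder):

    pose_stream = logfiles.stream_from_type('/base/samples/RigidBodyState')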

#streams_from_type(type) ⇒ Object

Returns all streams of the given type. The type can be given by its name or through a Typelib::Type subclass



# File 'lib/pocolog/logfiles.rb', line 535

def streams_from_type(type)
    type = type.name if type.respond_to?(:name)
    streams.find_all { |s| s.type.name == type }
end

#sub_field(offset, size, data_header = self.data_header) ⇒ Object



# File 'lib/pocolog/logfiles.rb', line 438

def sub_field(offset, size, data_header = self.data_header)
    if data_header.compressed
        raise 'field access on compressed files is unsupported'
    end

    block_stream.seek(data_header.payload_pos + offset)
    block_stream.read(size)
end

#tell ⇒ Object

Returns the current position in the current IO

This is the position of the next block.



# File 'lib/pocolog/logfiles.rb', line 167

def tell
    block_stream.tell
end

#write_block(type, index, payload) ⇒ Object

Write a raw block. type is the block type (either CONTROL_BLOCK, DATA_BLOCK or STREAM_BLOCK), index is the stream index for stream and data blocks, or the control block type for control blocks, and payload is the block's payload.



# File 'lib/pocolog/logfiles.rb', line 473

def write_block(type, index, payload)
    Logfiles.write_block(io, type, index, payload)
end

#write_data_block(stream, rt, lg, data) ⇒ Object

Write a data block for the given stream, with the provided times and data. data must already be marshalled (i.e. it is a String representing the byte array).



# File 'lib/pocolog/logfiles.rb', line 639

def write_data_block(stream, rt, lg, data) # :nodoc:
    compress = 0
    if compress? && data.size > COMPRESSION_MIN_SIZE
        data = Zlib::Deflate.deflate(data)
        compress = 1
    end

    Logfiles.write_data_block(io, stream.index, rt, lg, compress, data)
end