OCaml bitstring within a script

In the OCaml toplevel, I can use the "bitstring" package by typing the following commands:
#use "topfind";;
#camlp4o;;
#require "bitstring.syntax";;
let data = 0l;;
let bits = BITSTRING { data : 32 };;
However, if I create an OCaml script, e.g., foo.ml:
#!/usr/bin/env ocaml
#use "topfind";;
#camlp4o;;
#require "bitstring.syntax";;
let data = 0l;;
let bits = BITSTRING { data : 32 };;
And if I run the OCaml script, I get a syntax error:
$ ./foo.ml
File "./foo.ml", line 8, characters 28-29: Error: Syntax error
What am I missing here? Why does the same code work with an interactive shell, but not with a script file?

I believe that's because the script file is parsed first, and only then are the directives executed, so the parser cannot handle syntax it does not yet know about.
The easiest solution is to use ocamlscript:
#! /usr/bin/env ocamlscript
Ocaml.packs := [ "bitstring"; "bitstring.syntax" ]
--
let data = 0l;;
let bits = BITSTRING { data : 32 };;
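Another workaround (a minimal, untested sketch): keep only the directives in the script you execute, and move the syntax-dependent code into a second file loaded with #use. That file is parsed only when the #use directive runs, i.e. after camlp4o and bitstring.syntax are active. The file name body.ml is just for illustration:
#!/usr/bin/env ocaml
(* foo.ml: directives only *)
#use "topfind";;
#camlp4o;;
#require "bitstring.syntax";;
#use "body.ml";;
and in body.ml:
(* body.ml: parsed by #use, after the syntax extension is loaded *)
let data = 0l;;
let bits = BITSTRING { data : 32 };;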

Related

Type error when passing OCaml a file from the command line but not in the REPL

My code gives an error when run from a file but runs fine when pasted into the OCaml REPL. I have saved the following as test.ml:
module StringSet = Set.Make(String)
let words = StringSet.add StringSet.empty "something";;
When run from bash with "ocaml test.ml" I get:
File "test.ml", line 3, characters 26-41:
Error: This expression has type StringSet.t = Set.Make(String).t
but an expression was expected of type StringSet.elt = string
When pasted into the Ocaml repl I get:
# module StringSet = Set.Make(String)
let words = StringSet.add StringSet.empty "something";;
module StringSet :
sig
(* ... much more output ... *)
end
val words : StringSet.t = <abstr>
#
Everything seems to work fine from the REPL.
My OCaml version is reported by the REPL as: OCaml version 4.02.1.
Does anyone know why the error is produced when running "ocaml test.ml"?
The StringSet.add function takes the element (the string) as the first parameter and the set (StringSet.empty) as the second parameter. You have them in the opposite order.
When I try your code I get the same error in both cases. When I invert the parameter order I don't get an error in either case.
I'm using OCaml 4.06.0, but I would really doubt that the parameter order has changed.
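For reference, here is the snippet with the arguments in the order Set.Make expects (element first, then the set):
module StringSet = Set.Make(String)
let words = StringSet.add "something" StringSet.empty;;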

How to get system page size in OCaml

I have searched in modules Sys, Gc, and Unix, but did not find a means to get the system page size in OCaml. How can we get the system page size?
I have OCaml 4.06 and macOS 10.12.6 (Sierra)
If you just want the answer for macOS, there is a pagesize command that you can run with Unix.open_process_in:
$ rlwrap ocaml
OCaml version 4.06.0
# #load "unix.cma";;
# Unix.open_process_in "pagesize" |> input_line |> int_of_string;;
- : int = 4096
Update
There is a POSIX command line program getconf that is quite portable, I believe. It works on macOS and all the versions of Linux I tried. You can use that instead:
$ rlwrap ocaml
OCaml version 4.06.0
# #load "unix.cma";;
# Unix.open_process_in "getconf PAGE_SIZE" |> input_line |> int_of_string;;
- : int = 4096
You can call sysconf(_SC_PAGESIZE) from OCaml to get that information. You can do that either with a .c stub file or using ctypes (although you'll need the numeric value of _SC_PAGESIZE, so it might not be the best solution):
% utop -require ctypes.foreign
# open Foreign;;
# open Ctypes;;
# let sysconf = foreign "sysconf" (int @-> returning long);;
val sysconf : int -> Signed.long = <fun>
# sysconf 30;;  (* 30 is the value of _SC_PAGESIZE on Linux/glibc; other systems may differ *)
- : Signed.long = <long 4096>
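If you would rather not hard-code the _SC_PAGESIZE constant, a hedged alternative is to bind the older getpagesize function instead; it is non-standard but widely available on Linux and macOS, so treat this sketch as an assumption about your platform:
# let getpagesize = foreign "getpagesize" (void @-> returning int);;
val getpagesize : unit -> int = <fun>
# getpagesize ();;  (* typically 4096; the value depends on the system *)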

Evaluating code after parsing it

I'm trying to create a tool written in Python that executes R scripts (from files), injecting values into variables before executing them and reading output variables after that.
The rinterface documentation mentions the parse function, but there is no indication about how to execute the result. The C interface contains an eval function but it doesn't seem available in Python.
Here's a very basic example of what I want to do :
import rpy2.rinterface as ri
ri.initr()
with open('script.r', 'r') as myFile:
    script = myFile.read()
expr = ri.parse(script)
# prepare
ri.globalenv['input'] = ri.IntSexpVector((1, 2, 3, 4))
# execute
#??????????????????
# what to do here ?
#??????????????????
# fetch results
# The script is supposed to store results into a global var named 'output'
result = ri.globalenv['output']
Thanks
There are several ways.
One is:
from rpy2.robjects.packages import importr
base = importr('base')
base.eval(expr)

Export user input history from OCaml utop to file

When I'm using OCaml utop, every line of input and output is printed to the console:
───┬──────────────────────────────────────────────────────────────┬───
│ Welcome to utop version 2.10.0 (using OCaml version 4.14.0)! │
└──────────────────────────────────────────────────────────────┘
Type #utop_help for help about using utop.
─( 22:17:51 )─< command 0 >─────────────────────────────{ counter: 0}─
utop # let x = 50;;
val x : int = 50
Is it possible to export the user's inputs and outputs from a utop session to a specific file?
Your input is saved by default to ~/.utop-history.
Isn't that enough for your purpose?
You can also change the location and size of the history file by setting UTop.history_file_name and related variables.
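For example, something along these lines in your init file (~/.ocamlinit or utop's init.ml) would relocate and enlarge the history. This is a sketch that assumes the UTop.history_file_name and UTop.history_file_max_entries references; the path is only illustrative:
(* relocate the history file and keep more entries *)
UTop.history_file_name := Some "/path/to/my-utop-history";;
UTop.history_file_max_entries := 10000;;
As far as I know, only your input is recorded there; utop does not write the evaluated output to the history file.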

OCaml int to binary string conversion

What's the easiest way to convert an Int32.t to binary? For example:
-1 -> "\255\255\255\255" ?
Edit:
To use extlib, install it (e.g. with yum) and load it in the toplevel:
#use "topfind";;
#require "extlib";;
I would suggest using the Bitstring library for this kind of thing.
For example, in the toplevel:
# #use "topfind";;
# #camlp4o;;
# #require "unix";;
# #require "bitstring.syntax" ;;
# let data = Int32.of_int (-1);;
# let bits = BITSTRING { data: 32 } ;;
then you can perform various conversions on the bitstring including writing it to a binary file or to stdout or to a string:
# Bitstring.string_of_bitstring bits ;;
- : string = "\255\255\255\255"
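The prose above also mentions writing the bitstring to a file or to stdout; a hedged sketch, assuming the Bitstring.bitstring_to_chan and Bitstring.bitstring_to_file helpers provided by the library (the output file name is illustrative):
# Bitstring.bitstring_to_chan bits stdout;;  (* write the 4 bytes to stdout *)
# Bitstring.bitstring_to_file bits "out.bin";;  (* write them to a file *)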
Use extlib:
# let io = IO.output_string ();;
val io : string IO.output = <abstr>
# IO.write_i32 io (-1);;
- : unit = ()
# IO.close_out io;;
- : string = "\255\255\255\255"
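If you don't want a third-party library, recent versions of the standard library can do this too; a sketch assuming OCaml >= 4.08, where Bytes.set_int32_be is available:
let int32_to_binary_string (n : int32) : string =
  let b = Bytes.create 4 in
  Bytes.set_int32_be b 0 n;  (* big-endian; use Bytes.set_int32_le for little-endian *)
  Bytes.to_string b
let () =
  (* -1l encodes as \255\255\255\255 regardless of byte order *)
  assert (int32_to_binary_string (-1l) = "\255\255\255\255")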