Using tbl and src_monetdblite to access data

Sorry if this question has been asked elsewhere, I can't find it. I'm working through some basic examples in MonetDBLite.

> dbGetQuery(dbcon, "SELECT MAX(mpg) FROM mtcars WHERE cyl = 8")
L3
1 19.2

works, but

> ms <- MonetDBLite::src_monetdblite("./DB")
> t <- tbl(ms, "mtcars")
Error in UseMethod("tbl") : 
no applicable method for 'tbl' applied to an object of class
"c('src_monetdb', 'src_sql', 'src')"

It seems that it's assigning the database to t, not the table.

Any suggestions would be greatly appreciated.

I've been perusing resources and found a useR2016 presentation and noticed a difference here:

> ms
src:  MonetDBEmbeddedConnection
tbls: mtcars

Curious...

I'm a huge fan of using MonetDBLite together with dplyr. My addition to Hannes Mühleisen's (thanks for the package!) answer would be that it appears that the order you load the packages can matter. Loading MonetDBLite after dplyr and dbplyr seems to be the key for me. Loading MonetDBLite first causes errors similar to the one nzgwynn noted.
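To illustrate, here is a minimal sketch of the loading order that worked for me (the "./DB" path and the mtcars table are just examples carried over from the question):

```r
# Load dplyr and dbplyr first, MonetDBLite last
library(dplyr)
library(dbplyr)
library(MonetDBLite)

# Create the source and reference a table (assumes "./DB" holds a
# MonetDBLite database that already contains an mtcars table)
ms <- MonetDBLite::src_monetdblite("./DB")
mt <- tbl(ms, "mtcars")
```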

Sometimes I could connect to the database with no problems. Other times I would get error messages like:

Error in UseMethod("db_query_fields") : no applicable method for 'db_query_fields' applied to an object of class "MonetDBEmbeddedConnection"

Like nzgwynn, I was puzzled about why it would work sometimes but not others. Restarting and reinstalling wouldn't necessarily fix it for me.

This clue, from an issue filed about sparklyr, led me to explore the package loading order:

https://github.com/rstudio/sparklyr/issues/38

As noted there with sparklyr, and as I've seen with other R database packages, MonetDBLite will load and attach automatically if the Global Environment already contains a connection object. My problem was that I had an src_monetdb object in my workspace, which was causing MonetDBLite to load upon starting RStudio. So while I thought I was loading it after dplyr and dbplyr, it was really loading first. If I clear the workspace and then restart, I can load the packages in the preferred order. So far, this method has worked.
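A sketch of the check I now do in a fresh session before loading any packages (purely illustrative; the point is that no leftover connection object should exist when the packages attach):

```r
# In a fresh R session, before loading any packages:
ls()                 # look for leftover src_* or connection objects
rm(list = ls())      # clear the workspace if any are present

# Then restart R (e.g. Session > Restart R in RStudio) so nothing
# attaches MonetDBLite behind your back, and load in the preferred order:
library(dplyr)
library(dbplyr)
library(MonetDBLite)
```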

I've seen starting with a clean workspace advised as good practice generally, e.g.: https://twitter.com/hadleywickham/status/561146907519500288. Starting with a fresh workspace costs you no time either, given MonetDBLite's speedy query ability.

Lastly, I would put in an enthusiastic pitch for using MonetDBLite. I saw it mentioned on RStudio's database page and was immediately impressed by how easy it was to set up and how fast it is. It's the best way I've found for working with a ~2 GB dataset in R. When exploring the data interactively, the dplyr queries run so quickly that it feels like I'm working with the data in memory. And if all I want to do is load the whole dataset into memory, MonetDBLite is as fast as or faster than other methods I've tried, such as read.fst() from the fst package.
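For example, pulling an entire table into memory is just a collect() away (a sketch, reusing the example src and table from the question):

```r
library(dplyr)
library(dbplyr)
library(MonetDBLite)

ms <- MonetDBLite::src_monetdblite("./DB")

# Queries run inside MonetDBLite; collect() brings the result back
# into R as an ordinary in-memory data frame
big_df <- tbl(ms, "mtcars") %>% collect()
```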



Use src_monetdb to connect to an existing MonetDB database, and tbl to connect to tables within that database. Note that the ORDER BY, LIMIT and OFFSET keywords are not supported in a query passed to tbl on a MonetDB connection. If you are running a local database, you only need to define the name of the database you want to connect to.
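If you do need ordering or limits, one approach is to express them with dplyr verbs rather than in raw SQL passed to tbl, since dbplyr translates them into the final query for you (a sketch, assuming the example database and table from the question):

```r
library(dplyr)
library(dbplyr)
library(MonetDBLite)

ms <- MonetDBLite::src_monetdblite("./DB")

# arrange() and head() are translated into ORDER BY / LIMIT by dbplyr
top5 <- tbl(ms, "mtcars") %>%
  arrange(desc(mpg)) %>%
  head(5) %>%
  collect()
```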

You need to call library("dplyr") before using tbl and friends. Also make sure you have dbplyr installed.

Update: also, please make sure there is no connection object (src) in a stored workspace loaded at startup. Loading connections from .RData files does not work! Instead, create the connection/src from scratch every time you run a script.
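A sketch of starting each script with a fresh connection rather than a saved one, using MonetDBLite's DBI driver (the "./DB" path is an example):

```r
library(DBI)
library(dplyr)
library(dbplyr)
library(MonetDBLite)

# Create the connection from scratch at the top of every script;
# never load it from a saved workspace
con <- dbConnect(MonetDBLite::MonetDBLite(), "./DB")
mt  <- tbl(con, "mtcars")

# ... work with mt ...

# Shut the embedded database down cleanly when done
dbDisconnect(con, shutdown = TRUE)
```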


Comments
  • Thanks for the thorough answer!! Do you have any idea why the order that packages are loaded in is important? What does this do to the environment? Also, if I have the data (D) in memory, then use D <- data.table(D) and run it through tidyverse, it works quickly too.
  • Hi @nzgwynn, I'm glad you posted the question because I don't think I would have got it working otherwise. Why the order matters, I don't know... I'm just a grateful user of all this great free technology. In reading about similar issues, my guess is that there is some conflict between the packages. Where packages use the same name for a function or object, the package that loads later "masks" the objects loaded before it. So that was my working theory--loading MonetDBLite last would give it precedence over the other packages. Yet that theory could be off...
  • I spoke with someone who said that dplyr and dbplyr change the pointers around, which can make it difficult for R (and other packages) to find things.
  • They are! '> library("dbplyr") > library("dplyr") > mt <- tbl(ms, "mtcars") Error in UseMethod("tbl") : no applicable method for 'tbl' applied to an object of class "c('src_monetdb', 'src_sql', 'src')" '
  • It works perfectly on my Ubuntu VM, but not locally on my Windows10. Why would that be?
  • OMG!! I closed R and started it again and it worked!! WTF?? I don't understand why it didn't work before though...