
GSAK macro question

42 posts in this topic

Posted

What macro does everyone use to populate the favorite points for a cache into your database?


Posted

What's a macro?


Posted (edited)

Will that also load the latest log entries? I wonder why I often don't see the latest logs when I'm in the field. (And here I was thinking a macro was a smaller micro - go figger!) Edited by Baytown Bert


Posted

Yes, it will, unless you choose to do it in "light access" mode, which is faster but gets less data.

 

You can choose how many logs to get (up to 30) for each cache.


Posted

Yeah, I normally use 'Light Access'; note that you're limited to 10K of those per day.

 

Note that if you're just after logs, there's also 'Get Recent Logs' under 'Geocaching.com Access', and you can get an unlimited number of logs if you wish.  I find it useful to fully refresh the statistics of my hide list.


Posted

Thanks folks!


Posted (edited)

A couple of questions for some of you hard-core cachers and GSAK users (yes, I'm talking to you, TexasWriter and KeyResults!)  As the proud owner of a new-to-me Montana, I've decided to set up my pocket queries to essentially download all of the caches I haven't found in the greater Houston area and keep them in the Montana, updating them once a week or so.  That'll be something like 7000 caches.  I've created about 10 PQ's to do this.  If I want to just update the existing database with changes, can I use a PQ that only looks for caches that have been "Updated in the last 7 days"? (see attached thumbnail)  I set a PQ like that with a 70-mile radius from Houston and it pulled in 961 caches, so it seems plausible.

 

Is there a GSAK macro to update the database and remove caches I've found or that have been archived, etc., then automatically filter and create a new GPX file?  What do you guys use?  I'm sure everyone does it a little differently, so any input is appreciated.

 

[attachment=565:CaptureGSAK PQ.JPG]

Edited by HoustonControl


Posted

I use the LastUpdateGPX macro to filter a user-selectable number of caches that are the oldest to have been updated. I then use the "Refresh Cache Data" menu item under Geocaching.com Access at the top of GSAK to update those caches.

 

I use the "all in current" and "full details" options to update the selected caches.

 

Any caches you have found will be marked in yellow (double-click the yellow square at the bottom of GSAK and it will auto-filter your finds) and can be deleted or moved to a "found" database. You can do the same thing with the small red square to filter on archived caches as well.

 

I use the garminexport macro to build the new GPX file. This macro lets you configure what data goes into the GPX file and builds it from there.
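The filter-then-export idea described above boils down to a few lines of logic. Here is an illustrative Python sketch, not the garminexport macro itself; the field names are invented for the example:

```python
# Illustrative sketch only (not GSAK macro code): drop found and
# archived caches before building an export list. Field names are
# assumptions, not real GSAK column names.

def filter_for_export(caches):
    """Keep only caches that are neither found nor archived."""
    return [c for c in caches
            if not c.get("found") and not c.get("archived")]

db = [
    {"code": "GC1AAAA", "found": False, "archived": False},
    {"code": "GC1BBBB", "found": True,  "archived": False},  # already found
    {"code": "GC1CCCC", "found": False, "archived": True},   # archived
]

export = filter_for_export(db)
print([c["code"] for c in export])  # only GC1AAAA survives
```

In GSAK itself the yellow/red squares and a filter do this for you; the sketch just shows what the filter is doing.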

 

If you select the Macro > Run/Manage menu item, there is a link in the upper left of that window that will take you to the GSAK macro site, where all of these can be found, plus hundreds more for various tasks.

 

My instructions kinda suck; I am better at demo-ing it than describing it. One of the previously mentioned GSAK gurus will do a better job of describing how to do it.


Posted

I'm familiar with the GSAK Macro site and how to find and install them (I have about 20 installed that I use frequently).  I'll check out the LastUpdateGPX macro and the garminexport macro.


Posted

HC, I do what you are talking about, I think. Since I'm on my phone and I'm a terrible thumb typist, I'll come back when I'm on a keyboard.


Posted

I used a macro posted by GreatBirds a long time ago to write my own update macro.  It downloads all my active PQs, filters for caches with older GPX data, and updates those via Live.  Then it deletes archived caches and exports everything to Dropbox for me to dump into Geosphere.  It is pretty sweet and has dropped my management time with GSAK considerably.
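The "filter for older GPX data first" step in a macro like that is just a sort by last-update date. A hypothetical Python sketch of the idea (not the actual macro; the field names are assumptions):

```python
from datetime import date

def oldest_updated(caches, n):
    """Pick the n caches whose data was refreshed longest ago,
    so only those need an API update."""
    return sorted(caches, key=lambda c: c["last_gpx"])[:n]

db = [
    {"code": "GC10001", "last_gpx": date(2013, 1, 5)},
    {"code": "GC10002", "last_gpx": date(2013, 3, 1)},
    {"code": "GC10003", "last_gpx": date(2012, 11, 20)},
]

stale = oldest_updated(db, 2)
print([c["code"] for c in stale])  # GC10003 and GC10001 are the oldest
```

Updating only the stale slice each run is what keeps the daily API quota from being eaten by caches that were already refreshed yesterday.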

 

The next step is to write a subroutine to filter and delete unattended Events that you people refuse to archive after many weeks.  :tickedoff:  :stirpot:


Posted

Ha ha!  You said "you people".  :angel:


Posted

FavCacheCounter.gsk counts favorite points for caches.  It took over an hour to count our 14K+ caches.


Posted

HC, I do what you are talking about, I think. Since I'm on phone and I'm terrible thumb typist I'll come back when on keyboard.

Well???  Find that keyboard yet? :coolsmiley:


Posted

HC,

 

Sorry for the delay responding. I had a “Senior Moment”.

 

Like you, we maintain a DB of all area geocaches too. It covers a wide area and is about 22K caches or so. Why? It's a fantastic tool for analysis, planning hides, trips, routing, and all kinds of stuff. The problem is you need to keep that DB current.

 

We also tend to keep multiple databases available on our phones. Since our Garmin 62s units are limited to 5000 caches at a time, we would need multiple microSD cards to have more than one 5K area available in the field for the handheld 62s without a PC.

 

Anyway, like you, we have a series of geographic Pocket Queries that together cover the Greater Houston area. The PQs run regularly, either once or twice per week.

GSAK V8 makes it so easy to download and automatically freshen our database. We really don't see a need for a custom macro anymore.

 

ONE PQ TO RULE THEM ALL

It doesn't exist, though we've gotten close. Regarding the one magic PQ to keep the GSAK V8 DB fresh: we have two PQ's that we've experimented with, using the Published and Last Updated "in the last 7 days" flags. Our problem is that we were not getting consistent, confidence-inspiring results. The keys are the "Is enabled" and "Is not enabled" flags. I think part of the problem has been dates being wonky on some hides for some reason. The actual meanings of "Published" and "Updated" seem inconsistent too. Anyway, we don't rely on this right now.

 

PRIMARY DB UPDATE METHOD (95 percent)

Many standard Pocket Queries (PQs), all of which cover the Greater Caching Areas we are maintaining.

We've tuned and tweaked a whole bunch of PQ's to cover a very wide area, divided into 1000-cache segments.

These are scheduled to run automatically on a regular schedule, some more than once per week for the areas closest to home.

So, with current PQ's always online, my GSAK pulls all of my latest PQ's at once fairly quickly using the "Download PQ's" feature.

This generally takes care of 95 percent of my GSAK database needs, including My Finds not logged via GSAK.
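One common way to carve a wide area into 1000-cache PQ segments is by placed-date ranges, since a cache's placed date never changes. A rough Python sketch of the idea (an illustration of the technique, not necessarily how these particular PQ's are tuned):

```python
def split_by_placed_date(caches, limit=1000):
    """Partition caches into PQ-sized placed-date ranges,
    each holding at most `limit` caches."""
    ordered = sorted(caches, key=lambda c: c["placed"])
    segments = []
    for i in range(0, len(ordered), limit):
        chunk = ordered[i:i + limit]
        # Each segment becomes one PQ: "placed between" these two dates.
        segments.append((chunk[0]["placed"], chunk[-1]["placed"], len(chunk)))
    return segments

# Five toy caches, one per year, split into PQs of at most 2 caches.
caches = [{"placed": f"200{y}-06-01"} for y in range(5)]
print(split_by_placed_date(caches, limit=2))
```

Each returned (start, end, count) tuple maps to one Pocket Query with a "placed between" date filter, which is why the segments stay stable from week to week.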

 

SECONDARY METHOD (100 percent accurate in an area)

My preferred method, albeit slower, is to use zone updates just before caching days, trips, or events. This process uses map coordinates to pull updates.

Step one is to create a filter of a caching area by polygon, rectangle, or circle coordinates, then use the Get Caches feature from within GSAK to update status, logs, etc. You are limited to 6K-10K API calls per day, and it is a little slow some days because GC.com severely regulates it right now. Anyway, just before going out to, say, Baytown, on a cache run, I'll pop open GSAK, create a coordinate shape around the area, then use Get Caches to pick up new ones and update everything else I already have in my DB. 6000 caches takes about 15 minutes.
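For the curious, the polygon part of such a filter is typically a standard ray-casting point-in-polygon test. A generic Python sketch of the technique (an illustration, not GSAK's actual internals; the corner coordinates below are invented):

```python
def in_polygon(lat, lon, poly):
    """Ray-casting test: is (lat, lon) inside the polygon given
    as a list of (lat, lon) vertices?"""
    inside = False
    j = len(poly) - 1
    for i in range(len(poly)):
        yi, xi = poly[i]
        yj, xj = poly[j]
        # Does edge (i, j) straddle our latitude, and does a
        # horizontal ray from the point cross it?
        if (yi > lat) != (yj > lat) and \
           lon < (xj - xi) * (lat - yi) / (yj - yi) + xi:
            inside = not inside
        j = i
    return inside

# Hypothetical box roughly around the Houston area.
box = [(29.0, -95.9), (29.0, -94.9), (30.2, -94.9), (30.2, -95.9)]
print(in_polygon(29.75, -95.36, box))  # point inside the box: True
print(in_polygon(31.00, -95.36, box))  # well north of the box: False
```

Run the test once per cache in the DB and the survivors are exactly the "zone" that Get Caches then refreshes.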

 

TO THE DEVICES

With fresh, current GSAK DBs, we then generate a GPX file (Export GPX) of our GSAK filter, save that GPX onto a microSD card (Garmin GPSr), AND into Dropbox for our phone apps to grab. If we're out on a multi-area run, then we gen up multiple GPX files, one for each area. If on a road trip, we just take the laptop! ;)


Posted

 

I'm just a padawan of the mighty KeyResults, so I will gladly play second fiddle on this one. I've experimented with most of what he listed above, but because my caching habits are a bit different, I use alternate methods that are less automated in some cases. That's mainly because of FTF-chasing. Since I watch everything that publishes in North America (according to rumors I've heard), I suck into GSAK all new caches that publish in range of my main DB, which goes up to Huntsville-ish on the north side, down to Galveston on the south, over to Anahuac-ish on the east, and over to Sealy-ish on the west. If I'm going to be caching in an area (outside of chasing FTF's in said area), I will refresh that area in GSAK using the polygon isolation method Kenny mentioned and refresh the caches in that filter. I also periodically refresh all unfound caches in my main DB with two simple filters: less than 40 miles from my home location for one, and more than 40 miles for the other. There is so much activity with disabling, archiving, etc. that happens sporadically, I refresh both filters about once a week. That same "40 miles and less" range keeps the count around 4,500 for loading onto my 62S, and I don't have to filter at all for my Montana, so I can use it if I find myself outside of my 40-mile range. My notification areas are a completely different story.... :)

 

Hopefully this helps, but I'm thinking some set of Kenny's methods will align more closely with your preferences. I could be wrong, though. I do use a lot of different macros for data mining, filtering, querying, etc., though....


Posted

Thanks, Kenny!  I've basically been doing something similar to what you outlined -- minus the zone update part, but I may get there eventually.  So, do you maintain the same database and just overwrite it with new PQ's?  I've been doing that, but I have to update the caches in areas I've recently cached in, then filter out the caches I've found or that have been archived.  Kind of time-consuming, but I guess I could find or build a macro for that.


Posted

So, do you maintain the same database and just overwrite it with new PQ's?

 

 

Yes, I keep one master DB for the area; it's got everything. I do have other DBs, but 80 percent of my needs are best served by one DB with everything in it. GSAK V8 is so versatile and fast that I see no advantage in keeping everything separate most of the time. That's just me. Backups are VERY important!

 

I let the GSAK "Download Pocket Queries" function take care of everything for me. One click and all my PQ's are downloaded and sucked into the GSAK DB very quickly. I generally overwrite, unless the GSAK record is newer for some reason. This does a great job of keeping caches current, adds new ones, and even takes care of local-area My Finds if I logged impromptu outside of GSAK by smartphone, which happens from time to time.

 

Note: you do need to incorporate your found and owned caches in your automated Pocket Queries at GC.com, otherwise you'll miss those ;) You need to include disabled caches too. Basically, you want no filters for this method to work well through GSAK.

 

I find this requires zero maintenance. None. Counts always match; cache data is always current.

 

Just as an extra check on my data, after I run the routine, I will do a quick "last updated BEFORE today" filter in GSAK and then do a quick "Update Caches" on everything that's left, just to be sure I haven't missed anything.
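That safety check is just a date comparison over the DB. A minimal Python sketch of the same idea (field names are assumptions, not GSAK's):

```python
from datetime import date

def missed_by_pqs(caches, today):
    """Anything whose record wasn't refreshed today probably fell
    outside the PQ coverage and deserves a manual 'Update Caches'."""
    return [c for c in caches if c["last_update"] < today]

today = date(2013, 11, 20)
db = [
    {"code": "GC2AAAA", "last_update": date(2013, 11, 20)},  # fresh today
    {"code": "GC2BBBB", "last_update": date(2013, 11, 12)},  # stale
]
print([c["code"] for c in missed_by_pqs(db, today)])  # only GC2BBBB
```

If that list is empty after a PQ run, the PQ coverage caught everything; anything left over gets a one-off refresh.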

 

I must say, GSAK V8, with the GC.com API menu choices, has really changed a good utility tool into an extraordinary tool. I wish there were a GSAK Fest event where everybody could show off all their ways of doing stuff and the treasures they have found. I learn something cool from somebody almost every time we talk about it at events, or online here.

 

I hope this is useful.


Posted

Oh.  My PQ's are all set to only bring in caches I don't own and haven't found.  And everything except mystery caches, which I keep in a separate database.  I didn't see a reason to take up PQ space redownloading and updating all the caches I've already found.  I keep a separate My Finds database for that.  I'll have to mull on that a bit.

 

Thanks for the insight.


Posted

Well, for all you guys with a LOT more finds than we have logged, it may be too expensive from a payload perspective to include your FOUNDs all of the time, especially if you log from GSAK as a rule. If you wish to keep those FOUND caches updated even after you've found 'em, you can always filter 'em and click "Update Caches", which will capture both status and recent logs. I have found a few occasions when I need current data on My Finds loaded so that I can cache with a group or my BFF who hasn't logged them yet.

 

I see no benefit in trying to keep multiple DBs updated. For example, isolating a current My Finds filter is a one-click activity in the latest version of GSAK. V8 has changed everything. When I need a separate DB, I simply generate one on the fly with a couple of clicks. It's much easier to maintain one DB, and performance is quite acceptable with a 25,000-row DB. And that is on my "Sanford and Son" old laptop PC (not a 486, but close).

 

For you guys with lots of hides, again, it's no big deal to isolate them and keep those logs fresh with a click, if you don't want to have a separate "Owned by" PQ or include "Owned by me" in your scheduled geographic PQ's.

 

Whichever way you choose, a macro may reduce a few clicks, but so far, at my level and caching intensity (the antithesis of TexasWriter), a macro doesn't do much for me in this regard.  My macros tend to be reporting, analysis, conversion, and presentation utilities, not so much automation right now.


Posted

I love learning everyone's GSAK V8 workflow. I keep making new discoveries through the dialog. Sitting with Muddy Buddies at an event earlier this year, I realized what a different animal this stuff becomes when you have logged more than 10K caches in an area! Where you sit is where you stand.


Posted (edited)

The reason I keep mystery/puzzle caches in a separate database is that I use the Corrected Coordinates feature to mark those that I have solved (and the listed coords for challenge caches), then filter on that.  Plus I can update that DB with one query set to go out 50 miles or so.  I stick the resulting GPX file in the GPSr's and pretty much forget about it until I get around to solving more puzzles -- or more interesting challenges get posted.  Hey, it works for me!

 

Also, I DO keep the results from found puzzles in that database because you never know when that might come in handy.

Edited by HoustonControl

Posted

I keep 3 DB's: a "default" (the one I use regularly), a "founds", and a "suspect". The suspect one is where I stash temporarily disabled caches or those with a high DNF rate. I update those via the GSAK update cache routine about every 2 weeks and move any that have been reactivated or started to get found again (CO did some maintenance, etc.) back to my default DB.

 

I could keep them all together and use a filter to list only the active caches, but I prefer them out of sight, and I like to keep the row count in my default DB as low as possible.

 

I run several macros against the default DB to strip out attributes and to populate a custom field with text codes for the last 4 finds (i.e. "F" for found, "D" for DNF, "O" for owner maintenance, "C" for coord changes, etc.), as I export those codes as a first log entry in my GPX file so I can quickly see the recent find history / date / hint before I ever start looking.

 

Example first log entry below:

 

L4: FFFF
LF: 11/19/2013
Hint: Don't get electrified
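Building that "L4" code string is a simple mapping over the most recent log types. A Python sketch of the idea (the log-type names and mapping here are assumptions for illustration, not the poster's actual macro):

```python
# Assumed mapping from geocaching.com log types to one-letter codes,
# per the scheme described above. Extend as needed.
CODES = {
    "Found it": "F",
    "Didn't find it": "D",
    "Owner Maintenance": "O",
    "Update Coordinates": "C",
}

def last4_code(logs):
    """Build the 'L4' string from the four most recent log types
    (newest first); unknown types become '?'."""
    return "".join(CODES.get(t, "?") for t in logs[:4])

print(last4_code(["Found it"] * 4))  # FFFF, like the example entry
print(last4_code(["Didn't find it", "Found it", "Owner Maintenance",
                  "Found it", "Found it"]))  # DFOF (only first four used)
```

Writing the result into a custom field and exporting it as the first "log" is what makes the history visible on the GPSr without opening each listing.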

