Channel: dBforums – Everything on Databases, Design, Developers and Administrators

Correct Way to Split a Microsoft Access Database

Hello All,

I need to split my database and have a couple of questions, so that I can split it in a way that gets the best performance possible.


I have the database on a server. Should I split the database directly on the server and save the back-end on my local hard drive OR should I split the database on the server and leave the back-end on the server?


Once the database is split, do I remove the tables from the front-end of the database and just leave the forms/queries/macros in the front-end OR do I have to leave all objects in both the front-end and back-end of the database?

Dynamically switching from one database to another

Hi,

I am very new to Sybase. Can someone kindly let me know how to switch from one database to another dynamically inside a procedure in Sybase, like the USE database command in SQL Server?
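
For what it's worth, a rough sketch of the usual workaround in Sybase ASE, since a plain USE is typically rejected inside a stored procedure: build the statement as a string and run it with execute(), either fully qualifying the objects as database..table or putting a USE at the front of the dynamic batch (the context switch only lasts for that batch). The procedure, parameter, and table names below are made up for illustration.

-- Sketch only: count_rows_in, @dbname and @tabname are placeholders.
create procedure count_rows_in
    @dbname varchar(30),
    @tabname varchar(30)
as
begin
    declare @sql varchar(512)

    -- Fully qualified name avoids having to change context at all.
    select @sql = 'select count(*) from ' + @dbname + '..' + @tabname
    exec (@sql)

    -- Alternative: prefix the dynamic batch with USE; the change of database
    -- only applies inside this exec() and reverts afterwards.
    select @sql = 'use ' + @dbname + ' select count(*) from sysobjects'
    exec (@sql)
end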


Thanks

Trigger not working for Update Event

I have an audit trigger, created as below. I'm using PostgreSQL 9.3.

The trigger works perfectly if I execute from the backend, but when I do the same from the frontend it works fine for Insert and Delete yet fails for Update.

Also, in my code I have inserts to two tables mapped by a foreign-key relation. For the Update event, the updated columns are recorded wrongly, and sometimes there is an entry in the 2nd table (query_relation) with no corresponding entry in the 1st table (audit_track).

Here is my trigger function:

create or replace function audit_track() returns trigger
as $body$
begin
    if tg_op = 'UPDATE' then
        insert into audit_track (audit_id, table_name, audit_by, audit_date, action, old_values, new_values, updated_columns)
        values (txid_current(), tg_table_name::text, new.audit_by, now(), 'Update',
                svals(hstore(old.*) - hstore(new.*)),
                svals(hstore(new.*) - hstore(old.*)),
                skeys(hstore(new.*) - hstore(old.*)));
        insert into query_relation values (txid_current(), current_query(), now());
        return new;
    elsif tg_op = 'DELETE' then
        insert into audit_track (audit_id, table_name, audit_by, audit_date, action, old_values, updated_columns)
        values (txid_current(), tg_table_name::text, old.audit_by, now(), 'Delete',
                svals(hstore(old.*)),
                skeys(hstore(old.*)));
        insert into query_relation values (txid_current(), current_query(), now());
        return old;
    elsif tg_op = 'INSERT' then
        insert into audit_track (audit_id, table_name, audit_by, audit_date, action, new_values, updated_columns)
        values (txid_current(), tg_table_name::text, new.audit_by, now(), 'Insert',
                svals(hstore(new.*)),
                skeys(hstore(new.*)));
        insert into query_relation values (txid_current(), current_query(), now());
        return new;
    end if;
exception
    when data_exception then
        insert into exception_details values (current_query(), sqlerrm, now());
        return new;
    when unique_violation then
        insert into exception_details values (current_query(), sqlerrm, now());
        return new;
    when others then
        insert into exception_details values (current_query(), sqlerrm, now());
        return new;
end;
$body$
language plpgsql;

Am I doing something wrong that makes the Update event fail from the frontend but work properly when done manually from the backend?
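
For reference, the function above only runs once it is attached to the audited table; a minimal sketch of the trigger definition (the table name audited_table is a placeholder, and the hstore extension the function relies on must already be installed):

-- Sketch only: attach the audit function to a table; audited_table is a placeholder.
-- CREATE EXTENSION hstore;  -- needed once per database for hstore()/svals()/skeys()
CREATE TRIGGER trg_audit_track
AFTER INSERT OR UPDATE OR DELETE ON audited_table
FOR EACH ROW
EXECUTE PROCEDURE audit_track();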

incompatible data type in criteria expression

(The title is supposed to be "Data type mismatch in criteria expression".)

Hello to all,

I have a small problem. In my database, my second-to-last query has a field called "OTD Réel", and this field calculates my on-time delivery. The expression works and does the calculation absolutely fine.

However, I want this field to be used in my last query, but when I try to insert it into that query I get the following message: "incompatible data type in criteria expression". I'm confused, because I've never had this problem before with other expressions, and it would be a huge setback to have to assemble all of the tables required to make this calculation.

So why can't I add this calculated field to my last query, and how do I overcome this problem?

Thank you for any input :D

DELETE / RUNSTATS / REORG not freeing disk space

Hello All,
New to the DB2 world, so please don't mind my first question if it's too basic. We are trying to DELETE old data from a few tables (the biggest tables in our DB), which live in a DMS Large table space, and then ran RUNSTATS and REORG on all tables, but it is not freeing filesystem space. Our instance directory filesystem (df output) is still sitting at the same usage as before the DELETE / RUNSTATS / REORG.

1) Does that mean DB2 does not really free filesystem space after DELETE / RUNSTATS / REORG? Or did I miss steps, or was it due to some problem on our side?

2) What can be done to make sure the filesystem space (for instance the home dir) does not contain any junk data? Anything we can check?
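
(For reference, a hedged sketch of what usually has to happen after the DELETE before the filesystem shrinks. REORG consolidates the remaining rows inside the table space, but the containers only give space back to the filesystem once the table space is explicitly reduced, and that in turn requires a reclaimable-storage table space, i.e. one created on DB2 9.7 or later. TS_DATA below is a placeholder name.)

-- Sketch only: TS_DATA is a placeholder table space name.
ALTER TABLESPACE TS_DATA LOWER HIGH WATER MARK;
-- For automatic-storage table spaces:
ALTER TABLESPACE TS_DATA REDUCE MAX;
-- For DMS table spaces with explicit containers, shrink the containers instead, e.g.:
-- ALTER TABLESPACE TS_DATA REDUCE (ALL CONTAINERS 10 G);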

Stella Z

Interested in joining the Joberate team? We're hiring Data Engineers (Lithuania)

Joberate is an early stage startup in the HR Technology predictive analytics space. We’ve developed a disruptive approach to quantifying, measuring, and scoring job-seeking behavior of the global workforce.

Today Joberate technology helps corporate and socially minded companies gain actionable people insights by tapping the digital footprint of the global workforce, to help companies reduce employee attrition, retain and engage their valuable employees, improve workforce planning, and gain a competitive advantage in recruitment.

Joberate is looking for an experienced data engineer who is passionate about working with data that will have a significant global impact, to work out of our Vilnius, Lithuania location. In this position, you will have the opportunity to work with all kinds of data to build a cutting-edge Big Data platform for people analytics.

Responsibilities:
• Design, build, and support a new data platform
• Create and support various ETL scripts, jobs, and technologies
• Support APIs, tools, and 3rd party products to extract data
• Tune and optimize data storage technologies
• Provide ongoing data quality monitoring and support

Basic qualifications:
• Experience with NoSQL technologies, especially HBase and Cassandra
• Have excellent coding skills, specifically in jobs related to ETL and data warehousing
• At least 2 years’ experience with Python, Shell, Java and SQL
• Demonstrated experience with Open Source/Linux development and production environment
• BS degree in Computer Science or related field

Desired skills:
• Experience with cloud computing
• Experience with Scala and Spark
• Experience with programming in R
• Experience working with social media data

We are excited to offer:
• Working with worldwide known companies
• Being part of a fun and friendly international team of social media ninjas
• Flexible working hours
• Flat hierarchy and never boring start-up culture

Our mission to help society better understand job seeking behaviors of the global workforce, one human being at a time, exists because of our global vision for a more transparent employment market and for closing the gap to full employment.

Are you ready to join our team? Send us an email with your resume and cover letter attached to stewart@joberate.com.

WWW: https://express.candarine.com/campai...d/06c792aa8039

GroomHost.com : Dedicated IPs - cPanel - 10TB Bandwidth - HDD 250GB - 2 GB RAM - (US)

GroomHost.com - Provides Perfect Money Hosting & Webmoney Shared, Reseller, VPS, Dedicated web hosting and SSL


GroomHost Web Hosting – Providing you quality service and support. Our plans are competitive and well priced. We have 24/7 support. Also, there's our money-back guarantee.



Account Features

• FREE instant Setup
• Instant Account Activation
• 24/7 Technical Support
• 30 Days Money Back Guarantee
• LiteSpeed Web Server
• CloudLinux Maximum Performance & Reliability
• Latest cPanel with Softaculous
• Latest PHP 5.2x,5.3x Perl, CGI & MySQL
• RVSiteBuilder Pro
• FREE Website Migrations
• And much more!




We Accept:

• Payza(AlertPay)
• Web Money
• Perfect Money
• Money Gram
• Western Union
• Skrill(Moneybookers)
• Bitcoin.
• 2CO(Paypal & Credit Card )
• PayPal




Dedicated Server Packages


==================================================
Starter: $200/Month
==================================================

Dual Core
RAM 2048 MB
Bandwidth 10 X 1000 GB
Hard Disk 250 GB
Dedicated IPs 1
Server Location United States
Best for starter teams

ORDER NOW




==================================================
Plus+: $500/Month
==================================================

Intel Core i7
RAM 16 GB RAM
Bandwidth 10 X 1000 GB
Hard Disk 2 X 1 TB
Dedicated IPs 1
Server Location United States
Best for Business

ORDER NOW



==================================================
Enterprise: $300/Month
==================================================


Quad Core
RAM 4096 MB
Bandwidth 10 X 1000 GB
Hard Disk 1 TB
Dedicated IPs 1
Server Location United States
Best for Enterprise

ORDER NOW




30 Days Money Back Guarantee
If we fail to satisfy you, we believe it's our fault; you can get a refund within 30 days, no questions asked.

99% Uptime Guarantee
We work hard to maintain server uptime, but there are many unpredictable threats such as DDoS attacks, hardware issues, and natural disasters that can affect our datacenter, so we guarantee 99% uptime.

24/7 Support
We have a team of support staff waiting to deal with support tickets. If you ever feel you need a little extra help feel free to send us a support ticket. We can guarantee a quick resolution to any problems you encounter.

Migration From Another Host
No worries! We've got you covered. After signup, just fill out our transfer form and we'll transfer your existing cPanel account from your current host with all data intact and no website downtime, usually within a few hours.


For any query email us: support[at]GroomHost.com
Like us on Facebook fb.com/GroomHost
Follow us on Twitter @GroomHost

Inconsistent Results From Scripting

I have been hesitant because of the expense of the program but I was finally ALMOST poised to actually purchase the Ultimate version of Brilliant Database as I love the idea of being able to make stand-alone versions of a program! Granted, I, unlike others here, have dealt with some very questionable results from the Brilliant Database product (i.e., queries that did not yield the proper results as well as other outputs that just did not seem right… well, if the truth be told, they were not right). Please know that these were not one-time occurrences but sporadic results which actually troubles me much more.

My latest experience was over this past weekend and I ran a report that took several hours to complete. Upon seeing the results (which appeared to be accurate and correct), I was forced to rerun the identical report because of the known bug with naming conventions and optimizing (yes, I have learned to save the program prior to doing anything ‘major’ in the event that I have to start over). When I saw the new results, I knew immediately that they were vastly different… and incorrect – this was using the same, original file with ABSOLUTELY no changes. I ran the lengthy report again and got incorrect results yet a third time. I closed the file, re-opened the earlier 'good' file and ran it a fourth time this morning only to get totally different results. I have seen too many instances with Brilliant Database being unstable, inconsistent, and providing invalid results.

Although everyone programs differently, my scripting is sound and valid, though I repeatedly see that the results are not always repeatable. This troubles me greatly, as this project deals with physician credentialing and it MUST be accurate. I know that I can create a program in Excel with Visual Basic for Applications that runs considerably quicker and yields consistent and proper results, but I do not want to start at the beginning simply because of the major shortcomings of Brilliant Database.

This program may be ideal for capturing some basic readings; however, it is my experience that it is not suited for doing any sort of complex analysis. Couple this shortcoming with the cost of the program and the TOTAL LACK of customer support, and it is not a wise investment for me. It is troubling that, in the rare event you actually get a reply when you report a bug to Brilliant Database and it is acknowledged as a bug, the bug still continues to exist; this is not good for any organization. Some may state that they have not experienced any issues, but I have had many!... too many!!!

I am in a terrible bind here, as I have been contacted about using this program again and, because of the critical nature of these certifications, I have no confidence in the validity of the results, as they appear to change from run to run with NO changes to the data in Brilliant Database. I do not have the time to start anew and recreate a workable program in Excel. I am troubled because the physician training program has no backup plan, just a lot of data with no viable program to analyze the information accurately. If anyone has any suggestions for a programmable database that actually yields accurate information and is affordable, please let me know, because I have reached my utmost frustration level with this program.

I would love to be an advocate for Brilliant Database, but I cannot, because it simply does not always work. Easy stuff (i.e., grabbing data and sorting) appears to work fine, but with more complex analysis it fails miserably. I am quite bitter towards the Brilliant Database product right now...

Decrypt Data

I have encrypted data in a MySQL Database. I have created a linked server in SQL Server 2012 and I need to decrypt some of the data. I have the key but I am unsure how to begin decrypting. Any and all assistance is greatly appreciated.
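
One possible approach, assuming the values were encrypted with MySQL's AES_ENCRYPT (the linked server, table, column, and key below are placeholders), is to let MySQL do the decryption inside a pass-through query, so SQL Server only ever sees the decrypted values:

-- Sketch only: MYSQL_LINK, customer, card_no and 'MySecretKey' are placeholders.
-- OPENQUERY ships the inner statement to MySQL, where AES_DECRYPT runs.
SELECT *
FROM OPENQUERY(MYSQL_LINK,
    'SELECT id,
            CAST(AES_DECRYPT(card_no, ''MySecretKey'') AS CHAR) AS card_no
     FROM customer');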

Multiple output on one line

I have a mysql database that has three tables - matters, mattersjuncstaff and staff.

The matters table has fields matterid, mattername, refno
The mattersjuncstaff table has fields junked, matterid, staffid, lead
The staff table has staffid, staffname

A matter may have a number of staff associated with it and a number of those staff will be marked as ‘leads’ i.e. they will have a ‘Y’ in the ‘lead’ field.

I wish to show a table that has a list of matters, the matter name and ref no and those staff marked as leads, ideally in a single row. So it would look something like:

reference | mattername | Lead Staff |
ABC1 | matter abc & Co | Fred Smith, Jane Doe, Naomi Watts |

etc

I am using the code below but this only displays one person with the lead field marked Y.

SELECT `refno`, mattername, matters.matterid, staffname
FROM matters
INNER JOIN matterjuncstaff USING (matterid)
INNER JOIN staff USING (staffid)
INNER JOIN matterjuncactions ON matterjuncactions.matterid = matters.matterid
WHERE lead = 'Y'
GROUP BY matters.matterid, nickname

Can anyone tell me how I can get round this?
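
One way this is commonly handled in MySQL is with GROUP_CONCAT, which collapses the lead staff into a single column per matter. A sketch using the table and column names from the description (the matterjuncactions join is left out, since it is not needed for the lead-staff list):

SELECT m.refno,
       m.mattername,
       GROUP_CONCAT(s.staffname ORDER BY s.staffname SEPARATOR ', ') AS lead_staff
FROM matters m
INNER JOIN mattersjuncstaff mjs ON mjs.matterid = m.matterid
INNER JOIN staff s ON s.staffid = mjs.staffid
WHERE mjs.`lead` = 'Y'
GROUP BY m.matterid, m.refno, m.mattername;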

Issue with requery combo

I created a NotInList event procedure using VBA that inserts the data into a table and then opens a form with that record, so additional data can be added for it. My problem: when I close the form used for editing, the combo on the original form (where the NotInList event originated) does not refresh, so the newly created record cannot be used; I have placed VBA on the Close event of the edit form to requery that combo. I have included the VBA code for both below for review.

NotInList event VBA

Code:

Private Sub cbo_ID_Part_NotInList(NewData As String, Response As Integer)
    ' Prompt user to verify data input is correct.
    Answer = MsgBox("Do you want to insert " & NewData & " into the Part Number list?", vbYesNo, "Part Number not found!")
    ' Answer yes = insert data into data table
    If Answer = vbYes Then
        Part = NewData
        SQLStmt = "Insert into tbl_Part(Part) values ('" & Part & "')"
        DoCmd.RunSQL SQLStmt
        Response = acDataErrAdded
        ' After data insert open form to add description not on this form
        DoCmd.OpenForm "frm_Part"
        DoCmd.GoToRecord , , acLast
        Forms!frm_Part.Part.SetFocus
        Forms!frm_Part.[Part] = Part
    End If
End Sub

Form Close event VBA

Code:

Private Sub Form_Close()
Forms!frm_PurchasesMain!frm_Purchases!frm_PurchasesDetails.cbo_ID_Part.Requery
End Sub
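
A pattern that is often suggested for this situation is to open the editing form modally (acDialog) from inside the NotInList procedure, so the procedure, and the requery Access performs itself when Response = acDataErrAdded, does not continue until frm_Part has been closed. A sketch reusing the names from the code above; the WHERE-condition filter is an assumption about how frm_Part should locate the new record:

Code:

Private Sub cbo_ID_Part_NotInList(NewData As String, Response As Integer)
    Dim Answer As VbMsgBoxResult

    Answer = MsgBox("Do you want to insert " & NewData & " into the Part Number list?", _
                    vbYesNo, "Part Number not found!")
    If Answer = vbYes Then
        DoCmd.RunSQL "Insert into tbl_Part(Part) values ('" & NewData & "')"
        ' acDialog suspends this procedure until frm_Part is closed,
        ' so the description can be entered before execution continues.
        DoCmd.OpenForm "frm_Part", , , "[Part] = '" & NewData & "'", , acDialog
        ' Telling Access the item was added makes it requery this combo itself.
        Response = acDataErrAdded
    Else
        Response = acDataErrContinue
    End If
End Sub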

Combo box updates but source table populates duplicate data with Unique set on column

Hi Everyone,

Forms involved in Problem
  • Main Form - [Prospecting]
  • Field in Question - SubDivision = Combobox with RowSource from Table [SubDivisions]
  • On [Prospecting] Form [NewSubDivision] is Opened with OpenForm button wizard on [Prospecting]
  • NewSubdivision created from Table [SubDivisions]
  • Table SubDivisions has columns - SubDivision-Short Text - unique; Township - Short Text; 55+ - Yes/No


The Problem
I am updating a combo (Subdivision) on main form [Prospecting] using VBA to requery the combo (Subdivision) by adding code to the close button on form NewSubDivision after adding new info to it. NewSubDivision is based on a table [SubDivisions].

The problem I am having is that the combo updates fine with the VBA, but the table the form is based on also updates with the new data even if the new data happens to be a duplicate. The column SubDivision is set to Unique. If I open the form (Subdivision) without the main form open, I get the expected error that the table could not be updated because it would create duplicate data... then it is followed by an error that the main form [Prospecting] could not be found, which is also expected, as I was testing the form (Subdivision) to make sure no duplicates were being permitted.

Is there a way to check whether the data already exists and prevent the requery of the combo box? I thought Access would stop the table being updated as normal when I intentionally added duplicate data. The form (NewSubDivision) is being opened by an OpenForm button on the main form.

The VBA I am using to requery the combo box on the main form (currently in the OnMouseDown event of the close button on form NewSubDivision) is as follows:
Forms![Prospecting].Controls![SubDivision].Requery
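
One way that might work is to test for the value before committing and requerying, for example with DCount. A sketch, assuming it runs in the close button's event on NewSubDivision, that the new value sits in a control named SubDivision on that form, and that the new row has not already been saved when the check runs:

Code:

' Sketch only: check the SubDivisions table before committing / requerying.
If DCount("*", "SubDivisions", "[SubDivision] = '" & Me![SubDivision] & "'") > 0 Then
    MsgBox "This subdivision already exists.", vbExclamation
    Me.Undo                                     ' throw away the duplicate entry
Else
    DoCmd.RunCommand acCmdSaveRecord            ' commit the new row first...
    Forms![Prospecting].Controls![SubDivision].Requery   ' ...then refresh the combo
End If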

many thanks for all your help
Darren

Sending an Excel spreadsheet out by email

I have found this code, with which I am able to save an Excel spreadsheet of my database to my hard drive or an external drive, depending on where I open my database. I am trying to figure out how to send this spreadsheet by email to seven other people. I have a table with the email addresses called 'Email'. I looked at several answers on this forum, but they talk about PDFs and going through recordsets, so I was confused. Can I modify this code or do I have to start out completely new? Thank you.

Code:

Private Sub btnExport_Click()
Dim curPath As String
    Dim xlApp As Object

        curPath = CurrentProject.Path & "\Student - " & Format(Date, "mm-dd-yyyy")
        DoCmd.TransferSpreadsheet acExport, 10, "Student", curPath, -1

        Set xlApp = CreateObject("Excel.Application")
        xlApp.Workbooks.Open (curPath)
        xlApp.Visible = True
End Sub
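
A sketch of one way to extend this, assuming Outlook is installed and that the 'Email' table holds the addresses in a text field; the field name EmailAddress below is an assumption, as is the .xlsx extension added to the export path:

Code:

Private Sub btnExportAndMail_Click()
    Dim curPath As String
    Dim rs As DAO.Recordset
    Dim strTo As String
    Dim olApp As Object, olMail As Object

    ' Export the Student table to a dated workbook, as in the original code.
    curPath = CurrentProject.Path & "\Student - " & Format(Date, "mm-dd-yyyy") & ".xlsx"
    DoCmd.TransferSpreadsheet acExport, 10, "Student", curPath, True

    ' Build the recipient list from the Email table (field name is an assumption).
    Set rs = CurrentDb.OpenRecordset("SELECT EmailAddress FROM Email")
    Do Until rs.EOF
        strTo = strTo & rs!EmailAddress & ";"
        rs.MoveNext
    Loop
    rs.Close

    ' Send the workbook as an attachment through Outlook.
    Set olApp = CreateObject("Outlook.Application")
    Set olMail = olApp.CreateItem(0)             ' 0 = olMailItem
    With olMail
        .To = strTo
        .Subject = "Student spreadsheet " & Format(Date, "mm-dd-yyyy")
        .Body = "Please find the latest student spreadsheet attached."
        .Attachments.Add curPath
        .Send                                    ' use .Display instead to review first
    End With
End Sub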

How should I model this relationship?

I think this should be easy, but it's got me completely stumped.

I've got a Movies table and I'd like to be able to track the Writer(s), Director(s), and Actor(s) who are in that movie. I know that a Person can have more than one Job in a Movie.

I know all of that, but I have absolutely no idea what the relationship should look like.

I have this:
Untitled.png

But, I don't even know if I'm headed in the right direction.
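
For what it's worth, the shape that usually works here is a three-way junction (credits) table between Movies, People, and Jobs, so the same person can hold several jobs on the same movie. A sketch in generic SQL; all table and column names are placeholders:

CREATE TABLE Movies  (MovieID  INTEGER PRIMARY KEY, Title    VARCHAR(200) NOT NULL);
CREATE TABLE People  (PersonID INTEGER PRIMARY KEY, FullName VARCHAR(200) NOT NULL);
CREATE TABLE Jobs    (JobID    INTEGER PRIMARY KEY, JobName  VARCHAR(50)  NOT NULL); -- Writer, Director, Actor

-- One row per person per job per movie; the composite key lets the same
-- person appear on the same movie under several different jobs.
CREATE TABLE MovieCredits (
    MovieID  INTEGER NOT NULL REFERENCES Movies(MovieID),
    PersonID INTEGER NOT NULL REFERENCES People(PersonID),
    JobID    INTEGER NOT NULL REFERENCES Jobs(JobID),
    PRIMARY KEY (MovieID, PersonID, JobID)
);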

Help in creating an update query

Hello,
I need to create an update query to replace SKUs in one table if they match SKUs from another table. Say I have a table "A" I created with two fields, SKU and AltSKU. I need an update query that updates SKUs in table "B" if they match a SKU from table "A": if the SKU in table "B" matches a SKU in table "A", then the SKU in table "B" should be replaced by the AltSKU from table "A". I hope I explained that right. Thanks!
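
Assuming this is Access (or MySQL) SQL and using exactly the table and field names from the description, the usual pattern is an UPDATE with an INNER JOIN:

-- Sketch: replace B.SKU with A.AltSKU wherever the SKUs match.
UPDATE B INNER JOIN A ON B.SKU = A.SKU
SET B.SKU = A.AltSKU;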

Loading .dat file into sql developer

What is a convenient way to load a .dat file into a table through SQL Developer? Please help.
I don't have SQL*Loader.

If I open the .dat file in Excel, convert it into a .csv file, and try importing that file, it would take at least an hour to map the columns to the columns of the table, as the table holds 500 columns. Is there any other way? Please suggest.
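
If the file can be copied onto the database server, one alternative worth considering is an external table, which lets Oracle read the .dat file in place; with 500 columns, the column list could even be generated from the data dictionary instead of typed by hand. A sketch; the directory object, delimiter, file name, and columns are all placeholders:

-- Sketch only: DATA_DIR must be an Oracle DIRECTORY object pointing at the folder
-- on the server that holds the .dat file; only a couple of columns are shown.
CREATE TABLE my_data_ext (
    col1 NUMBER,
    col2 VARCHAR2(100)
)
ORGANIZATION EXTERNAL (
    TYPE ORACLE_LOADER
    DEFAULT DIRECTORY DATA_DIR
    ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        FIELDS TERMINATED BY ','
    )
    LOCATION ('mydata.dat')
);

-- Then load the real table with a plain INSERT ... SELECT.
INSERT INTO my_real_table SELECT * FROM my_data_ext;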

Hello DB Folks !

Hi all, I hope everybody is doing very well. I am new to this forum. I came here to learn new things about web databases. I think everyone will help me get some juice. :D

optimizer question

Strange behavior, and I don't see why.
DB2 ESE 10.1 FP4 on p/Linux.
Simple table (not partitioned) with different indexes, about 3M rows.
Indexes on these columns:
+QUAL+TYPE+STEP+STATUS
+LINK
+STEP+STATUS
+SEQ
+FORWARDLINK
Query with a left outer join to the same table and predicates; result = 300 rows.
select .....colnames.....
from biip.MailboxStatus batchfilei0_
left outer join biip.MailboxStatus batchfilei1_
       on batchfilei0_.link = batchfilei1_.seq
where batchfilei0_.step = 4
  and (batchfilei0_.status < 4 or batchfilei0_.status = 5)
   or batchfilei0_.step < 4
According to db2expln: table scan.
Access Table Name = BIIP.MAILBOXSTATUS ID = 14,10
| Relation Scan
db2advis indicates no indexes are recommended.
An offline REORG and RUNSTATS have been done, but it is still a table scan.
Would there be any reason why DB2 would not filter the table first? The predicates reduce the answer set to 300 rows, and only these rows would have a potential match in the join.

function return a row in a integer array

Hi,
I want to write a function in PostgreSQL that returns a row (the columns are of int type) as an integer array.
Can anybody help me? I am new to PostgreSQL concepts!

Scenario:
Table tabl: col1 (int), col2 (int), col3 (int)

What I tried:
declare val int[];
begin
val := '{}';
select array(select col1 || ',' || col2 || ',' || col3 from tabl ) into val;
return val;
end;

Can't get the desired result, though.
I want val = {1,1,1}, something like this.
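
A sketch of a plpgsql function that returns one row of tabl as an int[]; the LIMIT (or a WHERE clause) decides which row is picked:

CREATE OR REPLACE FUNCTION row_as_int_array() RETURNS int[] AS $$
DECLARE
    val int[];
BEGIN
    -- Build the array directly from the columns of one row.
    SELECT ARRAY[col1, col2, col3]
      INTO val
      FROM tabl
     LIMIT 1;               -- or add a WHERE clause to pick a specific row
    RETURN val;             -- e.g. {1,1,1}
END;
$$ LANGUAGE plpgsql;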

Thanks in advance ...

Joberate seeks Data Engineers to join their team in USA!

Joberate is an early stage startup in the HR Technology predictive analytics space. We’ve developed a disruptive approach to quantifying, measuring, and scoring job-seeking behavior of the global workforce.

Today Joberate technology helps corporate and socially minded companies gain actionable people insights by tapping the digital footprint of the global workforce, to help companies reduce employee attrition, retain and engage their valuable employees, improve workforce planning, and gain a competitive advantage in recruitment.

Joberate is looking for an experienced data engineer who is passionate about working with data that will have a significant global impact. In this position, you will have the opportunity to work with all kinds of data to build a cutting-edge Big Data platform for people analytics.

Responsibilities:
• Design, build, and support a new data platform
• Create and support various ETL scripts, jobs, and technologies
• Support APIs, tools, and 3rd party products to extract data
• Tune and optimize data storage technologies
• Provide ongoing data quality monitoring and support

Basic qualifications:
• Experience with NoSQL technologies, especially HBase and Cassandra
• Have excellent coding skills, specifically in jobs related to ETL and data warehousing
• At least 2 years’ experience with Python, Shell, Java and SQL
• Demonstrated experience with Open Source/Linux development and production environment
• BS degree in Computer Science or related field

Desired skills:
• Experience with cloud computing
• Experience with Scala and Spark
• Experience with programming in R
• Experience working with social media data

We are excited to offer:
• Remote working in the continental United States (Boston area preferred, but not required)
• Working with worldwide known companies
• Being part of a fun and friendly international team of social media ninjas
• Flexible working hours
• Flat hierarchy and never boring start-up culture

Our mission to help society better understand job seeking behaviors of the global workforce, one human being at a time, exists because of our global vision for a more transparent employment market and for closing the gap to full employment.

Are you ready to join our team? Send us an email with your resume and cover letter attached to stewart@joberate.com.

www: https://express.candarine.com/campai...d/99999b6232f4